PROJECT TITLE:
A Two-Stage Transformer-Based Approach for Variable-Length Abstractive Summarization
This paper proposes a two-stage technique for variable-length abstractive summarization. Unlike earlier models, the proposed approach can produce summaries that are both fluent and variable in length. The framework consists of a text segmentation module and a two-stage Transformer-based summarization module.
First, the text segmentation module divides the input text into segments using a pre-trained Bidirectional Encoder Representations from Transformers (BERT) model and a bidirectional long short-term memory (BiLSTM) network. An extractive model based on the BERT-based summarization model (BERTSUM) then selects the most important sentence from each segment. In the second stage of the two-stage summarization model, the extracted sentences are used to train the document summarization module.
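The segmentation step can be sketched as follows. This is a minimal illustration, not the authors' code: the `embed` and `boundary_prob` functions stand in for the BERT sentence encoder and the BiLSTM boundary classifier, and the character-count vectors and dissimilarity threshold are toy assumptions.

```python
# Illustrative sketch of sentence-boundary segmentation.
# embed() is a stand-in for a BERT sentence embedding, and
# boundary_prob() is a stand-in for the BiLSTM boundary classifier.

def embed(sentence):
    # Toy bag-of-characters "embedding" (placeholder for BERT).
    return [sentence.count(c) for c in "abcde"]

def boundary_prob(prev_vec, cur_vec):
    # Cosine dissimilarity as a placeholder boundary score:
    # dissimilar adjacent sentences suggest a segment boundary.
    dot = sum(p * c for p, c in zip(prev_vec, cur_vec))
    norm = (sum(p * p for p in prev_vec) * sum(c * c for c in cur_vec)) ** 0.5 or 1.0
    return 1.0 - dot / norm

def segment(sentences, threshold=0.5):
    # Start a new segment whenever the boundary score exceeds the threshold.
    segments, current = [], [sentences[0]]
    for prev, cur in zip(sentences, sentences[1:]):
        if boundary_prob(embed(prev), embed(cur)) > threshold:
            segments.append(current)
            current = []
        current.append(cur)
    segments.append(current)
    return segments
```

A real implementation would replace the placeholders with contextual BERT embeddings fed through a BiLSTM, but the control flow of grouping sentences between predicted boundaries is the same.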
In the first stage, the segments are used to train the segment summarization module, which takes into account both its own outputs and those of the pre-trained second-stage document summarization module. The parameters of the segment summarization module are updated using the loss scores of the document summarization module as well as its own.
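The joint parameter update can be illustrated with a toy scalar example. This is an assumed formulation, not the paper's exact objective: the two quadratic losses and the weighting factor `alpha` are placeholders for the segment-module and document-module losses.

```python
# Illustrative sketch: update a parameter w with the combined loss
# L(w) = L_seg(w) + alpha * L_doc(w), i.e. the segment module's own
# loss plus the loss back-propagated from the document module.

def segment_loss(w):
    return (w - 2.0) ** 2          # toy segment-summarization loss

def document_loss(w):
    return (w - 4.0) ** 2          # toy document-summarization loss

def combined_update(w, lr=0.1, alpha=0.5):
    # Finite-difference gradient of the combined loss (stand-in for autograd).
    eps = 1e-6
    total = lambda x: segment_loss(x) + alpha * document_loss(x)
    grad = (total(w + eps) - total(w - eps)) / (2 * eps)
    return w - lr * grad

w = 0.0
for _ in range(200):
    w = combined_update(w)
# w converges toward the weighted optimum between the two losses.
```

The point of the sketch is that the segment module's parameters settle at a compromise between what is locally best for each segment and what produces a good document-level summary.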
Finally, collaborative training alternates between the segment summarization and document summarization modules until convergence. At test time, the outputs of the segment summarization module are concatenated to produce a variable-length abstractive summary. The BERT-BiLSTM text segmentation model is evaluated on the ChWiki 181k dataset and shows promising results in capturing the relationships between sentences.
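The alternating training loop and the inference-time concatenation can be sketched as below. The `update_segment` and `update_document` callbacks are hypothetical placeholders for one training pass over each module.

```python
# Illustrative sketch of collaborative training and variable-length inference.
# update_segment / update_document are hypothetical stand-ins for one
# training pass over the segment and document summarization modules.

def alternating_train(update_segment, update_document, rounds=3):
    # Each round trains the segment module, then the document module.
    history = []
    for _ in range(rounds):
        history.append(("segment", update_segment()))
        history.append(("document", update_document()))
    return history

def variable_length_summary(segment_summaries):
    # Inference: per-segment abstractive summaries are concatenated, so the
    # final summary length scales with the number of segments in the input.
    return " ".join(segment_summaries)
```

In practice the loop would stop on a convergence criterion rather than a fixed round count; the fixed `rounds` here is only for illustration.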
In a human subjective evaluation on the LCSTS dataset, the proposed variable-length abstractive summarization system achieved a maximum accuracy of 70.0 percent.