1. Abstractive Summarization Model with a Feature-Enhanced Seq2Seq Structure
- Authors
- Tao Xie, Bin Xue, Jingzhou Ji, and Zepeng Hao
- Subjects
- Computer science; Artificial intelligence; Natural language processing; Deep learning; Automatic summarization; Encoder; Data modeling
- Abstract
Abstractive text summarization uses deep learning methods to condense one or more documents into a concise summary that expresses the main meaning of the source. Most existing methods are based on the traditional Seq2Seq structure, whose limited ability to capture and store long-term and global features causes the generated summaries to lack information. In this paper, we propose a new abstractive summarization model for single-document summarization based on a feature-enhanced Seq2Seq structure. The model employs two types of feature-capture networks to improve the encoder and decoder of the traditional Seq2Seq structure, enhancing its ability to capture and store long-term and global features so that the generated summaries are more informative and more fluent. We evaluate the proposed model on the CNN/DailyMail dataset. Experimental results demonstrate that it is more effective than the baseline model, improving by 5.6%, 5.3%, and 6.2% on the R-1, R-2, and R-L metrics, respectively.
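The R-1 and R-2 scores reported above are ROUGE-N metrics, which measure n-gram overlap between a generated summary and a reference summary. As a minimal illustrative sketch (not the authors' evaluation code; function names are hypothetical), ROUGE-N recall can be computed as the fraction of reference n-grams that also appear in the candidate:

```python
from collections import Counter

def ngrams(tokens, n):
    """Return a multiset (Counter) of the n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n_recall(candidate, reference, n=1):
    """ROUGE-N recall: clipped overlapping n-grams / total reference n-grams."""
    cand = ngrams(candidate.split(), n)
    ref = ngrams(reference.split(), n)
    total = sum(ref.values())
    if total == 0:
        return 0.0
    # Clip each overlap count by its frequency in the candidate.
    overlap = sum(min(count, cand[g]) for g, count in ref.items())
    return overlap / total

# Example: R-1 counts unigram overlap, R-2 counts bigram overlap.
print(rouge_n_recall("the cat sat", "the cat sat on the mat", n=1))  # 0.5
print(rouge_n_recall("the cat sat", "the cat sat on the mat", n=2))  # 0.4
```

Published implementations typically report the F1 score (combining this recall with the analogous precision) and apply stemming and tokenization; R-L uses longest-common-subsequence overlap rather than fixed n-grams.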
- Published
- 2020