
An attentive neural network based on deep learning approach for abstractive text summarization.

Authors :
Sapra, Shruti J.
Thakur, Shruti
Kapse, Avinash S.
Atique, Mohammad
Source :
AIP Conference Proceedings; 2024, Vol. 3214 Issue 1, p1-7, 7p
Publication Year :
2024

Abstract

In recent times, abstractive text summarization has made great strides by moving away from linear models built on sparse, manually crafted features and toward nonlinear neural network models that exploit rich inputs. Deep learning models have proven effective in NLP applications because they can model complex data patterns without hand-engineered features. Sequence learning has received considerable attention in recent years: because end-to-end training of encoder-decoder networks has proven successful in tasks such as machine translation, research applying similar architectures to other transduction tasks, such as paraphrase generation or abstractive summarization, has grown. In this study, we present a neural-network-based abstractive text summarizer. An attention mechanism was implemented to address the problem of processing long input sequences. The model was trained on two news-summarization datasets. The goal of the approach is to use a sentence-level attention mechanism to guide attention at the word level. The results are superior to those of competing models in the literature. The experimental findings on both datasets demonstrate that the proposed model effectively improves ROUGE scores and produces a more concise summary of the original document without losing key details. [ABSTRACT FROM AUTHOR]
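The hierarchical attention idea described above — a sentence-level attention distribution modulating word-level focus — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the function name, score shapes, and the multiplicative combination are assumptions for exposition:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def hierarchical_attention(word_scores, sent_scores, sent_ids):
    """Combine word- and sentence-level attention (illustrative sketch).

    word_scores : raw attention score per word
    sent_scores : raw attention score per sentence
    sent_ids    : index of the sentence each word belongs to
    """
    sent_att = softmax(sent_scores)            # one weight per sentence
    word_att = softmax(word_scores)            # one weight per word
    combined = word_att * sent_att[sent_ids]   # word weight scaled by its sentence's weight
    return combined / combined.sum()           # renormalize to a distribution

# Toy example: 5 words across 2 sentences; sentence 0 scores higher,
# so its words receive a larger share of the final attention mass.
word_scores = np.array([0.2, 1.0, 0.5, 0.1, 0.3])
sent_scores = np.array([2.0, 0.5])
sent_ids = np.array([0, 0, 0, 1, 1])
att = hierarchical_attention(word_scores, sent_scores, sent_ids)
print(att.round(3))
```

The multiplicative scaling means a word can only attract strong attention if its sentence is itself salient, which is one plausible reading of "directing focus at the sentence level to help with focus at the word level."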

Details

Language :
English
ISSN :
0094-243X
Volume :
3214
Issue :
1
Database :
Complementary Index
Journal :
AIP Conference Proceedings
Publication Type :
Conference
Accession number :
180650687
Full Text :
https://doi.org/10.1063/5.0239206