
A topic modeled unsupervised approach to single document extractive text summarization.

Authors :
Srivastava, Ridam
Singh, Prabhav
Rana, K.P.S.
Kumar, Vineet
Source :
Knowledge-Based Systems, Vol. 246, June 2022.
Publication Year :
2022

Abstract

Automatic Text Summarization (ATS) is an essential field in natural language processing that attempts to condense large text documents so that users can assimilate information quickly. It finds uses in medical document summarization, review generation, and opinion mining. This work investigated an unsupervised extractive summarization approach that combined clustering with topic modeling to reduce topic bias. Latent Dirichlet Allocation was used for topic modeling, while K-Medoids clustering was employed for summary generation. The approach was evaluated on three datasets: Wikihow, CNN/DailyMail, and the DUC2002 Corpus. The Recall-Oriented Understudy for Gisting Evaluation (ROUGE) metrics, specifically ROUGE-1 (R-1), ROUGE-2 (R-2), and ROUGE-L (R-L), were used for comparative analysis against recently reported techniques. The proposed framework achieved scores of 34.80%, 9.13%, and 32.30% on the Wikihow dataset, 43.90%, 19.01%, and 41.50% on the CNN/DailyMail dataset, and 49.35%, 31.53%, and 41.72% on the DUC2002 Corpus (R-1, R-2, and R-L, respectively). These metrics were found to be superior to those of comparable recent works. Further, the execution time of the proposed method was recorded and compared with that of its counterparts, establishing its superior speed. Based on these promising outcomes, it was concluded that an unsupervised extractive summarization approach with greater subtopic focus offers a significant improvement over generic topic modeling, semantic, and deep learning approaches. [ABSTRACT FROM AUTHOR]
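
The record does not include the authors' implementation. As a rough illustration of the kind of pipeline the abstract describes (LDA sentence-topic vectors followed by K-Medoids medoid selection), a minimal Python sketch is given below. The library choices (scikit-learn's LatentDirichletAllocation, scikit-learn-extra's KMedoids), the sentence splitting, and all parameter values are assumptions made for illustration, not details taken from the paper.

```python
# Hypothetical sketch of an LDA + K-Medoids extractive summarizer.
# Assumes scikit-learn and scikit-learn-extra are installed; topic and
# summary-length settings are illustrative, not the paper's values.
import re

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn_extra.cluster import KMedoids


def summarize(document: str, n_topics: int = 5, n_sentences: int = 3) -> str:
    # Naive sentence splitting; the paper's preprocessing is not described here.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", document) if s.strip()]
    if len(sentences) <= n_sentences:
        return " ".join(sentences)

    # Bag-of-words representation of each sentence.
    bow = CountVectorizer(stop_words="english").fit_transform(sentences)

    # LDA assigns each sentence a distribution over latent subtopics.
    lda = LatentDirichletAllocation(n_components=n_topics, random_state=0)
    sentence_topics = lda.fit_transform(bow)

    # K-Medoids groups sentences by subtopic; each medoid is an actual
    # sentence, so the medoids themselves form the extractive summary.
    km = KMedoids(n_clusters=n_sentences, metric="euclidean", random_state=0)
    km.fit(sentence_topics)
    chosen = sorted(km.medoid_indices_)

    return " ".join(sentences[i] for i in chosen)
```

Because the medoids are real sentences from the document, this selection step yields an extractive summary with one representative sentence per subtopic cluster, which mirrors the subtopic-focused idea the abstract emphasizes.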

Details

Language :
English
ISSN :
0950-7051
Volume :
246
Database :
Academic Search Index
Journal :
Knowledge-Based Systems
Publication Type :
Academic Journal
Accession Number :
156649404
Full Text :
https://doi.org/10.1016/j.knosys.2022.108636