
A Novel LLM-based Two-stage Summarization Approach for Long Dialogues

Authors:
Yin, Yuan-Jhe
Chen, Bo-Yu
Chen, Berlin
Publication Year:
2024

Abstract

Long document summarization poses a significant challenge in natural language processing due to input lengths that exceed the capacity of most state-of-the-art pre-trained language models. This study proposes a hierarchical framework that segments and condenses information from long documents and then fine-tunes an abstractive summarization model on the processed text. Unsupervised topic segmentation methods identify semantically appropriate breakpoints. The condensation stage uses an unsupervised generation model to produce condensed data; the current experiments employ ChatGPT (v3.5). The summarization stage fine-tunes the abstractive summarization model on the condensed data to generate the final results. This framework enables long documents to be processed even when their length exceeds the model's maximum input size. Because the summarization model never ingests the entire document, training requires less time and fewer computational resources, making the framework suitable for contexts with constrained local computational resources.
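The segment-condense-summarize pipeline described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names are assumptions, the greedy sentence-packing segmenter stands in for the paper's unsupervised topic segmentation, the first-sentence condenser stands in for the LLM condensation step (ChatGPT v3.5 in the paper), and the join-based summarizer stands in for the fine-tuned abstractive model.

```python
from typing import Callable, List


def segment(document: str, max_chars: int = 200) -> List[str]:
    """Stand-in for unsupervised topic segmentation: greedily pack
    sentences into segments that stay under a length budget, so each
    segment fits a downstream model's input limit."""
    sentences = [s.strip() for s in document.split(".") if s.strip()]
    segments: List[str] = []
    current = ""
    for sent in sentences:
        candidate = f"{current}. {sent}" if current else sent
        if current and len(candidate) > max_chars:
            segments.append(current)
            current = sent
        else:
            current = candidate
    if current:
        segments.append(current)
    return segments


def condense(segment_text: str) -> str:
    """Placeholder condenser: keep the first sentence of the segment.
    In the paper, this step is an unsupervised LLM generation call."""
    return segment_text.split(".")[0].strip()


def summarize(condensed: List[str]) -> str:
    """Placeholder for the fine-tuned abstractive summarizer: here we
    simply concatenate the condensed pieces."""
    return " ".join(condensed)


def two_stage_summary(document: str,
                      condenser: Callable[[str], str] = condense) -> str:
    """Run the full pipeline: segment, condense each segment, summarize."""
    segments = segment(document)
    condensed = [condenser(seg) for seg in segments]
    return summarize(condensed)
```

Because the segmenter bounds every piece of text that reaches the condenser and summarizer, no single call ever exceeds a model's maximum input size, which is the property the framework relies on.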

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2410.06520
Document Type:
Working Paper