
Improving faithfulness in abstractive text summarization with EDUs using BART

Authors :
Chali, Yllias
Delpisheh, Narjes
University of Lethbridge. Faculty of Arts and Science
Publication Year :
2023

Abstract

Abstractive summarization aims to reproduce the essential information of a source document in a summary using the summarizer's own words. Although this approach is closer to how humans summarize, it is harder to automate because it requires a complete understanding of natural language. The development of deep learning approaches, such as sequence-to-sequence models with attention mechanisms, and the availability of pre-trained language models have improved the performance of summarization systems. Nonetheless, abstractive summarization still suffers from issues such as hallucination and unfaithfulness. To address these issues, we propose an approach that uses important Elementary Discourse Units (EDUs) as a guidance signal. We compare our work with a previous guided summarization model and with two other summarization models designed to enhance the faithfulness of summaries. Our approach was tested on the CNN/Daily Mail dataset, and the results showed improvements in both the truthfulness of the summaries and their coverage of the source document.
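The approach described above first requires selecting the important EDUs that will guide generation. The thesis does not specify its selection criterion here, so the following is only a minimal, hypothetical sketch of one plausible step: greedily choosing the EDUs that cover the most new content words of the document. The stopword list, scoring rule, and function names are all illustrative assumptions.

```python
# Hypothetical sketch: greedily pick the EDUs that cover the most
# content words of the source, to serve as a guidance signal for an
# abstractive summarizer such as BART. The selection method actually
# used in the thesis may differ.
import re

# A tiny illustrative stopword list (a real system would use a fuller one).
STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "was", "it"}

def content_words(text):
    """Return the set of lowercased non-stopword tokens in `text`."""
    return {w for w in re.findall(r"[a-z']+", text.lower())
            if w not in STOPWORDS}

def select_guidance_edus(edus, k=2):
    """Greedily choose up to k EDUs maximizing coverage of new content words."""
    covered, chosen = set(), []
    for _ in range(k):
        best = max((e for e in edus if e not in chosen),
                   key=lambda e: len(content_words(e) - covered),
                   default=None)
        if best is None or not (content_words(best) - covered):
            break
        chosen.append(best)
        covered |= content_words(best)
    return chosen

edus = [
    "The summit opened in Geneva on Monday,",
    "drawing delegates from forty countries,",
    "who debated emission targets for 2030,",
    "a topic that dominated last year's summit as well.",
]
print(select_guidance_edus(edus, k=2))
```

The selected EDUs would then be passed to the summarizer as guidance, for example by concatenating them with the source text before encoding.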

Details

Database :
OAIster
Notes :
en_US
Publication Type :
Electronic Resource
Accession number :
edsoai.on1416889410
Document Type :
Electronic Resource