
SEASum: Syntax-Enriched Abstractive Summarization.

Authors :
Liu, Sen
Yang, Libin
Cai, Xiaoyan
Source :
Expert Systems with Applications. Aug 2022, Vol. 199.
Publication Year :
2022

Abstract

Compared to traditional RNN-based models, abstractive summarization systems based on Pre-trained Language Models (PTMs) achieve dramatic improvements in readability. Thus, in the field of abstractive summarization, more attention should be devoted to the faithfulness issue, i.e., that predicted summaries are not factually consistent with their source texts. To alleviate this problem, we propose a novel Syntax-Enriched Abstractive Summarization (SEASum) framework, which utilizes graph attention networks (GATs) to introduce syntactic features of source texts and generate faithful summaries. In the SEASum framework, a PTM-based semantic encoder encodes the word sequence, while a GAT-based syntactic encoder captures explicit syntax, i.e., part-of-speech tags, parse trees, and dependency-based relative positions of source documents. A feature fusion module incorporates the encoded syntactic features into the summarization framework. Based on the proposed SEASum framework, we develop two summarization models: 1) the parallel SEASum model, in which the semantic encoder and the syntactic encoder work in parallel and a multi-head attention module fuses the two feature streams for the subsequent decoding process; 2) the cascaded SEASum model, which takes contextual word embeddings from the semantic encoder as node embeddings for the syntactic encoder and employs highway networks to regulate information flow. Experimental results on the CNN/DailyMail and Reddit-TIFU (short) datasets show that both the parallel and cascaded SEASum models outperform state-of-the-art abstractive summarization approaches on faithfulness measures. The results also demonstrate that the cascaded SEASum model is more effective than the parallel model at boosting faithfulness.

• A syntax-enhanced abstractive summarization framework, SEASum, is proposed.
• SEASum incorporates syntactic features, i.e., dependency parse trees, POS tags, etc.
• A parallel and a cascaded model are developed based on the SEASum framework.
• Two SEASum-based models are tested on the CNN/DM and Reddit-TIFU (short) datasets.
• Both SEASum models achieve improvements on ROUGE and faithfulness metrics. [ABSTRACT FROM AUTHOR]
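The abstract states that the cascaded SEASum model uses highway networks to regulate how much syntactic information flows into the fused representation. The paper's exact architecture is not reproduced here, but the highway mechanism itself is standard (Srivastava et al.): a learned transform gate blends a nonlinear transform of the input with an unchanged carry path. A minimal NumPy sketch, with all dimensions, weights, and variable names being illustrative assumptions rather than the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def highway_layer(h, W_t, b_t, W_h, b_h):
    """One highway layer: a transform gate t in (0, 1) blends a nonlinear
    transform of h with the unchanged input (the carry path).
    Hypothetical stand-in for the gating used in the cascaded model."""
    t = sigmoid(h @ W_t + b_t)           # transform gate, elementwise
    transformed = np.tanh(h @ W_h + b_h) # candidate transformed features
    return t * transformed + (1.0 - t) * h

rng = np.random.default_rng(0)
d = 8                                    # hypothetical hidden size
h = rng.standard_normal((4, d))          # e.g. 4 fused token embeddings
W_t = rng.standard_normal((d, d)) * 0.1
W_h = rng.standard_normal((d, d)) * 0.1
out = highway_layer(h, W_t, np.zeros(d), W_h, np.zeros(d))
print(out.shape)                         # same shape as the input: (4, 8)
```

Because the carry path passes the input through unchanged when the gate closes, the layer can learn to suppress syntactic features that would hurt the summary, which is presumably why the authors use it to "regulate information flow" between the two encoders.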

Details

Language :
English
ISSN :
0957-4174
Volume :
199
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
156552262
Full Text :
https://doi.org/10.1016/j.eswa.2022.116819