
Enhancing Dialogue Summarization with Topic-Aware Global- and Local- Level Centrality

Authors: Liang, Xinnian; Wu, Shuangzhi; Cui, Chenhao; Bai, Jiaqi; Bian, Chao; Li, Zhoujun
Publication Year: 2023

Abstract

Dialogue summarization aims to condense a given dialogue into a short, focused summary text. Typically, both the speakers' viewpoints and the conversational topics shift as the dialogue unfolds, so effectively handling topic shifts and selecting the most salient utterances becomes one of the major challenges of this task. In this paper, we propose a novel topic-aware Global-Local Centrality (GLC) model to help select salient context from all sub-topics. Centralities are constructed at both the global and local levels: the global one identifies vital sub-topics in the dialogue, and the local one selects the most important context within each sub-topic. Specifically, GLC collects sub-topics based on the utterance representations, and each utterance is aligned with one sub-topic. Based on these sub-topics, GLC calculates global- and local-level centralities. Finally, we combine the two to guide the model to capture both salient context and salient sub-topics when generating summaries. Experimental results show that our model outperforms strong baselines on three public dialogue summarization datasets: CSDS, MC, and SAMSum. Further analysis demonstrates that GLC can accurately identify vital content across sub-topics. Code: https://github.com/xnliang98/bart-glc

Comment: EACL 2023 long paper
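
For intuition, here is a minimal sketch of how the global- and local-level centralities described in the abstract might be computed from pre-computed utterance embeddings, using cosine similarity to sub-topic and dialogue centroids. This is an illustrative assumption, not the paper's actual formulation; the function and variable names are hypothetical, and the authors' implementation is available at https://github.com/xnliang98/bart-glc.

# Hedged sketch of global/local centrality scoring over utterance embeddings.
# The exact scoring used by GLC may differ; all names here are illustrative only.
import numpy as np

def cosine(a, b):
    # Cosine similarity with a small epsilon for numerical safety.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def global_local_centrality(utt_emb, topic_ids):
    """utt_emb: (n_utterances, dim) array of utterance representations.
    topic_ids: length-n list aligning each utterance with one sub-topic."""
    utt_emb = np.asarray(utt_emb, dtype=float)
    dialogue_centroid = utt_emb.mean(axis=0)
    topics = sorted(set(topic_ids))

    # Sub-topic centroids: mean of the utterances aligned with each sub-topic.
    topic_centroids = {
        t: utt_emb[[i for i, x in enumerate(topic_ids) if x == t]].mean(axis=0)
        for t in topics
    }

    # Global centrality: how central each sub-topic is to the whole dialogue.
    global_c = {t: cosine(c, dialogue_centroid) for t, c in topic_centroids.items()}

    # Local centrality: how central each utterance is within its own sub-topic.
    local_c = [cosine(e, topic_centroids[t]) for e, t in zip(utt_emb, topic_ids)]

    # Combined utterance score: salient utterances inside salient sub-topics.
    combined = [global_c[t] * l for t, l in zip(topic_ids, local_c)]
    return global_c, local_c, combined

# Toy usage: a 5-utterance dialogue split across 2 sub-topics.
rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 8))
print(global_local_centrality(emb, [0, 0, 1, 1, 1])[2])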

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2301.12376
Document Type: Working Paper