
Multi-Stage Pre-training Enhanced by ChatGPT for Multi-Scenario Multi-Domain Dialogue Summarization

Authors:
Zhou, Weixiao
Li, Gengyao
Cheng, Xianfu
Liang, Xinnian
Zhu, Junnan
Zhai, Feifei
Li, Zhoujun
Publication Year:
2023

Abstract

Dialogue summarization spans a wide range of scenarios and domains, yet existing methods generally apply only to a specific scenario or domain. In this study, we propose a new pre-trained model designed specifically for multi-scenario, multi-domain dialogue summarization. It adopts a multi-stage pre-training strategy to reduce the gap between the pre-training and fine-tuning objectives. Specifically, we first conduct domain-aware pre-training on large-scale multi-scenario, multi-domain dialogue data to enhance the adaptability of the pre-trained model. We then conduct task-oriented pre-training on large-scale multi-scenario, multi-domain "dialogue-summary" parallel data annotated by ChatGPT to enhance its dialogue summarization ability. Experimental results on three dialogue summarization datasets from different scenarios and domains show that our pre-trained model significantly outperforms previous state-of-the-art models in full fine-tuning, zero-shot, and few-shot settings.

Comment: Accepted to Findings of EMNLP 2023
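The abstract outlines a two-stage recipe: domain-aware pre-training on raw dialogues, followed by task-oriented pre-training on ChatGPT-annotated dialogue-summary pairs, before any fine-tuning on a target dataset. The Python sketch below illustrates that flow only in outline, under stated assumptions: the base checkpoint (facebook/bart-base), the dialogue-reconstruction objective used for stage 1, and the toy data are all stand-ins, since the record does not specify the paper's actual objectives, corpora, or hyperparameters.

# Minimal sketch of the two-stage pre-training flow described in the
# abstract. Objectives, data, and hyperparameters here are assumptions
# for illustration; the paper's actual recipe may differ.
import torch
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

def train_stage(pairs, epochs=1):
    """Run one pre-training stage over (source, target) text pairs."""
    model.train()
    for _ in range(epochs):
        for src, tgt in pairs:
            batch = tokenizer(src, return_tensors="pt", truncation=True)
            labels = tokenizer(tgt, return_tensors="pt", truncation=True).input_ids
            loss = model(**batch, labels=labels).loss
            loss.backward()
            optimizer.step()
            optimizer.zero_grad()

# Stage 1: domain-aware pre-training -- here approximated as dialogue
# reconstruction, so the model adapts to multi-scenario, multi-domain
# dialogue text. (Hypothetical data; the abstract names no corpora.)
dialogues = ["#Person1#: Hi, how are you? #Person2#: Fine, thanks."]
train_stage([(d, d) for d in dialogues])

# Stage 2: task-oriented pre-training -- learn the dialogue -> summary
# mapping from ChatGPT-annotated parallel data (toy example below).
parallel = [("#Person1#: Hi, how are you? #Person2#: Fine, thanks.",
             "Two people greet each other.")]
train_stage(parallel)

Running the two stages sequentially on the same model instance mirrors the multi-stage idea: stage 1 adapts the model to dialogue text in general, and stage 2 narrows it toward summarization before downstream fine-tuning.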

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2310.10285
Document Type:
Working Paper