
PanGu-Bot: Efficient Generative Dialogue Pre-training from Pre-trained Language Model

Authors :
Mi, Fei
Li, Yitong
Zeng, Yulong
Zhou, Jingyan
Wang, Yasheng
Xu, Chuanfei
Shang, Lifeng
Jiang, Xin
Zhao, Shiqi
Liu, Qun
Publication Year :
2022

Abstract

In this paper, we introduce PanGu-Bot, a Chinese pre-trained open-domain dialogue generation model built on the large pre-trained language model (PLM) PANGU-alpha (Zeng et al., 2021). Unlike other pre-trained dialogue models trained from scratch on massive amounts of dialogue data, we aim to build a powerful dialogue model at relatively low data and computation cost by inheriting valuable language capabilities and knowledge from a PLM. To this end, we train PanGu-Bot from the large PLM PANGU-alpha, which has been shown to perform well on a variety of Chinese natural language tasks. We investigate different aspects of the responses generated by PanGu-Bot, including response quality, knowledge, and safety. We show that PanGu-Bot outperforms state-of-the-art Chinese dialogue systems (CDIALGPT (Wang et al., 2020), EVA (Zhou et al., 2021), EVA2.0 (Gu et al., 2022)) with respect to all three aspects. We also demonstrate that PanGu-Bot can be easily deployed to generate emotional responses without further training. Through our empirical analysis, we also point out that PanGu-Bot's response quality, knowledge correctness, and safety are still far from perfect, and further exploration is indispensable to building reliable and smart dialogue systems. Our model and code will be available soon at https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/PanGu-Bot.

Comment: Update model and results; add comparison with EVA2.0
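The abstract states that PanGu-Bot can generate emotional responses without further training. The sketch below illustrates one common way such prompt-conditioned generation is done with a causal language model; it is a generic illustration assuming a Hugging Face-compatible checkpoint, where the model identifier "pangu-bot-checkpoint", the "[emotion]" prompt format, and the helper `emotional_reply` are hypothetical names, not the paper's actual interface (see the linked repository for the released code).

```python
# Minimal sketch (NOT the authors' released code) of prompt-based
# emotional response generation with a causal-LM dialogue model.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical identifier; substitute the real checkpoint from
# https://github.com/huawei-noah/Pretrained-Language-Model/tree/master/PanGu-Bot
MODEL_NAME = "pangu-bot-checkpoint"

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

def emotional_reply(context: str, emotion: str, max_new_tokens: int = 64) -> str:
    """Condition generation on a desired emotion by prepending it to the
    dialogue context, so no additional training is required."""
    prompt = f"[{emotion}] {context}"  # assumed prompt format, for illustration
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,   # sample for more varied, natural replies
        top_p=0.9,
        temperature=0.8,
    )
    # Strip the prompt tokens; keep only the newly generated reply.
    reply_ids = output_ids[0][inputs["input_ids"].shape[-1]:]
    return tokenizer.decode(reply_ids, skip_special_tokens=True)

print(emotional_reply("今天的天气真好!", "happy"))  # "The weather is great today!"
```

The key design point is that the emotion label lives entirely in the prompt, so a single trained model can be steered toward different response styles at inference time.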

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2203.17090
Document Type :
Working Paper