
MoPE: Mixture of Prefix Experts for Zero-Shot Dialogue State Tracking

Authors:
Tang, Tianwen
Zhu, Tong
Liu, Haodong
Bai, Yin
Cheng, Jia
Chen, Wenliang
Publication Year:
2024

Abstract

Zero-shot dialogue state tracking (DST) transfers knowledge to unseen domains, reducing the cost of annotating new datasets. Previous zero-shot DST models suffer mainly from domain transfer and partial prediction problems. To address these challenges, we propose Mixture of Prefix Experts (MoPE), which establishes connections between similar slots in different domains and thereby strengthens transfer performance to unseen domains. Empirical results demonstrate that MoPE-DST achieves a joint goal accuracy of 57.13% on MultiWOZ2.1 and 55.40% on SGD.

Comment: Accepted to LREC-COLING 2024
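The core idea the abstract describes — grouping similar slots across domains and serving each group with its own prefix expert — can be sketched roughly as follows. This is a minimal illustrative toy, not the paper's implementation: the greedy token-overlap clustering, the function names, and the example slot names are all assumptions for demonstration; the actual MoPE model learns prefix parameters for a pretrained language model.

```python
# Hypothetical sketch of MoPE-style slot routing (illustrative only, not the
# paper's method): slots from seen domains are clustered by surface similarity,
# each cluster is notionally served by one "prefix expert", and a slot from an
# unseen domain is routed to the expert of its most similar cluster.

def tokenize(slot_name):
    # Split a slot name like "hotel price range" into a set of word tokens.
    return set(slot_name.lower().replace("-", " ").split())

def jaccard(a, b):
    # Jaccard similarity between two token sets (0.0 when both are empty).
    return len(a & b) / len(a | b) if a | b else 0.0

def best_cluster(slot, clusters):
    # Index of the cluster containing the member most similar to `slot`.
    toks = tokenize(slot)
    return max(range(len(clusters)),
               key=lambda i: max(jaccard(toks, tokenize(m))
                                 for m in clusters[i]))

def build_experts(training_slots, num_experts=2):
    # Greedy clustering: seed one cluster per expert with the first slots,
    # then assign each remaining slot to its most similar cluster.
    clusters = [[s] for s in training_slots[:num_experts]]
    for slot in training_slots[num_experts:]:
        clusters[best_cluster(slot, clusters)].append(slot)
    return clusters

# Seen-domain slots (toy examples in the spirit of MultiWOZ-style schemas).
train = ["hotel price range", "hotel area",
         "restaurant price range", "restaurant area"]
clusters = build_experts(train, num_experts=2)
# Cluster 0 collects the price-range slots, cluster 1 the area slots.

# A slot from an unseen domain is routed to the expert of the closest cluster.
expert = best_cluster("attraction area", clusters)  # routed to the "area" expert
```

The routing step is what lets knowledge transfer to unseen domains: an unfamiliar slot reuses whichever expert served the most similar slots during training.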

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2404.08559
Document Type:
Working Paper