Self-supervised attention flow for dialogue state tracking.
- Source :
- Neurocomputing. Jun 2021, Vol. 440, p. 279-286. 8p.
- Publication Year :
- 2021
Abstract
- The performance of existing approaches for dialogue state tracking (DST) is often limited by the deficiency of labeled datasets, and inefficient utilization of data is also a practical yet tough problem of the DST task. In this paper, we aim to tackle these challenges in a self-supervised manner by introducing an auxiliary pre-training task that learns to pick up the correct dialogue response from a group of candidates. Moreover, we propose an attention flow mechanism that is augmented with a soft-threshold function in a dynamic way to better understand the user intent and filter out the redundant information. Extensive experiments on the multi-domain dialogue state tracking dataset MultiWOZ 2.1 demonstrate the effectiveness of our proposed method, and we also show that it is able to adapt to zero/few-shot cases under the proposed self-supervised framework. [ABSTRACT FROM AUTHOR]
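The abstract describes an attention mechanism augmented with a soft-threshold function to filter out redundant information. The paper itself is not reproduced in this record, so the following NumPy sketch is only one plausible reading of that idea: compute ordinary scaled dot-product attention weights, shrink them with a soft-threshold operator so weakly attended positions are zeroed, and renormalize. The function names, the `tau` parameter, and the renormalization step are illustrative assumptions, not details from the paper.

```python
import numpy as np

def soft_threshold(x, tau):
    # Soft-threshold operator: shrink magnitudes by tau, zeroing
    # anything whose magnitude falls below tau.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def soft_threshold_attention(query, keys, values, tau=0.2):
    # Standard scaled dot-product attention scores for one query.
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)
    # Numerically stable softmax over the key positions.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Suppress weakly attended (presumed redundant) positions,
    # then renormalize the surviving weights.
    sparse = soft_threshold(weights, tau)
    total = sparse.sum()
    if total > 0:
        sparse /= total
    return sparse @ values, sparse
```

In this sketch, `tau` controls sparsity: positions whose softmax weight falls below it contribute nothing to the output. The paper's dynamic variant presumably learns or adapts this threshold rather than fixing it.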
- Subjects :
- *TASKS
Details
- Language :
- English
- ISSN :
- 0925-2312
- Volume :
- 440
- Database :
- Academic Search Index
- Journal :
- Neurocomputing
- Publication Type :
- Academic Journal
- Accession number :
- 149919665
- Full Text :
- https://doi.org/10.1016/j.neucom.2021.01.118