
Improving Non-Autoregressive Machine Translation Using Sentence-Level Semantic Agreement.

Authors :
Wang, Shuheng
Huang, Heyan
Shi, Shumin
Source :
Applied Sciences (2076-3417); May2022, Vol. 12 Issue 10, p5003, 12p
Publication Year :
2022

Abstract

The inference stage can be accelerated significantly using a Non-Autoregressive Transformer (NAT). However, the training objective of the NAT model only minimizes the loss between the generated words and the golden words in the reference. Because dependencies between the target words are lacking, this word-level training objective can easily cause semantic inconsistency between the generated and source sentences. To alleviate this issue, we propose a new method, Sentence-Level Semantic Agreement (SLSA), to obtain consistency between the source and generated sentences. Specifically, we utilize contrastive learning to pull the sentence representations of the source and generated sentences closer together. In addition, to strengthen the encoder, we also integrate an agreement module into it to obtain a better representation of the source sentence. The experiments are conducted on three translation datasets: the WMT 2014 EN → DE task, the WMT 2016 EN → RO task, and the IWSLT 2014 DE → EN task, and the improvement in the NAT model's performance demonstrates the effectiveness of our proposed method. [ABSTRACT FROM AUTHOR]
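The contrastive objective described in the abstract can be sketched as an InfoNCE-style loss over batch sentence representations: each source sentence's representation is pulled toward its own generated sentence and pushed away from the other sentences in the batch. This is a minimal illustrative sketch, not the paper's exact formulation; the function names, the mean-pooling choice, and the temperature value are assumptions.

```python
import numpy as np

def mean_pool(token_states):
    # Illustrative sentence representation: mean of token hidden states
    # (shape: tokens x hidden -> hidden).
    return token_states.mean(axis=0)

def sentence_agreement_loss(src_reps, gen_reps, temperature=0.1):
    """InfoNCE-style contrastive loss (an assumed stand-in for SLSA's
    agreement objective): each source representation's positive is its
    own generated sentence (the diagonal of the similarity matrix);
    the other sentences in the batch serve as in-batch negatives."""
    # Normalize to unit length so dot products are cosine similarities.
    s = src_reps / np.linalg.norm(src_reps, axis=1, keepdims=True)
    g = gen_reps / np.linalg.norm(gen_reps, axis=1, keepdims=True)
    logits = (s @ g.T) / temperature                      # (batch, batch)
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Negative log-probability of the correct (diagonal) pairing.
    return float(-np.mean(np.diag(log_probs)))

# Usage sketch: perfectly aligned pairs yield a near-zero loss, while
# misaligned pairs (positives shuffled) yield a much larger one.
rng = np.random.default_rng(0)
reps = rng.normal(size=(4, 8))
aligned_loss = sentence_agreement_loss(reps, reps)
misaligned_loss = sentence_agreement_loss(reps, np.roll(reps, 1, axis=0))
```

In practice such a loss would be computed on pooled encoder/decoder hidden states and added to the word-level NAT objective; the pooling and weighting details here are illustrative.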

Subjects

Subjects :
MACHINE translating

Details

Language :
English
ISSN :
20763417
Volume :
12
Issue :
10
Database :
Complementary Index
Journal :
Applied Sciences (2076-3417)
Publication Type :
Academic Journal
Accession number :
157129616
Full Text :
https://doi.org/10.3390/app12105003