
Self-supervised and Supervised Joint Training for Resource-rich Machine Translation

Authors :
Cheng, Yong
Wang, Wei
Jiang, Lu
Macherey, Wolfgang
Publication Year :
2021

Abstract

Self-supervised pre-training of text representations has been successfully applied to low-resource Neural Machine Translation (NMT). However, it usually fails to achieve notable gains on resource-rich NMT. In this paper, we propose a joint training approach, $F_2$-XEnDec, to combine self-supervised and supervised learning to optimize NMT models. To exploit complementary self-supervised signals for supervised learning, NMT models are trained on examples that are interbred from monolingual and parallel sentences through a new process called crossover encoder-decoder. Experiments on two resource-rich translation benchmarks, WMT'14 English-German and WMT'14 English-French, demonstrate that our approach achieves substantial improvements over several strong baseline methods and obtains a new state of the art of 46.19 BLEU on English-French when incorporating back translation. Results also show that our approach is capable of improving model robustness to input perturbations such as code-switching noise, which frequently appears on social media.

Comment: Accepted by ICML 2021
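The abstract describes a joint objective that trains one NMT model on both supervised (parallel) and self-supervised (monolingual) signals. The snippet below is a minimal sketch of such a joint training step; the model interface, the denoising formulation of the self-supervised term, and the weighting factor `lam` are illustrative assumptions, not the paper's exact $F_2$-XEnDec procedure or its crossover encoder-decoder.

```python
# Illustrative sketch only: a generic joint self-supervised + supervised step,
# not the F_2-XEnDec algorithm itself.
import torch
import torch.nn.functional as F

def joint_training_step(model, parallel_batch, mono_batch, optimizer, lam=1.0):
    """One optimizer step combining a supervised translation loss on parallel
    data with a self-supervised denoising loss on monolingual data."""
    src, tgt = parallel_batch          # parallel source/target token ids
    noisy_mono, mono = mono_batch      # corrupted and clean monolingual ids

    # Supervised translation loss on the parallel pair (teacher forcing).
    logits_sup = model(src, tgt[:, :-1])
    loss_sup = F.cross_entropy(
        logits_sup.reshape(-1, logits_sup.size(-1)), tgt[:, 1:].reshape(-1)
    )

    # Self-supervised loss: reconstruct the clean sentence from its noisy version.
    logits_ssl = model(noisy_mono, mono[:, :-1])
    loss_ssl = F.cross_entropy(
        logits_ssl.reshape(-1, logits_ssl.size(-1)), mono[:, 1:].reshape(-1)
    )

    # Joint objective: supervised term plus weighted self-supervised term.
    loss = loss_sup + lam * loss_ssl
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```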

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2106.04060
Document Type :
Working Paper