
HLT-MT: High-resource Language-specific Training for Multilingual Neural Machine Translation

Authors :
Yang, Jian
Yin, Yuwei
Ma, Shuming
Zhang, Dongdong
Li, Zhoujun
Wei, Furu
Publication Year :
2022

Abstract

Multilingual neural machine translation (MNMT) trained on multiple language pairs has attracted considerable attention because sharing knowledge among languages reduces model parameters and training costs. Nonetheless, multilingual training suffers from language interference in the shared parameters, caused by negative interference among different translation directions, especially for high-resource languages. In this paper, we propose a multilingual translation model with high-resource language-specific training (HLT-MT) to alleviate this negative interference, adopting two-stage training with a language-specific selection mechanism. Specifically, we first train the multilingual model only on the high-resource pairs and select language-specific modules at the top of the decoder to enhance the translation quality of high-resource directions. Next, the model is further trained on all available corpora to transfer knowledge from high-resource languages (HRLs) to low-resource languages (LRLs). Experimental results show that HLT-MT outperforms various strong baselines on the WMT-10 and OPUS-100 benchmarks. Furthermore, analytic experiments validate the effectiveness of our method in mitigating negative interference in multilingual training.

Comment: 7 pages, 7 figures, IJCAI-ECAI 2022
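To make the abstract's two-stage, language-specific idea concrete, here is a minimal sketch, not the authors' released implementation: a decoder with shared bottom layers and per-target-language modules at the top, selected by a target-language id. The class name LangSpecificDecoder, its arguments, and the plain linear layers standing in for full Transformer decoder layers are assumptions made purely for illustration.

```python
# Minimal sketch (assumed names, not the HLT-MT code) of a shared decoder
# whose top layer is a language-specific module selected per target language.
import torch
import torch.nn as nn

class LangSpecificDecoder(nn.Module):
    def __init__(self, hidden_dim: int, num_shared_layers: int, num_langs: int):
        super().__init__()
        # Shared bottom layers: simple feed-forward blocks stand in for
        # Transformer decoder layers to keep the sketch short.
        self.shared = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
            for _ in range(num_shared_layers)
        )
        # Language-specific top modules, one per target language.
        self.lang_specific = nn.ModuleList(
            nn.Linear(hidden_dim, hidden_dim) for _ in range(num_langs)
        )

    def forward(self, x: torch.Tensor, tgt_lang_id: int) -> torch.Tensor:
        for layer in self.shared:
            x = layer(x)
        # Selection mechanism: route the shared representation through the
        # module associated with the target language.
        return self.lang_specific[tgt_lang_id](x)

# Stage 1 would train only on high-resource pairs (updating the shared stack
# and their language-specific modules); stage 2 continues on all corpora so
# low-resource directions inherit the shared knowledge.
if __name__ == "__main__":
    decoder = LangSpecificDecoder(hidden_dim=512, num_shared_layers=5, num_langs=10)
    dummy = torch.randn(2, 8, 512)       # (batch, length, hidden)
    out = decoder(dummy, tgt_lang_id=3)  # route through one target language's module
    print(out.shape)                     # torch.Size([2, 8, 512])
```

The sketch only illustrates the routing: in the paper's setting, the second training stage is what transfers knowledge from high-resource to low-resource directions, while the language-specific top modules keep high-resource translation quality from degrading.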

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2207.04906
Document Type :
Working Paper
Full Text :
https://doi.org/10.24963/ijcai.2022/619