
Cross-Lingual Language Model Meta-Pretraining

Authors:
Chi, Zewen
Huang, Heyan
Liu, Luyang
Bai, Yu
Mao, Xian-Ling
Publication Year:
2021

Abstract

The success of pretrained cross-lingual language models relies on two essential abilities, i.e., generalization ability for learning downstream tasks in a source language, and cross-lingual transferability for transferring the task knowledge to other languages. However, current methods jointly learn the two abilities in a single-phase cross-lingual pretraining process, resulting in a trade-off between generalization and cross-lingual transfer. In this paper, we propose cross-lingual language model meta-pretraining, which learns the two abilities in different training phases. Our method introduces an additional meta-pretraining phase before cross-lingual pretraining, where the model learns generalization ability on a large-scale monolingual corpus. Then, the model focuses on learning cross-lingual transfer on a multilingual corpus. Experimental results show that our method improves both generalization and cross-lingual transfer, and produces better-aligned representations across different languages.
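The abstract describes a two-phase training schedule: a meta-pretraining phase on a monolingual corpus, followed by cross-lingual pretraining on a multilingual corpus with the same model weights. The sketch below illustrates that schedule only; it assumes a standard masked-language-modeling objective in both phases, and the toy model, masking scheme, and synthetic corpora are placeholders rather than the authors' actual setup.

```python
# Minimal sketch of the two-phase schedule: phase 1 ("meta-pretraining")
# trains on a monolingual corpus, phase 2 ("cross-lingual pretraining")
# continues training the same weights on a multilingual corpus.
# Everything below is illustrative, not the paper's implementation.
import torch
import torch.nn as nn

VOCAB, MASK_ID = 1000, 1

class TinyMaskedLM(nn.Module):
    """A toy Transformer encoder with a masked-language-modeling head."""
    def __init__(self, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.lm_head = nn.Linear(d_model, VOCAB)

    def forward(self, input_ids):
        return self.lm_head(self.encoder(self.embed(input_ids)))

def mlm_phase(model, optimizer, batches, mask_prob=0.15):
    """One pretraining phase: standard MLM updates on the given corpus."""
    loss_fn = nn.CrossEntropyLoss(ignore_index=-100)
    model.train()
    for input_ids in batches:
        labels = input_ids.clone()
        mask = torch.rand_like(input_ids, dtype=torch.float) < mask_prob
        labels[~mask] = -100                       # only masked tokens contribute
        corrupted = input_ids.masked_fill(mask, MASK_ID)
        logits = model(corrupted)
        loss = loss_fn(logits.view(-1, VOCAB), labels.view(-1))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    return model

if __name__ == "__main__":
    torch.manual_seed(0)
    model = TinyMaskedLM()
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
    # Stand-ins for the real corpora (random token ids).
    monolingual = [torch.randint(2, VOCAB, (8, 32)) for _ in range(10)]
    multilingual = [torch.randint(2, VOCAB, (8, 32)) for _ in range(10)]
    # Phase 1: meta-pretraining for generalization on monolingual text.
    mlm_phase(model, optimizer, monolingual)
    # Phase 2: cross-lingual pretraining for transfer on multilingual text.
    mlm_phase(model, optimizer, multilingual)
```

The key design point, per the abstract, is that the two abilities are learned in separate phases rather than jointly in a single cross-lingual pretraining run; in the sketch this simply means calling the same training routine twice on different corpora without resetting the model.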

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2109.11129
Document Type:
Working Paper