
Can Monolingual Pretrained Models Help Cross-Lingual Classification?

Authors :
Chi, Zewen
Dong, Li
Wei, Furu
Mao, Xian-Ling
Huang, Heyan
Publication Year :
2019

Abstract

Multilingual pretrained language models (such as multilingual BERT) have achieved impressive results for cross-lingual transfer. However, because model capacity is fixed while being shared across many languages, multilingual pretraining usually lags behind its monolingual counterparts. In this work, we present two approaches to improve zero-shot cross-lingual classification by transferring knowledge from monolingual pretrained models to multilingual ones. Experimental results on two cross-lingual classification benchmarks show that our methods outperform vanilla multilingual fine-tuning.
5 pages
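
The abstract does not spell out the two proposed approaches. As a minimal, hypothetical sketch of the general idea of transferring knowledge from a monolingual teacher to a multilingual student for zero-shot cross-lingual classification, the snippet below uses soft-label distillation on English training data; the model names, temperature, and loss weighting are illustrative assumptions, not the paper's exact recipe.

```python
# Hypothetical sketch: distill a monolingual (English) teacher into a
# multilingual student on English data; the student is later applied
# zero-shot to other languages. All hyperparameters are assumed.
import torch
import torch.nn.functional as F
from transformers import AutoModelForSequenceClassification, AutoTokenizer

NUM_LABELS = 3      # e.g. XNLI: entailment / neutral / contradiction
TEMPERATURE = 2.0   # softens the teacher distribution (assumed value)
ALPHA = 0.5         # mix between hard-label and distillation losses (assumed)

# Monolingual teacher and multilingual student, each with its own tokenizer.
teacher = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=NUM_LABELS)
student = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=NUM_LABELS)
teacher_tok = AutoTokenizer.from_pretrained("bert-base-cased")
student_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

teacher.eval()  # the teacher is frozen; only the student is updated
optimizer = torch.optim.AdamW(student.parameters(), lr=2e-5)

def distillation_step(texts, labels):
    """One training step on an English batch: the student fits the gold
    labels and mimics the teacher's temperature-scaled label distribution."""
    teacher_batch = teacher_tok(texts, padding=True, truncation=True,
                                return_tensors="pt")
    student_batch = student_tok(texts, padding=True, truncation=True,
                                return_tensors="pt")
    labels = torch.tensor(labels)

    with torch.no_grad():
        teacher_logits = teacher(**teacher_batch).logits

    student_logits = student(**student_batch).logits

    # Hard-label cross-entropy on the English training set.
    ce_loss = F.cross_entropy(student_logits, labels)
    # Soft-label KL divergence against the teacher's scaled outputs.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / TEMPERATURE, dim=-1),
        F.softmax(teacher_logits / TEMPERATURE, dim=-1),
        reduction="batchmean",
    ) * TEMPERATURE ** 2

    loss = ALPHA * ce_loss + (1 - ALPHA) * kd_loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()
```

After training, the multilingual student is evaluated directly on non-English test data without further fine-tuning, which is what "zero-shot cross-lingual classification" refers to in the abstract.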

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....b59a59334d5052f1364a5b449f098227