
Discrete and Soft Prompting for Multilingual Models

Authors: Zhao, Mengjie; Schütze, Hinrich
Publication Year: 2021

Abstract

It has been shown for English that discrete and soft prompting perform strongly in few-shot learning with pretrained language models (PLMs). In this paper, we show that discrete and soft prompting perform better than finetuning in multilingual cases: crosslingual transfer and in-language training of multilingual natural language inference. For example, with 48 English training examples, finetuning obtains 33.74% accuracy in crosslingual transfer, barely surpassing the majority baseline (33.33%). In contrast, discrete and soft prompting outperform finetuning, achieving 36.43% and 38.79% accuracy, respectively. We also demonstrate good performance of prompting with training data in multiple languages other than English.

Comment: EMNLP 2021
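
To make the soft prompting setup concrete, below is a minimal, hypothetical sketch (not the authors' implementation), assuming Hugging Face transformers and xlm-roberta-base: a small number of trainable prompt embeddings are prepended to the input of a frozen multilingual PLM, and only those prompt vectors plus a classification head are tuned on the few-shot NLI examples.

```python
# Minimal sketch of soft prompting for multilingual NLI.
# Assumptions (not from the paper): Hugging Face transformers, the
# xlm-roberta-base checkpoint, 20 prompt tokens, and a simple linear head.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class SoftPromptNLI(nn.Module):
    def __init__(self, model_name="xlm-roberta-base", prompt_len=20, num_labels=3):
        super().__init__()
        self.plm = AutoModel.from_pretrained(model_name)
        for p in self.plm.parameters():          # keep the PLM frozen
            p.requires_grad = False
        hidden = self.plm.config.hidden_size
        # Trainable soft prompt vectors: the only tuned parameters
        # besides the classifier head in this sketch.
        self.soft_prompt = nn.Parameter(torch.randn(prompt_len, hidden) * 0.02)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        # Look up word embeddings only; position embeddings are added
        # inside the model when inputs_embeds is passed.
        tok_emb = self.plm.embeddings.word_embeddings(input_ids)   # (B, T, H)
        bsz = input_ids.size(0)
        prompt = self.soft_prompt.unsqueeze(0).expand(bsz, -1, -1)
        inputs_embeds = torch.cat([prompt, tok_emb], dim=1)        # prepend prompt
        prompt_mask = torch.ones(bsz, prompt.size(1),
                                 dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.plm(inputs_embeds=inputs_embeds, attention_mask=mask)
        cls = out.last_hidden_state[:, prompt.size(1)]   # hidden state at <s>
        return self.classifier(cls)

# Usage: encode a premise/hypothesis pair and get 3-way NLI logits.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = SoftPromptNLI()
batch = tokenizer("A man is eating.", "Someone is eating.",
                  return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
```

In few-shot training, only `soft_prompt` and `classifier` receive gradients, which is what distinguishes this setup from full finetuning of the PLM.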

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2109.03630
Document Type: Working Paper