
Pretrained Models for Multilingual Federated Learning

Authors:
Weller, Orion
Marone, Marc
Braverman, Vladimir
Lawrie, Dawn
Van Durme, Benjamin
Publication Year:
2022

Abstract

Since the advent of Federated Learning (FL), research has applied these methods to natural language processing (NLP) tasks. Despite a plethora of papers in FL for NLP, no previous work has studied how multilingual text impacts FL algorithms. Furthermore, multilingual text provides an interesting avenue to examine the impact of non-IID text (e.g., different languages) on FL in naturally occurring data. We explore three multilingual language tasks (language modeling, machine translation, and text classification) using differing federated and non-federated learning algorithms. Our results show that using pretrained models reduces the negative effects of FL, helping them to perform near or better than centralized (no privacy) learning, even when using non-IID partitioning.

Comment: NAACL 2022
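As a rough illustration of the setting the abstract describes, the sketch below shows a generic FedAvg-style aggregation round in which each client holds only one language's data, i.e. a non-IID partition by language. This is not the paper's code: the client names, shard sizes, dummy weights, and the local-update stand-in are all hypothetical.

import numpy as np

def fedavg_round(client_weights, client_sizes):
    # FedAvg: average each parameter tensor across clients,
    # weighted by the number of local training examples.
    total = sum(client_sizes)
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(len(client_weights[0]))
    ]

# Hypothetical non-IID partition: one client per language,
# each holding only that language's examples.
clients = {"en": 5000, "de": 1200, "fi": 300}

# All clients start from the same weights; in the paper's setting these
# would come from a pretrained model (dummy zeros here).
global_weights = [np.zeros((4, 4)), np.zeros(4)]

def local_update(weights, rng):
    # Stand-in for local SGD on a client's monolingual shard.
    return [w + 0.01 * rng.standard_normal(w.shape) for w in weights]

rng = np.random.default_rng(0)
for rnd in range(3):
    updates = [local_update(global_weights, rng) for _ in clients]
    global_weights = fedavg_round(updates, list(clients.values()))
    print(f"round {rnd}: mean of first tensor = {np.mean(global_weights[0]):.5f}")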

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2206.02291
Document Type:
Working Paper