
Heterogeneous Ensemble Knowledge Transfer for Training Large Models in Federated Learning

Authors:
Cho, Yae Jee
Manoel, Andre
Joshi, Gauri
Sim, Robert
Dimitriadis, Dimitrios
Publication Year:
2022

Abstract

Federated learning (FL) enables edge devices to collaboratively learn a model without disclosing their private data to a central aggregating server. Most existing FL algorithms require models of identical architecture to be deployed across the clients and server, making it infeasible to train large models due to clients' limited system resources. In this work, we propose a novel ensemble knowledge transfer method named Fed-ET, in which small models (differing in architecture) are trained on clients and used to train a larger model at the server. Unlike in conventional ensemble learning, in FL the ensemble can be trained on clients' highly heterogeneous data. Cognizant of this property, Fed-ET uses a weighted consensus distillation scheme with diversity regularization that efficiently extracts a reliable consensus from the ensemble while improving generalization by exploiting the diversity within the ensemble. We derive a generalization bound for the ensemble of weighted models trained on heterogeneous datasets, which supports the intuition behind Fed-ET. Our experiments on image and language tasks show that Fed-ET significantly outperforms other state-of-the-art FL algorithms with fewer communicated parameters, and is also robust against high data heterogeneity.

Comment: To appear in the proceedings of the 31st International Joint Conference on Artificial Intelligence (IJCAI 2022)
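
As a rough illustration of the server-side step the abstract describes (distilling a weighted consensus from the client ensemble into a larger server model, with a diversity term), a minimal PyTorch sketch is given below. The confidence weighting, the specific diversity regularizer, and all function and parameter names here are illustrative assumptions, not the paper's actual formulation.

# Hypothetical sketch of one weighted-consensus distillation update, loosely
# following the abstract of Fed-ET. The weighting and diversity terms below are
# assumed stand-ins, not the exact scheme from the paper.
import torch
import torch.nn.functional as F

def consensus_distillation_step(server_model, optimizer, x_public,
                                client_logits, div_coeff=0.1, temperature=1.0):
    """One server-side update on a public/unlabeled batch x_public.

    client_logits: list of [batch, num_classes] tensors, one per client model.
    """
    stacked = torch.stack(client_logits)                 # [clients, batch, classes]
    probs = F.softmax(stacked / temperature, dim=-1)

    # Assumed confidence weighting: clients whose predictions deviate less from
    # the ensemble mean get higher weight, as a proxy for "reliable consensus".
    mean_probs = probs.mean(dim=0, keepdim=True)
    deviation = (probs - mean_probs).pow(2).sum(dim=-1)  # [clients, batch]
    weights = F.softmax(-deviation, dim=0)               # [clients, batch]
    consensus = (weights.unsqueeze(-1) * probs).sum(dim=0)  # [batch, classes]

    optimizer.zero_grad()
    student_log_probs = F.log_softmax(server_model(x_public) / temperature, dim=-1)

    # Distillation loss: match the weighted consensus of the ensemble.
    kd_loss = F.kl_div(student_log_probs, consensus, reduction="batchmean")

    # Assumed diversity regularizer: also expose the server model to the most
    # disagreeing ensemble member, exploiting diversity for generalization.
    diverse_idx = deviation.mean(dim=1).argmax()
    div_loss = F.kl_div(student_log_probs, probs[diverse_idx], reduction="batchmean")

    loss = kd_loss + div_coeff * div_loss
    loss.backward()
    optimizer.step()
    return loss.item()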

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2204.12703
Document Type:
Working Paper