
Fed-ZERO: Efficient Zero-shot Personalization with Federated Mixture of Experts

Authors :
Dun, Chen
Garcia, Mirian Hipolito
Zheng, Guoqing
Awadallah, Ahmed Hassan
Sim, Robert
Kyrillidis, Anastasios
Dimitriadis, Dimitrios
Publication Year :
2023

Abstract

One of the goals in Federated Learning (FL) is to create personalized models that adapt to the context of each participating client while leveraging knowledge from a shared global model. Yet personalization often requires a fine-tuning step on clients' labeled data to achieve good performance, which may not be feasible when incoming clients are fresh and/or have privacy concerns. It thus remains open how to achieve zero-shot personalization in these scenarios. We propose a novel solution using a Mixture-of-Experts (MoE) framework within an FL setup. Our method leverages the diversity of the clients to train specialized experts on different subsets of classes, and a gating function to route each input to the most relevant expert(s). The gating function harnesses the knowledge of a pretrained model, used as a common expert, to enhance its routing decisions on the fly. As a highlight, our approach improves accuracy by up to 18% in state-of-the-art FL settings while maintaining competitive zero-shot performance. In practice, our method can handle non-homogeneous data distributions, scale more efficiently, and improve state-of-the-art performance on common FL benchmarks.

14 pages
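The abstract describes a gating function that routes each input to the most relevant expert(s) while drawing on a pretrained common expert. A minimal sketch of that routing pattern is below; the linear experts, the `common_weight` blending factor, and all parameter shapes are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM, N_EXPERTS, N_CLASSES = 8, 4, 10

# Hypothetical linear experts, each trained on a subset of classes.
experts = [rng.normal(size=(DIM, N_CLASSES)) for _ in range(N_EXPERTS)]
# Hypothetical pretrained "common expert" shared across clients.
common_expert = rng.normal(size=(DIM, N_CLASSES))
# Gating parameters: score each expert for a given input.
gate_w = rng.normal(size=(DIM, N_EXPERTS))

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def moe_predict(x, top_k=2, common_weight=0.5):
    """Route x to its top-k experts, then blend with the common expert."""
    scores = softmax(x @ gate_w)               # gating distribution over experts
    top = np.argsort(scores)[-top_k:]          # indices of most relevant experts
    weights = scores[top] / scores[top].sum()  # renormalize over the top-k
    expert_out = sum(w * (x @ experts[i]) for w, i in zip(weights, top))
    # Assumed blending: mix specialized-expert logits with the common expert's.
    return (1 - common_weight) * expert_out + common_weight * (x @ common_expert)

x = rng.normal(size=DIM)
logits = moe_predict(x)
print(logits.shape)  # (10,)
```

Because routing depends only on the gating scores, a fresh client needs no labeled data or fine-tuning at inference time, which is the zero-shot property the abstract highlights.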

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....e52e32c8bfe1d3f69750e132970ccbce