Flexible Clustered Federated Learning for Client-Level Data Distribution Shift.

Authors :
Duan, Moming
Liu, Duo
Ji, Xinyuan
Wu, Yu
Liang, Liang
Chen, Xianzhang
Tan, Yujuan
Ren, Ao
Source :
IEEE Transactions on Parallel & Distributed Systems; Nov2022, Vol. 33 Issue 11, p2661-2674, 14p
Publication Year :
2022

Abstract

Federated Learning (FL) enables multiple participating devices to collaboratively contribute to a global neural network model while keeping the training data local. Unlike the centralized training setting, the non-IID, imbalanced (statistically heterogeneous), and distribution-shifted training data of FL are distributed across the federated network, which increases the divergence between the local models and the global model and further degrades performance. In this paper, we propose a flexible clustered federated learning (CFL) framework named FlexCFL, in which we 1) group the training of clients based on the similarities between the clients' optimization directions to lower training divergence; 2) implement an efficient newcomer-device cold-start mechanism for framework scalability and practicality; 3) flexibly migrate clients to meet the challenge of client-level data distribution shift. FlexCFL achieves improvements by dividing the joint optimization into groups of sub-optimizations, and it strikes a balance between accuracy and communication efficiency under distribution shift. The convergence and complexity are analyzed to demonstrate the efficiency of FlexCFL. We also evaluate FlexCFL on several open datasets and compare it with related CFL frameworks. The results show that FlexCFL significantly improves absolute test accuracy: +10.6% on FEMNIST compared with FedAvg, +3.5% on FashionMNIST compared with FedProx, +8.4% on MNIST compared with FeSEM, and +4.7% on Sentiment140 compared with IFCA. The experimental results also show that FlexCFL is communication efficient under distribution shift. [ABSTRACT FROM AUTHOR]
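To make the clustering idea in the abstract concrete, below is a minimal, hypothetical sketch of grouping clients by the cosine similarity of their local update directions, cold-starting a newcomer into the nearest group, and migrating a client whose direction drifts toward another group. This is not the authors' FlexCFL implementation; all function names, the k-means routine, and the migration margin are illustrative assumptions.

```python
# Hypothetical sketch, not the FlexCFL source code: cluster clients by the
# cosine similarity of their local update directions, cold-start newcomers,
# and migrate clients under distribution shift.
import numpy as np

def normalize(v, eps=1e-12):
    """Scale a vector to unit length so dot products equal cosine similarity."""
    return v / (np.linalg.norm(v) + eps)

def cluster_updates(updates, k, iters=20, seed=0):
    """K-means on unit-normalized client updates (cosine-style clustering).

    updates: (n_clients, dim) array of local model deltas.
    Returns (group assignments, group centroid directions)."""
    rng = np.random.default_rng(seed)
    X = np.array([normalize(u) for u in updates])
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each client to the group with the most similar direction.
        sims = X @ centroids.T  # rows: clients, cols: cosine similarity to each group
        assign = sims.argmax(axis=1)
        # Recompute each group's mean direction.
        for g in range(k):
            members = X[assign == g]
            if len(members):
                centroids[g] = normalize(members.mean(axis=0))
    return assign, centroids

def cold_start(new_update, centroids):
    """Place a newcomer in the group whose centroid best matches its update."""
    return int((normalize(new_update) @ centroids.T).argmax())

def maybe_migrate(update, current_group, centroids, margin=0.05):
    """Move a client to another group only if that group's centroid is clearly
    more similar, modeling a reaction to client-level distribution shift."""
    sims = normalize(update) @ centroids.T
    best = int(sims.argmax())
    if best != current_group and sims[best] - sims[current_group] > margin:
        return best
    return current_group

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two synthetic populations of clients whose updates point in different directions.
    updates = np.vstack([rng.normal([1, 0, 0], 0.1, (5, 3)),
                         rng.normal([0, 1, 0], 0.1, (5, 3))])
    assign, cents = cluster_updates(updates, k=2)
    print("groups:", assign)
    print("newcomer joins group", cold_start(rng.normal([0, 1, 0], 0.1, 3), cents))
```

In this reading, per-group aggregation then proceeds as in standard FL but restricted to each group's members, which is one plausible way to realize the paper's "groups of sub-optimizations".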

Details

Language :
English
ISSN :
1045-9219
Volume :
33
Issue :
11
Database :
Complementary Index
Journal :
IEEE Transactions on Parallel & Distributed Systems
Publication Type :
Academic Journal
Accession number :
157073371
Full Text :
https://doi.org/10.1109/TPDS.2021.3134263