
Non-IID Federated Learning With Sharper Risk Bound

Authors :
Wei, Bojian
Li, Jian
Liu, Yong
Wang, Weiping
Source :
IEEE Transactions on Neural Networks and Learning Systems; 2024, Vol. 35, Issue 5, pp. 6906-6917, 12p
Publication Year :
2024

Abstract

In federated learning (FL), non-independently and identically distributed (non-IID) data partitioning impairs the performance of the global model and remains a severe problem to be solved. Despite the extensive literature on algorithmic novelties and optimization analysis in FL, relatively little theoretical research has been devoted to the generalization performance of non-IID FL, which still lacks effective tools and analytical approaches. In this article, we propose weighted local Rademacher complexity to analyze the generalization properties of non-IID FL and derive a sharper excess risk bound based on weighted local Rademacher complexity, whose convergence rate is much faster than that of existing bounds. Based on these theoretical results, we present a general framework, federated averaging with local Rademacher complexity (FedALRC), which lowers the excess risk without additional communication costs compared with well-known methods such as FedAvg. Through extensive experiments, we show that FedALRC outperforms FedAvg, FedProx, and FedNova, and the experimental results coincide with our theoretical findings.
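The abstract positions FedALRC against FedAvg-style aggregation, in which the server averages client models weighted by local sample counts. The following minimal sketch shows that baseline aggregation step only; the function name and the list-of-arrays parameter representation are illustrative assumptions, and the FedALRC-specific risk-minimization details are not derivable from this record.

```python
# Minimal sketch of FedAvg-style weighted aggregation (the baseline that
# FedALRC is compared against). Clients are weighted in proportion to
# their local sample counts; names and data layout are illustrative.
from typing import List
import numpy as np

def fedavg_aggregate(client_params: List[List[np.ndarray]],
                     client_sizes: List[int]) -> List[np.ndarray]:
    """Return the sample-size-weighted average of per-client parameter lists."""
    total = sum(client_sizes)
    weights = [n / total for n in client_sizes]  # w_k = n_k / sum(n_j)
    aggregated = []
    # Average each layer's parameters across clients with those weights.
    for layer_idx in range(len(client_params[0])):
        layer = sum(w * params[layer_idx]
                    for w, params in zip(weights, client_params))
        aggregated.append(layer)
    return aggregated
```

With two clients holding 1 and 3 samples, the second client's parameters contribute three times the weight of the first in the averaged model.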

Details

Language :
English
ISSN :
2162-237X and 2162-2388
Volume :
35
Issue :
5
Database :
Supplemental Index
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Publication Type :
Periodical
Accession number :
ejs66332047
Full Text :
https://doi.org/10.1109/TNNLS.2022.3213187