Robust Federated Learning With Noisy Communication.

Authors :
Ang, Fan
Chen, Li
Zhao, Nan
Chen, Yunfei
Wang, Weidong
Yu, F. Richard
Source :
IEEE Transactions on Communications, Jun 2020, Vol. 68, Issue 6, p3452-3464. 13p.
Publication Year :
2020

Abstract

Federated learning is a communication-efficient training process that alternates between local training at the edge devices and averaging of the updated local models at the central server. Nevertheless, perfect acquisition of the local models over wireless links is impractical due to noise, which also seriously degrades federated learning. To tackle this challenge, in this paper we propose a robust design for federated learning that reduces the effect of noise. Considering the noise in the two aforementioned steps, we first formulate the training problem as a parallel optimization for each node under an expectation-based model and a worst-case model. Due to the non-convexity of the problem, a regularizer approximation method is proposed to make it tractable. For the worst-case model, we utilize a sampling-based successive convex approximation algorithm to develop a feasible training scheme that handles the unavailability of the noise maxima or minima and the non-convexity of the objective function. Furthermore, the convergence rates of both new designs are analyzed theoretically. Finally, simulations demonstrate that the proposed designs improve prediction accuracy and reduce the loss function value. [ABSTRACT FROM AUTHOR]
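The alternating process the abstract describes — local training at each device, noisy upload, then averaging at the server — can be sketched as follows. This is a minimal illustrative example, not the paper's robust design: the linear model, noise level, and all function names are assumptions, and the uplink noise is modeled as simple additive Gaussian perturbation of the uploaded weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_update(w, X, y, lr=0.1, steps=5):
    """A few gradient steps on the device's local least-squares loss."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def noisy_fedavg(w, datasets, noise_std=0.01, rounds=20):
    """Alternate local training and server averaging; each upload is noisy."""
    for _ in range(rounds):
        uploads = []
        for X, y in datasets:
            w_local = local_update(w.copy(), X, y)
            # additive noise stands in for the imperfect wireless uplink
            uploads.append(w_local + noise_std * rng.standard_normal(w.shape))
        w = np.mean(uploads, axis=0)  # server-side averaging step
    return w

# toy setup: two devices whose data share the true model w* = [1, -2]
w_true = np.array([1.0, -2.0])
datasets = []
for _ in range(2):
    X = rng.standard_normal((50, 2))
    datasets.append((X, X @ w_true))

w = noisy_fedavg(np.zeros(2), datasets)
```

With small noise the averaged model still approaches the true weights; the paper's contribution is precisely how to keep this behavior robust when the noise is significant or only worst-case bounds on it are known.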

Details

Language :
English
ISSN :
00906778
Volume :
68
Issue :
6
Database :
Academic Search Index
Journal :
IEEE Transactions on Communications
Publication Type :
Academic Journal
Accession number :
143858231
Full Text :
https://doi.org/10.1109/TCOMM.2020.2979149