
Accelerating Wireless Federated Learning via Nesterov’s Momentum and Distributed Principal Component Analysis

Authors :
Dong, Yanjie
Wang, Luya
Wang, Jia
Hu, Xiping
Zhang, Haijun
Yu, Fei Richard
Leung, Victor C. M.
Source :
IEEE Transactions on Wireless Communications, 2024, Vol. 23, Issue 6, pp. 5938-5952
Publication Year :
2024

Abstract

A wireless federated learning system is investigated in which a server and multiple workers exchange uncoded information via orthogonal wireless channels. Since the workers frequently upload local gradients to the server over band-limited channels, the uplink transmission from the workers to the server becomes a communication bottleneck. Therefore, one-shot distributed principal component analysis (PCA) is leveraged to reduce the dimension of the uploaded gradients and relieve this bottleneck. A PCA-based wireless federated learning (PCA-WFL) algorithm and its accelerated version (i.e., PCA-AWFL) are proposed based on the low-dimensional gradients and Nesterov's momentum. For the non-convex empirical risk, a finite-time analysis is performed to quantify the impact of system hyper-parameters on the convergence of the PCA-WFL and PCA-AWFL algorithms. The PCA-AWFL algorithm is theoretically certified to converge faster than the PCA-WFL algorithm. Moreover, the convergence rates of the PCA-WFL and PCA-AWFL algorithms quantitatively reveal a linear speedup with respect to the number of workers over the vanilla gradient descent algorithm. Numerical results demonstrate the improved convergence of the proposed PCA-WFL and PCA-AWFL algorithms over the benchmarks.
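The abstract describes workers uploading PCA-compressed gradients that the server aggregates and combines with Nesterov's momentum. The sketch below illustrates only this high-level idea, not the authors' algorithm: all function and variable names are hypothetical, the "one-shot distributed PCA" is replaced by a single SVD on stand-in gradient samples, and channel effects are omitted.

```python
# Minimal sketch of a PCA-compressed, Nesterov-accelerated federated update,
# assuming a synthetic setup; names and structure are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
d, k, num_workers = 1000, 50, 8   # full gradient dim, reduced dim, workers

# Stand-in for the one-shot distributed PCA: a shared projection matrix U
# (d x k) is computed once from sample gradients before training starts.
samples = rng.standard_normal((200, d))
_, _, vt = np.linalg.svd(samples, full_matrices=False)
U = vt[:k].T                      # top-k principal directions

def local_gradient(w, worker_id):
    """Stand-in for a worker's stochastic gradient of the empirical risk."""
    return w + 0.1 * rng.standard_normal(w.shape)

w = rng.standard_normal(d)        # model parameters
v = np.zeros(d)                   # Nesterov momentum buffer
lr, beta = 0.05, 0.9

for t in range(100):
    # Workers evaluate gradients at the look-ahead point and upload only the
    # k-dimensional projections U^T g_i over the band-limited uplink.
    lookahead = w + beta * v
    compressed = [U.T @ local_gradient(lookahead, i) for i in range(num_workers)]

    # Server averages the low-dimensional gradients, lifts them back to
    # dimension d with U, and applies the Nesterov-accelerated update.
    g_hat = U @ np.mean(compressed, axis=0)
    v = beta * v - lr * g_hat
    w = w + v
```

With this setup, each worker transmits k = 50 values per round instead of d = 1000, which is the communication saving the abstract attributes to the PCA step; the momentum buffer is the acceleration ingredient distinguishing PCA-AWFL from PCA-WFL.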

Details

Language :
English
ISSN :
1536-1276 (print), 1558-2248 (electronic)
Volume :
23
Issue :
6
Database :
Supplemental Index
Journal :
IEEE Transactions on Wireless Communications
Publication Type :
Periodical
Accession number :
ejs66622385
Full Text :
https://doi.org/10.1109/TWC.2023.3329375