
Improving Accelerated Federated Learning with Compression and Importance Sampling

Authors:
Grudzień, Michał
Malinovsky, Grigory
Richtárik, Peter
Publication Year:
2023

Abstract

Federated Learning is a collaborative training framework that leverages heterogeneous data distributed across a vast number of clients. Since it is practically infeasible to request and process all clients during the aggregation step, partial participation must be supported. In this setting, the communication between the server and clients poses a major bottleneck. To reduce the communication load, there are two main approaches: compression and local steps. Recent work by Mishchenko et al. [2022] introduced the new ProxSkip method, which achieves an accelerated rate using the local steps technique. Follow-up works successfully combined local steps acceleration with partial participation [Grudzień et al., 2023; Condat et al., 2023] and gradient compression [Condat et al., 2022]. In this paper, we finally present a complete method for Federated Learning that incorporates all necessary ingredients: Local Training, Compression, and Partial Participation. We obtain state-of-the-art convergence guarantees in the considered setting. Moreover, we analyze the general sampling framework for partial participation and derive an importance sampling scheme, which leads to even better performance. We experimentally demonstrate the advantages of the proposed method in practice.

Comment: 33 pages, 3 algorithms, 1 figure
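The abstract names two communication-reduction ingredients: compression of client updates and importance-based client sampling. As a rough illustration only, the Python sketch below shows an unbiased rand-k compressor and a sampler that picks clients with probability proportional to a per-client smoothness constant L_i. The function names, the rand-k compressor, and the proportional-to-L_i probabilities are assumptions made for illustration; they are not taken from the paper and the paper's exact scheme may differ.

import numpy as np

def rand_k_compress(x, k, rng):
    # Unbiased rand-k sparsification: keep k random coordinates and
    # rescale by d/k so that E[C(x)] = x.
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)
    return out

def sample_clients(smoothness, m, rng):
    # Importance sampling of m clients (with replacement), with
    # probability proportional to a hypothetical per-client
    # smoothness constant L_i.
    p = np.asarray(smoothness, dtype=float)
    p = p / p.sum()
    chosen = rng.choice(len(p), size=m, replace=True, p=p)
    return chosen, p

rng = np.random.default_rng(0)
g = rng.standard_normal(8)               # a client's update vector
print(rand_k_compress(g, k=2, rng=rng))  # sparse, unbiased version of g

L = [1.0, 5.0, 2.0, 10.0]                # hypothetical smoothness constants
clients, p = sample_clients(L, m=2, rng=rng)
print(clients, p)                        # larger L_i are chosen more often

Under this kind of scheme, each sampled client's contribution would be reweighted by 1/(m p_i) at aggregation time so that the server-side estimate stays unbiased.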

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2306.03240
Document Type:
Working Paper