1. Methods with Local Steps and Random Reshuffling for Generally Smooth Non-Convex Federated Optimization
- Author
- Demidovich, Yury, Ostroukhov, Petr, Malinovsky, Grigory, Horváth, Samuel, Takáč, Martin, Richtárik, Peter, and Gorbunov, Eduard
- Subjects
- Mathematics - Optimization and Control, Computer Science - Machine Learning
- Abstract
- Non-convex Machine Learning problems typically do not adhere to the standard smoothness assumption. Based on empirical findings, Zhang et al. (2020b) proposed a more realistic generalized $(L_0, L_1)$-smoothness assumption, though it remains largely unexplored. Many existing algorithms designed for standard smooth problems need to be revised for this setting. However, in the context of Federated Learning, only a few works address it, and those rely on additional limiting assumptions. In this paper, we address this gap in the literature: we propose and analyze new methods with local steps, partial participation of clients, and Random Reshuffling without extra restrictive assumptions beyond generalized smoothness. The proposed methods are based on a proper interplay between clients' and server's stepsizes and on gradient clipping. Furthermore, we perform the first analysis of these methods under the Polyak-Łojasiewicz condition. Our theory is consistent with the known results for standard smooth problems, and our experimental results support the theoretical insights.
- Published
- 2024
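
For context, generalized $(L_0, L_1)$-smoothness (Zhang et al., 2020b) bounds the Hessian norm by the gradient norm, $\|\nabla^2 f(x)\| \le L_0 + L_1 \|\nabla f(x)\|$, which reduces to standard $L$-smoothness when $L_1 = 0$. The sketch below is a minimal illustration, not the authors' exact algorithms, of how the ingredients listed in the abstract fit together: sampled clients each run one Random Reshuffling epoch of clipped local steps with a client stepsize, and the server averages their displacements and applies its own stepsize. The objective, stepsizes, and clipping threshold are all placeholder choices.

```python
# Hypothetical sketch: federated loop with local steps, partial participation,
# Random Reshuffling, gradient clipping, and separate client/server stepsizes.
import numpy as np

rng = np.random.default_rng(0)

num_clients, dim, samples_per_client = 10, 5, 32
# Synthetic least-squares data per client (placeholder objective).
A = [rng.normal(size=(samples_per_client, dim)) for _ in range(num_clients)]
b = [rng.normal(size=samples_per_client) for _ in range(num_clients)]

def sample_grad(c, i, x):
    """Gradient of 0.5 * (a_i^T x - b_i)^2 for sample i of client c."""
    a = A[c][i]
    return (a @ x - b[c][i]) * a

def clip(g, tau):
    """Standard gradient clipping: rescale g to norm at most tau."""
    norm = np.linalg.norm(g)
    return g if norm <= tau else (tau / norm) * g

def fed_rr_round(x, participating, gamma_client, eta_server, tau):
    """One communication round: each sampled client runs one Random
    Reshuffling epoch of clipped local steps, then the server moves
    along the averaged client displacement with its own stepsize."""
    deltas = []
    for c in participating:
        y = x.copy()
        # Random Reshuffling: one pass over a random permutation of the data.
        for i in rng.permutation(samples_per_client):
            y -= gamma_client * clip(sample_grad(c, i, y), tau)
        deltas.append(y - x)
    return x + eta_server * np.mean(deltas, axis=0)

x = np.zeros(dim)
for r in range(50):
    # Partial participation: sample a subset of clients each round.
    participating = rng.choice(num_clients, size=5, replace=False)
    x = fed_rr_round(x, participating, gamma_client=0.01,
                     eta_server=1.0, tau=1.0)

full_grad = sum(sample_grad(c, i, x)
                for c in range(num_clients)
                for i in range(samples_per_client)) / (num_clients * samples_per_client)
print("grad norm after training:", np.linalg.norm(full_grad))
```

The separation of `gamma_client` and `eta_server` mirrors the abstract's point that the two stepsizes must be tuned jointly with the clipping threshold `tau`; the specific values here are illustrative only.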