
Multi-Consensus Decentralized Accelerated Gradient Descent.

Authors :
Haishan Ye
Luo Luo
Ziang Zhou
Tong Zhang
Source :
Journal of Machine Learning Research. 2023, Vol. 24, pp. 1-50.
Publication Year :
2023

Abstract

This paper considers the decentralized convex optimization problem, which has a wide range of applications in large-scale machine learning, sensor networks, and control theory. We propose novel algorithms that achieve optimal computation complexity and near-optimal communication complexity. Our theoretical results give an affirmative answer to the open problem of whether there exists an algorithm whose communication complexity (nearly) matches the lower bound depending on the global condition number rather than the local one. Furthermore, the linear convergence of our algorithms depends only on the strong convexity of the global objective and does not require the local functions to be convex. The design of our methods relies on a novel integration of well-known techniques, including Nesterov's acceleration, multi-consensus, and gradient tracking. Empirical studies show that our methods outperform existing ones in machine learning applications. [ABSTRACT FROM AUTHOR]
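To make the ingredients named in the abstract concrete, here is a minimal Python sketch of decentralized gradient descent with gradient tracking and K consensus (gossip) rounds per iteration. The ring network, least-squares objective, step size, and all names below are illustrative assumptions for exposition, not the paper's exact method (which additionally applies Nesterov's acceleration and a different tuning).

```python
import numpy as np

def multi_consensus(Z, W, K):
    """Approximate network averaging: apply the mixing matrix W K times."""
    for _ in range(K):
        Z = W @ Z
    return Z

def decentralized_gd_tracking(grads, x0, W, eta=0.01, K=5, iters=500):
    """grads[i](x) returns agent i's local gradient; W is doubly stochastic."""
    n = len(grads)
    X = np.tile(x0, (n, 1))                       # row i: agent i's iterate
    local = np.stack([grads[i](X[i]) for i in range(n)])
    G = local.copy()                              # gradient trackers
    for _ in range(iters):
        X = multi_consensus(X, W, K) - eta * G    # mix, then local step
        local_new = np.stack([grads[i](X[i]) for i in range(n)])
        # tracking update keeps G close to the network-average gradient
        G = multi_consensus(G, W, K) + local_new - local
        local = local_new
    return X.mean(axis=0)

# Toy usage: 8 agents on a ring, each holding a local least-squares term.
rng = np.random.default_rng(0)
n, d = 8, 5
A = [rng.standard_normal((20, d)) for _ in range(n)]
b = [rng.standard_normal(20) for _ in range(n)]
grads = [lambda x, Ai=A[i], bi=b[i]: Ai.T @ (Ai @ x - bi) for i in range(n)]

W = np.eye(n) / 2                                 # ring gossip matrix
for i in range(n):
    W[i, (i - 1) % n] += 0.25
    W[i, (i + 1) % n] += 0.25

x_hat = decentralized_gd_tracking(grads, np.zeros(d), W)
x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
print("distance to centralized solution:", np.linalg.norm(x_hat - x_star))
```

The multi-consensus step is the key trade-off the abstract alludes to: running several gossip rounds per iteration spends extra communication to mix iterates more thoroughly, which is how methods of this kind can make convergence depend on the global rather than the local condition number.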

Details

Language :
English
ISSN :
1532-4435
Volume :
24
Database :
Academic Search Index
Journal :
Journal of Machine Learning Research
Publication Type :
Academic Journal
Accession number :
176355510