
Quantized Distributed Nonconvex Optimization Algorithms with Linear Convergence under the Polyak--Łojasiewicz Condition

Authors:
Xu, Lei
Yi, Xinlei
Sun, Jiayue
Shi, Yang
Johansson, Karl H.
Yang, Tao
Publication Year:
2022
Publisher:
arXiv, 2022.

Abstract

This paper considers distributed optimization for minimizing the average of local nonconvex cost functions using local information exchange over undirected communication networks. To reduce the required communication capacity, we introduce an encoder--decoder scheme. By integrating it with distributed gradient tracking and distributed proportional-integral algorithms, respectively, we propose two quantized distributed nonconvex optimization algorithms. Assuming that the global cost function satisfies the Polyak--Łojasiewicz condition, which requires neither convexity of the global cost function nor uniqueness of the global minimizer, we show that both proposed algorithms converge linearly to a global optimal point and that a larger quantization level yields faster convergence. Moreover, we show that a low data rate is sufficient to guarantee linear convergence when the algorithm parameters are properly chosen. Numerical examples illustrate the theoretical results.
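To give a feel for the setting, the following is a minimal sketch of distributed gradient tracking in which agents exchange only uniformly quantized values. It is an illustration of the general idea, not the paper's algorithms: the static quantizer, the toy quadratic costs, the ring network, and the step size are all assumptions for this example, whereas the paper uses an encoder--decoder scheme to obtain exact linear convergence.

```python
import numpy as np

def quantize(x, level=4096, bound=8.0):
    """Uniform quantizer: snap each entry of x (clipped to [-bound, bound])
    onto one of `level` evenly spaced steps."""
    step = 2.0 * bound / level
    return step * np.round(np.clip(x, -bound, bound) / step)

# Hypothetical toy problem: agent i holds f_i(x) = 0.5 * (a_i * x - b_i)^2.
# The average cost is a strongly convex quadratic, so it satisfies the
# Polyak--Lojasiewicz condition in particular.
rng = np.random.default_rng(0)
n = 4
a = rng.uniform(1.0, 2.0, n)
b = rng.uniform(-1.0, 1.0, n)

def grad(i, xi):
    return a[i] * (a[i] * xi - b[i])

# Doubly stochastic mixing matrix for an undirected ring of 4 agents.
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0
    W[i, i] = 1.0 / 3.0

alpha = 0.05                                     # step size (tuning assumed)
x = np.zeros(n)                                  # local estimates
g = np.array([grad(i, x[i]) for i in range(n)])  # gradient trackers

for _ in range(600):
    qx, qg = quantize(x), quantize(g)      # only quantized values are exchanged
    x_new = x + W @ qx - qx - alpha * g    # consensus step + local descent
    g = g + W @ qg - qg + np.array(
        [grad(i, x_new[i]) - grad(i, x[i]) for i in range(n)]
    )                                      # track the average gradient
    x = x_new

x_star = (a @ b) / (a @ a)                 # minimizer of the average cost
print(x, x_star)
```

One design point worth noting: because W is doubly stochastic, the quantized consensus terms `W @ qg - qg` sum to zero across agents, so the trackers preserve the invariant that the sum of the `g` values equals the sum of the local gradients, even with quantization error. With a static quantizer the iterates only reach a neighborhood of the optimum whose size scales with the quantization step, which is precisely the gap the paper's encoder--decoder scheme closes.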

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....b141596d985c9d5194baeffe67d6dfff
Full Text:
https://doi.org/10.48550/arxiv.2207.08106