
Gossip-based distributed stochastic mirror descent for constrained optimization.

Authors :
Fang, Xianju
Zhang, Baoyong
Yuan, Deming
Source :
Neural Networks. Jul 2024, Vol. 175.
Publication Year :
2024

Abstract

This paper considers a distributed constrained optimization problem over a multi-agent network in the non-Euclidean sense. The gossip protocol is adopted to relieve the communication burden and to adapt to the constantly changing topology of the network. Based on this idea, a gossip-based distributed stochastic mirror descent (GB-DSMD) algorithm is proposed to handle the problem under consideration. The performance of GB-DSMD with constant and diminishing step sizes is analyzed, respectively. When the step size is constant, an error bound between the optimal function value and the expected function value at the average iterate of the algorithm is derived. For the case of a diminishing step size, it is proved that the output of the algorithm uniformly approaches the optimal value with probability 1. Finally, as a numerical example, distributed logistic regression is reported to demonstrate the effectiveness of the GB-DSMD algorithm.
• A gossip-based distributed stochastic mirror descent (GB-DSMD) algorithm is proposed.
• In the proposed algorithm, an appropriate step size can be chosen by each agent based on local information.
• The impact of two different step sizes on the performance of the GB-DSMD algorithm is analyzed.
[ABSTRACT FROM AUTHOR]
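The kind of scheme the abstract describes can be sketched in a few lines. The following is a minimal illustration only, not the authors' algorithm: it assumes quadratic local losses, the probability simplex as the constraint set with the entropic mirror map (exponentiated gradient), pairwise uniformly random gossip, and a diminishing step size of the common 1/sqrt(t) form; the actual losses, Bregman divergence, gossip matrix, and step-size rules in the paper may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n agents, each holding a local quadratic loss
# f_i(x) = 0.5 * ||x - c_i||^2, with the probability simplex as the
# common constraint set.
n_agents, dim = 5, 4
centers = rng.random((n_agents, dim))

def mirror_step(x, grad, eta):
    """Entropic mirror descent step: the exponentiated-gradient update,
    which keeps the iterate on the probability simplex."""
    y = x * np.exp(-eta * grad)
    return y / y.sum()

# Every agent starts at the uniform distribution (simplex center).
x = np.full((n_agents, dim), 1.0 / dim)

for t in range(2000):
    # Gossip: a uniformly random pair of agents averages their iterates,
    # standing in for communication over a time-varying topology.
    i, j = rng.choice(n_agents, size=2, replace=False)
    x[i] = x[j] = 0.5 * (x[i] + x[j])

    # Each active agent takes a stochastic local gradient (true gradient
    # plus noise) with a diminishing step size eta_t = 1/sqrt(t+1).
    eta = 1.0 / np.sqrt(t + 1)
    for k in (i, j):
        grad = (x[k] - centers[k]) + 0.01 * rng.standard_normal(dim)
        x[k] = mirror_step(x[k], grad, eta)

# Average iterate across agents; it remains a point on the simplex.
x_bar = x.mean(axis=0)
```

Only the gossiping pair updates in each round, which mirrors the communication-saving motivation stated in the abstract; a constant step size would simply fix `eta` instead of shrinking it.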

Details

Language :
English
ISSN :
0893-6080
Volume :
175
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
177107883
Full Text :
https://doi.org/10.1016/j.neunet.2024.106291