
An ODE method to prove the geometric convergence of adaptive stochastic algorithms.

Authors:
Akimoto, Youhei
Auger, Anne
Hansen, Nikolaus
Source:
Stochastic Processes & Their Applications, Mar 2022, Vol. 145, p. 269-307. 39p.
Publication Year:
2022

Abstract

We consider stochastic algorithms derived from methods for solving deterministic optimization problems, especially comparison-based algorithms derived from stochastic approximation algorithms with a constant step-size. We develop a methodology for proving geometric convergence of the parameter sequence {θ_n}_{n ⩾ 0} of such algorithms. We employ the ordinary differential equation (ODE) method, which relates a stochastic algorithm to its mean ODE, along with a Lyapunov-like function Ψ such that the geometric convergence of Ψ(θ_n) implies, in the case of an optimization algorithm, the geometric convergence of the expected distance between the optimum and the search point generated by the algorithm. We provide two sufficient conditions for Ψ(θ_n) to decrease at a geometric rate: Ψ should decrease "exponentially" along the solution to the mean ODE, and the deviation between the stochastic algorithm and the ODE solution, measured by Ψ, should be bounded by Ψ(θ_n) times a constant. We also provide practical conditions under which the two sufficient conditions can be verified easily without knowing the solution of the mean ODE. Our results are any-time bounds on Ψ(θ_n), so we can deduce not only the asymptotic upper bound on the convergence rate but also the first hitting time of the algorithm. The main results are applied to a comparison-based stochastic algorithm with a constant step-size for optimization on continuous domains. [ABSTRACT FROM AUTHOR]
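To illustrate the kind of algorithm the abstract refers to, below is a minimal sketch (not taken from the paper) of a comparison-based adaptive algorithm with a constant step-size: a (1+1)-ES with the classical 1/5th success rule on the sphere function. The state θ = (m, σ) is adapted from comparisons only, and we monitor a Lyapunov-like quantity, here chosen as Ψ(θ) = ‖m‖ + σ, whose geometric decrease is the behavior the paper's methodology is designed to prove. The function name, the choice of Ψ, and the constant c = 0.85 are illustrative assumptions.

```python
import numpy as np

def one_plus_one_es(f, m, sigma, iters, rng, c=0.85):
    """(1+1)-ES with 1/5th success rule (illustrative sketch).

    f     : objective function (only comparisons of f-values are used)
    m     : current search point
    sigma : step-size, adapted with a constant multiplicative factor c
    """
    psi_trace = []
    for _ in range(iters):
        candidate = m + sigma * rng.standard_normal(m.shape)
        if f(candidate) <= f(m):   # comparison-based acceptance
            m = candidate
            sigma /= c**4          # success: increase step-size
        else:
            sigma *= c             # failure: decrease step-size
        # Lyapunov-like quantity Psi(theta) = ||m|| + sigma
        psi_trace.append(np.linalg.norm(m) + sigma)
    return m, sigma, psi_trace

rng = np.random.default_rng(0)
f = lambda x: float(np.dot(x, x))  # sphere function, optimum at 0
m0 = np.ones(5)
_, _, psi = one_plus_one_es(f, m0, 1.0, 600, rng)

# Psi shrinks by orders of magnitude over the run, i.e. roughly
# linearly on a log scale, consistent with geometric convergence.
print(psi[0], psi[-1])
```

Plotting `np.log(psi)` against the iteration index would show an approximately straight line, which is the "geometric rate" that the paper's any-time bounds on Ψ(θ_n) make precise.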

Details

Language:
English
ISSN:
0304-4149
Volume:
145
Database:
Academic Search Index
Journal:
Stochastic Processes & Their Applications
Publication Type:
Academic Journal
Accession Number:
154950176
Full Text:
https://doi.org/10.1016/j.spa.2021.12.005