
Accelerated Convex Optimization with Stochastic Gradients: Generalizing the Strong-Growth Condition

Authors:
Valls, Víctor
Wang, Shiqiang
Jiang, Yuang
Tassiulas, Leandros
Publication Year:
2022

Abstract

This paper presents a sufficient condition for stochastic gradients not to slow down the convergence of Nesterov's accelerated gradient method. The new condition has the strong-growth condition by Schmidt & Roux as a special case, and it also allows us to (i) model problems with constraints and (ii) design new types of oracles (e.g., oracles for finite-sum problems such as SAGA). Our results are obtained by revisiting Nesterov's accelerated algorithm and are useful for designing stochastic oracles without changing the underlying first-order method.
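For context, the strong-growth condition of Schmidt & Roux is commonly stated as follows (the notation here is an assumption on our part, since the listing itself gives no formulas): the second moment of the stochastic gradient is bounded by the squared norm of the true gradient,

    \mathbb{E}_{\xi}\big[\,\|\nabla f(x;\xi)\|^{2}\,\big] \;\le\; \rho\,\|\nabla f(x)\|^{2} \qquad \text{for all } x,

where \rho \ge 1 and \nabla f(x;\xi) denotes the stochastic gradient returned by the oracle. Note that this forces every stochastic gradient to vanish wherever the true gradient does, which is why the condition is typically associated with interpolation-type problems.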
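To make the plug-in-oracle viewpoint concrete, here is a minimal Python sketch. It is not the paper's method: it is a textbook constant-momentum Nesterov iteration for smooth, strongly convex problems, fed by a hypothetical single-sample oracle on an interpolating least-squares instance (a setting where strong growth holds). All names and constants below are illustrative assumptions.

    import numpy as np

    def nesterov(grad_oracle, x0, step, beta, num_iters):
        # Constant-step, constant-momentum Nesterov iteration.
        # grad_oracle(y) may return an exact or a stochastic gradient at y.
        x, y = x0.copy(), x0.copy()
        for _ in range(num_iters):
            g = grad_oracle(y)                 # query the (stochastic) oracle
            x_next = y - step * g              # gradient step
            y = x_next + beta * (x_next - x)   # extrapolation (momentum) step
            x = x_next
        return x

    # Interpolating finite-sum least squares, f(x) = (1/2n) * ||Ax - b||^2,
    # with b = A @ x_star: every component gradient vanishes at the optimum,
    # so the strong-growth condition holds.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 10))
    x_star = rng.standard_normal(10)
    b = A @ x_star

    n = A.shape[0]
    mu = np.linalg.eigvalsh(A.T @ A / n)[0]   # strong-convexity constant
    L_max = np.max(np.sum(A**2, axis=1))      # per-sample smoothness; a safe
                                              # step-size scale under strong growth
    beta = (np.sqrt(L_max / mu) - 1) / (np.sqrt(L_max / mu) + 1)

    def single_sample_oracle(y):
        i = rng.integers(n)                   # pick one term of the finite sum
        return A[i] * (A[i] @ y - b[i])       # unbiased estimate of grad f(y)

    x_hat = nesterov(single_sample_oracle, np.zeros(10), 1.0 / L_max, beta, 5000)
    print(np.linalg.norm(x_hat - x_star))     # distance to the minimizer

Because the iteration only consumes grad_oracle, swapping in a different estimator (e.g., a SAGA-style variance-reduced oracle for finite sums) changes nothing in the optimization loop, which mirrors the abstract's point about designing stochastic oracles without changing the underlying first-order method.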

Details

Language:
English
Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....d43dc515cb86e7a23dd1bccab2b566fa