
Analysis of Gradient Descent Methods With Nondiminishing Bounded Errors.

Authors :
Ramaswamy, Arunselvan
Bhatnagar, Shalabh
Source :
IEEE Transactions on Automatic Control. May 2018, Vol. 63 Issue 5, p1465-1471. 7p.
Publication Year :
2018

Abstract

The main aim of this paper is to provide an analysis of gradient descent (GD) algorithms with gradient errors that do not necessarily vanish asymptotically. In particular, sufficient conditions are presented for both stability (almost sure boundedness of the iterates) and convergence of GD with bounded (possibly nondiminishing) gradient errors. In addition to ensuring stability, such an algorithm is shown to converge to a small neighborhood of the minimum set, whose size depends on the gradient errors. It is worth noting that the main result of this paper can be used to show that GD with asymptotically vanishing errors indeed converges to the minimum set. The results presented herein are not only more general than previous results, but our analysis of GD with errors is, to the best of our knowledge, new to the literature. Our work extends the contributions of Mangasarian and Solodov, Bertsekas and Tsitsiklis, and Tadić and Doucet. Using our framework, a simple yet effective implementation of GD using simultaneous perturbation stochastic approximation, with constant sensitivity parameters, is presented. Another important improvement over many previous results is that no "additional" restrictions are imposed on the step sizes. In machine learning applications, where step sizes correspond to learning rates, our assumptions, unlike those of other papers, do not affect these learning rates. Finally, we present experimental results to validate our theory. [ABSTRACT FROM AUTHOR]
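To make the abstract's setting concrete, the following is a minimal Python sketch (not the authors' implementation) of the kind of algorithm described: GD driven by a two-measurement SPSA gradient estimate with a constant sensitivity parameter c, so the gradient estimate carries a bounded, nondiminishing error, while the step sizes diminish in the standard way. The objective function, starting point, and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Illustrative objective with minimum at x* = (1, -2).
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def spsa_gradient(f, x, c=0.1):
    # Two-measurement SPSA estimate with a CONSTANT sensitivity
    # parameter c (not annealed to zero), so the estimate has a
    # bounded, nondiminishing error relative to the true gradient.
    delta = rng.choice([-1.0, 1.0], size=x.shape)  # Rademacher perturbation
    return (f(x + c * delta) - f(x - c * delta)) / (2.0 * c * delta)

x = np.array([5.0, 5.0])
for n in range(1, 2001):
    a_n = 1.0 / (n + 10)          # standard diminishing step sizes
    x = x - a_n * spsa_gradient(f, x)

# The iterates settle into a small neighborhood of the minimum (1, -2),
# consistent with the convergence-to-a-neighborhood result in the paper.
```

Note the design choice the abstract highlights: only the step sizes a_n diminish, while c stays fixed, so each gradient estimate remains cheap (two function evaluations) at the cost of a persistent bounded error.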

Details

Language :
English
ISSN :
00189286
Volume :
63
Issue :
5
Database :
Academic Search Index
Journal :
IEEE Transactions on Automatic Control
Publication Type :
Periodical
Accession number :
130091916
Full Text :
https://doi.org/10.1109/TAC.2017.2744598