
A General System of Differential Equations to Model First-Order Adaptive Algorithms.

Authors :
da Silva, André Belotto
Gazeau, Maxime
Source :
Journal of Machine Learning Research. 2020, Issue 119-145, p1-42. 42p.
Publication Year :
2020

Abstract

First-order optimization algorithms play a major role in large-scale machine learning. A new class of methods, called adaptive algorithms, was recently introduced to iteratively adjust the learning rate for each coordinate. Despite great practical success in deep learning, their behavior and performance on more general loss functions are not well understood. In this paper, we derive a non-autonomous system of differential equations, which is the continuous-time limit of adaptive optimization methods. We study the convergence of its trajectories and give conditions under which the differential system, underlying all adaptive algorithms, is suitable for optimization. We discuss convergence to a critical point in the non-convex case and give conditions for the dynamics to avoid saddle points and local maxima. For convex loss functions, we introduce a suitable Lyapunov functional which allows us to study the rate of convergence. Several other properties of both the continuous and discrete systems are briefly discussed. The differential system studied in the paper is general enough to encompass many other classical algorithms (such as Heavy Ball and Nesterov's accelerated method) and allows us to recover several known results for these algorithms. [ABSTRACT FROM AUTHOR]
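As a concrete illustration of the continuous-time viewpoint the abstract describes, the following sketch simulates the classical Heavy Ball ODE, x'' + a x' + ∇f(x) = 0, which the abstract notes is among the algorithms encompassed by the paper's general differential system. This is not the paper's general system; the friction coefficient `a`, step size `h`, and the quadratic loss are illustrative choices.

```python
import numpy as np

def grad_f(x):
    # Gradient of the convex quadratic f(x) = 0.5 * x^T A x, whose
    # unique minimizer is x* = 0.
    A = np.array([[3.0, 0.0], [0.0, 1.0]])
    return A @ x

def heavy_ball_trajectory(x0, a=3.0, h=1e-2, steps=5000):
    # Semi-implicit Euler discretization of the second-order ODE
    # x'' + a x' + grad f(x) = 0, rewritten as a first-order
    # system in (x, v) with velocity v = x'.
    x, v = x0.astype(float), np.zeros_like(x0, dtype=float)
    for _ in range(steps):
        v += h * (-a * v - grad_f(x))
        x += h * v
    return x

x_final = heavy_ball_trajectory(np.array([2.0, -1.5]))
print(np.linalg.norm(x_final))  # trajectory approaches the minimizer x* = 0
```

Discretizing the ODE with a fixed step recovers a momentum-style iteration, which is the sense in which such differential systems serve as continuous-time limits of first-order methods.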

Details

Language :
English
ISSN :
15324435
Issue :
119-145
Database :
Academic Search Index
Journal :
Journal of Machine Learning Research
Publication Type :
Academic Journal
Accession number :
145461074