
A Novel Convergence Analysis for Algorithms of the Adam Family and Beyond

Authors:
Guo, Zhishuai
Xu, Yi
Yin, Wotao
Jin, Rong
Yang, Tianbao
Publication Year:
2021

Abstract

Why does the original analysis of Adam fail, yet Adam converges very well in practice on a broad range of problems? Some mysteries about Adam remain unraveled. This paper provides a novel non-convex analysis of Adam and its many variants to uncover some of these mysteries. Our analysis shows that an increasing or sufficiently large "momentum" parameter for the first-order moment, as used in practice, is enough to ensure that Adam and its many variants converge under a mild boundedness condition on the adaptive scaling factor of the step size. In contrast, the original problematic analysis of Adam uses a momentum parameter that decreases to zero, which is the key reason it diverges on some problems. To the best of our knowledge, this is the first time this gap between analysis and practice has been bridged. Our analysis also yields further insights for practical implementations of Adam; e.g., increasing the momentum parameter in a stage-wise manner, in accordance with a stage-wise decreasing step size, would help improve convergence. Our analysis of the Adam family is modular, so it can be (and has been) extended to other optimization problems, e.g., compositional, min-max, and bi-level problems. As an interesting yet non-trivial use case, we present an extension to non-convex min-max optimization that addresses a gap in the literature, where existing methods either require a large batch size or use double loops. Our empirical studies corroborate the theory and also demonstrate the effectiveness in solving min-max problems.
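To make the stage-wise recipe described in the abstract concrete, below is a minimal Python sketch (not taken from the paper) of an Adam-style loop in which the step size decreases and the first-moment parameter beta1 increases toward 1 from stage to stage. The function name `adam_stagewise`, the stage counts, and the factor-of-10 schedules are illustrative assumptions, and bias correction is omitted for brevity.

```python
import numpy as np

def adam_stagewise(grad_fn, x0, stages=3, steps_per_stage=1000,
                   lr0=1e-3, beta1_0=0.9, beta2=0.999, eps=1e-8):
    """Adam-style loop with an illustrative stage-wise schedule: at each
    new stage the step size decreases and the first-moment ("momentum")
    parameter beta1 moves closer to 1."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (momentum) estimate
    v = np.zeros_like(x)  # second-moment estimate (adaptive scaling factor)
    for s in range(stages):
        lr = lr0 / (10 ** s)                       # stage-wise decreasing step size
        beta1 = 1.0 - (1.0 - beta1_0) / (10 ** s)  # momentum parameter -> 1
        for _ in range(steps_per_stage):
            g = grad_fn(x)                         # stochastic gradient oracle
            m = beta1 * m + (1.0 - beta1) * g
            v = beta2 * v + (1.0 - beta2) * g * g
            x = x - lr * m / (np.sqrt(v) + eps)
    return x

# Example usage: minimize f(x) = 0.5 * ||x||^2 with noisy gradients.
rng = np.random.default_rng(0)
x_out = adam_stagewise(lambda x: x + 0.1 * rng.standard_normal(x.shape),
                       x0=np.ones(5))
print(x_out)  # entries should end up close to 0
```

The key contrast with the problematic original analysis is visible in the `beta1` line: here the momentum parameter grows toward 1 across stages rather than decaying to zero.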

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2104.14840
Document Type:
Working Paper