101. Weakly-convex–concave min–max optimization: provable algorithms and applications in machine learning.
- Authors
- Rafique, Hassan; Liu, Mingrui; Lin, Qihang; Yang, Tianbao
- Subjects
- MATHEMATICAL optimization; SUBGRADIENT methods; NONSMOOTH optimization; DATA distribution; NONCONVEX programming; ALGORITHMS; MACHINE learning
- Abstract
Min–max problems have broad applications in machine learning, including learning with non-decomposable losses and learning with robustness to the data distribution. The convex–concave min–max problem is an active research topic, with efficient algorithms and sound theoretical foundations already developed. However, it remains a challenge to design provably efficient algorithms for non-convex min–max problems, with or without smoothness. In this paper, we study a family of non-convex min–max problems whose objective function is weakly convex in the variables of minimization and concave in the variables of maximization. We propose a proximally guided stochastic subgradient method and a proximally guided stochastic variance-reduced method for the non-smooth and smooth instances of this family, respectively. We analyse the time complexities of the proposed methods for finding a nearly stationary point of the outer minimization problem corresponding to the min–max problem. [ABSTRACT FROM AUTHOR]
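The proximally guided idea in the abstract can be sketched concretely: repeatedly add a quadratic proximal term centred at the current point, solve the resulting (strongly convex in x, concave in y) subproblem approximately with stochastic primal–dual subgradient steps, and recentre at the averaged inner iterate. The sketch below is a minimal NumPy illustration on an assumed toy robust-regression objective (min over x, max over the simplex, of a weighted max of absolute residuals); the function names, step sizes, and loop counts are illustrative choices, not the paper's exact algorithm or tuning.

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection of v onto the probability simplex.
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def pg_sgd_minmax(A, b, gamma=1.0, outer=50, inner=100, eta=0.05, seed=0):
    """Proximally guided stochastic subgradient sketch (illustrative) for
    min_x max_{y in simplex} sum_i y_i * |a_i . x - b_i|,
    which is convex (hence weakly convex) in x and linear (concave) in y."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x_bar = np.zeros(d)                            # proximal centre
    for _ in range(outer):
        x, y = x_bar.copy(), np.full(n, 1.0 / n)
        x_sum = np.zeros(d)
        for _ in range(inner):
            i = rng.integers(n)                    # sample one loss term
            r = A[i] @ x - b[i]
            gx = n * y[i] * np.sign(r) * A[i]      # unbiased subgradient in x
            gx += (x - x_bar) / gamma              # pull toward the proximal centre
            gy = np.abs(A @ x - b)                 # (full) gradient in y, for simplicity
            x = x - eta * gx                       # descent step in x
            y = project_simplex(y + eta * gy)      # projected ascent step in y
            x_sum += x
        x_bar = x_sum / inner                      # recentre at averaged inner iterate
    return x_bar
```

Averaging the inner iterates before recentring is what makes each proximal subproblem only need to be solved approximately; the outer sequence of centres then approaches a nearly stationary point of the outer minimization problem.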
- Published
- 2022