Stochastic Proximal Methods for Non-Smooth Non-Convex Constrained Sparse Optimization.
- Source :
- Journal of Machine Learning Research. 2021, Vol. 22, p1-36. 36p.
- Publication Year :
- 2021
Abstract
- This paper focuses on stochastic proximal gradient methods for optimizing a smooth non-convex loss function with a non-smooth non-convex regularizer and convex constraints. To the best of our knowledge, we present the first non-asymptotic convergence bounds for this class of problems. We present two simple stochastic proximal gradient algorithms, for general stochastic and finite-sum optimization problems. In a numerical experiment we compare our algorithms with the current state-of-the-art deterministic algorithm and find our algorithms to exhibit superior convergence. [ABSTRACT FROM AUTHOR]
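To illustrate the general technique named in the abstract (not the paper's specific algorithms, which are not reproduced in this record), the following is a minimal sketch of a stochastic proximal gradient iteration: a stochastic gradient step on the smooth loss, a proximal step on the non-smooth regularizer, then a projection onto the convex constraint set. For simplicity it uses an l1 regularizer and a box constraint as stand-ins; the paper's setting allows a non-convex regularizer, which would replace the soft-threshold prox.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||x||_1 (a simple non-smooth regularizer).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def project_box(x, lo=-1.0, hi=1.0):
    # Euclidean projection onto the convex box constraint [lo, hi]^d.
    return np.clip(x, lo, hi)

def stochastic_prox_grad(A, b, lam=0.1, step=0.01, epochs=50, seed=0):
    # Minimize (1/2n)||Ax - b||^2 + lam * ||x||_1 subject to x in [-1, 1]^d,
    # using single-sample stochastic gradients, a prox step, and a projection.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            g = (A[i] @ x - b[i]) * A[i]                  # stochastic gradient of the smooth loss
            x = soft_threshold(x - step * g, step * lam)  # proximal step on the regularizer
            x = project_box(x)                            # enforce the convex constraint
    return x
```

The three-part update (gradient, prox, projection) is the structural core of this family of methods; the paper's contribution concerns convergence guarantees for it in the non-convex, non-smooth regime.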
- Subjects :
- *CONSTRAINED optimization
- *NONSMOOTH optimization
- *ALGORITHMS
- *CONVEX functions
Details
- Language :
- English
- ISSN :
- 1532-4435
- Volume :
- 22
- Database :
- Academic Search Index
- Journal :
- Journal of Machine Learning Research
- Publication Type :
- Academic Journal
- Accession number :
- 155404605