Constrained Stochastic Recursive Momentum Successive Convex Approximation
- Publication Year :
- 2024
Abstract
- We consider stochastic optimization problems with functional constraints, such as those arising in trajectory generation, sparse approximation, and robust classification. To this end, we put forth a recursive momentum-based accelerated successive convex approximation (SCA) algorithm. At each iteration, the proposed algorithm entails constructing convex surrogates of the stochastic objective and the constraint functions, and solving the resulting convex optimization problem. A recursive update rule is employed to track the gradient of the stochastic objective function, which contributes to variance reduction and hence accelerates the convergence of the algorithm. A key ingredient of the proof is a new parameterized version of the standard Mangasarian-Fromovitz Constraint Qualification, which allows us to bound the dual variables and hence obtain problem-dependent bounds on the rate at which the iterates approach an $\epsilon$-stationary point. Remarkably, the proposed algorithm achieves near-optimal stochastic first-order (SFO) complexity, nearly on par with that achieved by state-of-the-art stochastic optimization algorithms for solving unconstrained problems. As an example, we detail an obstacle-avoiding trajectory optimization problem that can be solved using the proposed algorithm, and show that its performance is superior to that of existing algorithms used for trajectory optimization. The performance of the proposed algorithm is also shown to be comparable to that of a specialized sparse classification algorithm applied to a binary classification problem.
- Comment: 32 pages, 4 figures, journal submission
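The recursive momentum update mentioned in the abstract is a STORM-style gradient tracker. A minimal sketch of that update rule, assuming the standard form $d_t = \nabla f(x_t;\xi_t) + (1-\beta)\,(d_{t-1} - \nabla f(x_{t-1};\xi_t))$ (function and parameter names here are illustrative, not from the paper):

```python
import numpy as np

def recursive_momentum_update(d_prev, grad_new, grad_old, beta):
    """STORM-style recursive momentum gradient tracker.

    d_prev   : previous gradient estimate d_{t-1}
    grad_new : stochastic gradient at the current iterate, grad f(x_t; xi_t)
    grad_old : stochastic gradient at the previous iterate evaluated on the
               SAME sample xi_t, grad f(x_{t-1}; xi_t)
    beta     : momentum parameter in (0, 1]; beta = 1 recovers plain SGD.

    Returns the new estimate d_t = grad_new + (1 - beta) * (d_prev - grad_old).
    The correction term (d_prev - grad_old) is what yields variance reduction.
    """
    return grad_new + (1.0 - beta) * (d_prev - grad_old)

# Illustrative use: tracking the gradient of a noisy quadratic f(x) = 0.5*||x||^2,
# where the stochastic gradient is x + noise.
rng = np.random.default_rng(0)
x_prev = np.array([1.0, -2.0])
x_curr = np.array([0.8, -1.6])
noise = rng.normal(scale=0.1, size=2)          # shared sample xi_t
d_prev = x_prev.copy()                          # previous estimate
d_curr = recursive_momentum_update(d_prev, x_curr + noise, x_prev + noise, beta=0.2)
```

Note that both `grad_new` and `grad_old` are evaluated on the same mini-batch, which is what distinguishes this tracker from a plain exponential moving average of gradients.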
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2404.11790
- Document Type :
- Working Paper