1. Randomized Submanifold Subgradient Method for Optimization over Stiefel Manifolds
- Authors
Cheung, Andy Yat-Ming; Wang, Jinxin; Yue, Man-Chung; So, Anthony Man-Cho
- Subjects
Mathematics - Optimization and Control
- Abstract
Optimization over Stiefel manifolds has found wide applications in many scientific and engineering domains. Despite considerable research effort, high-dimensional optimization problems over Stiefel manifolds remain challenging, and the situation is exacerbated by nonsmooth objective functions. The purpose of this paper is to propose and study a novel coordinate-type algorithm for weakly convex (possibly nonsmooth) optimization problems over high-dimensional Stiefel manifolds, named the randomized submanifold subgradient method (RSSM). Like coordinate-type algorithms in the Euclidean setting, RSSM has a low per-iteration cost and is therefore suitable for high-dimensional problems. We prove that RSSM converges to the set of stationary points and attains $\varepsilon$-stationary points, with respect to a natural stationarity measure, in $\mathcal{O}(\varepsilon^{-4})$ iterations in both the expectation and almost-sure senses. To the best of our knowledge, these are the first convergence guarantees for coordinate-type algorithms for optimizing nonconvex nonsmooth functions over Stiefel manifolds. An important technical tool in our convergence analysis is a new Riemannian subgradient inequality for weakly convex functions on proximally smooth matrix manifolds, which could be of independent interest.
- Published
- 2024