Accelerated Doubly Stochastic Gradient Descent for Tensor CP Decomposition.
- Source :
- Journal of Optimization Theory & Applications; May 2023, Vol. 197, Issue 2, p665-704, 40p
- Publication Year :
- 2023
Abstract
- In this paper, we focus on accelerating the doubly stochastic gradient descent method for computing the CANDECOMP/PARAFAC (CP) decomposition of tensors. This optimization problem has N blocks, where N is the order of the tensor. Under the doubly stochastic framework, each block subproblem is solved by the vanilla stochastic gradient method. However, the convergence analysis requires the variance to converge to zero, which is hard to verify in practice and may not hold in some implementations. In this paper, we propose accelerating the stochastic gradient method with momentum and variance reduction, denoted DS-MVR. Theoretically, the convergence of DS-MVR requires only that the variance be bounded. Under mild conditions, we show that DS-MVR converges to a stochastic ε-stationary solution in Õ(N^{3/2} ε^{-3}) iterations with varying stepsizes and in O(N^{3/2} ε^{-3}) iterations with constant stepsizes. Numerical experiments on four real-world datasets show that the proposed algorithm achieves better results than the baselines. [ABSTRACT FROM AUTHOR]
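- To make the idea concrete, the sketch below shows one way a doubly stochastic update (random block plus random mini-batch of tensor entries) with a STORM-style momentum/variance-reduction estimator can look for a third-order CP model. This is not the paper's DS-MVR algorithm: the entry-sampling scheme, stepsize eta, momentum weight a, and all helper names are illustrative assumptions.

```python
# A minimal sketch of a doubly stochastic update (random block + random
# mini-batch) with a STORM-style momentum/variance-reduction estimator for
# a third-order CP model.  NOT the paper's DS-MVR: sampling scheme,
# stepsize `eta`, momentum weight `a`, and helper names are assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Small synthetic third-order tensor with rank-R CP structure plus noise.
I, J, K, R = 20, 20, 20, 3
A_true = [rng.standard_normal((d, R)) for d in (I, J, K)]
T = np.einsum('ir,jr,kr->ijk', *A_true) + 0.01 * rng.standard_normal((I, J, K))

def block_grad(factors, n, idx):
    """Mini-batch stochastic gradient of the squared loss, averaged over
    the sampled entries, with respect to factor block n."""
    i, j, k = idx
    pred = np.sum(factors[0][i] * factors[1][j] * factors[2][k], axis=1)
    resid = pred - T[i, j, k]
    rows_per_mode = (i, j, k)
    # Row-wise Khatri-Rao product of the two other factor blocks.
    others = [factors[m][rows_per_mode[m]] for m in range(3) if m != n]
    kr = others[0] * others[1]
    g = np.zeros_like(factors[n])
    np.add.at(g, rows_per_mode[n], resid[:, None] * kr)
    return g / len(i)

factors = [rng.standard_normal((d, R)) for d in (I, J, K)]
prev = [f.copy() for f in factors]           # previous iterate of each block
d_est = [np.zeros_like(f) for f in factors]  # variance-reduced estimators
eta, a, batch = 0.05, 0.1, 256               # assumed constant stepsize etc.

for t in range(2000):
    n = rng.integers(0, 3)  # "doubly stochastic": sample one of N = 3 blocks
    idx = tuple(rng.integers(0, dim, size=batch) for dim in (I, J, K))
    g_new = block_grad(factors, n, idx)
    # Gradient at the previous block-n iterate on the SAME sample; this
    # correction term keeps the estimator's variance bounded rather than
    # requiring it to vanish, as the abstract notes.
    old_factors = list(factors)
    old_factors[n] = prev[n]
    g_old = block_grad(old_factors, n, idx)
    d_est[n] = g_new + (1.0 - a) * (d_est[n] - g_old)
    prev[n] = factors[n].copy()
    factors[n] = factors[n] - eta * d_est[n]

pred = np.einsum('ir,jr,kr->ijk', *factors)
print('relative error:', np.linalg.norm(pred - T) / np.linalg.norm(T))
```

- With a = 1 the recursion reduces to plain SGD; smaller a retains more of the running estimate, which is the momentum effect combined with variance reduction that the abstract describes.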
- Subjects :
- ALGORITHMS
Details
- Language :
- English
- ISSN :
- 0022-3239
- Volume :
- 197
- Issue :
- 2
- Database :
- Complementary Index
- Journal :
- Journal of Optimization Theory & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 163800419
- Full Text :
- https://doi.org/10.1007/s10957-023-02193-5