
Accelerating regularized tensor decomposition using the alternating direction method of multipliers with multiple Nesterov's extrapolations.

Authors :
Wang, Deqing
Hu, Guoqiang
Source :
Signal Processing. Sep 2024, Vol. 222.
Publication Year :
2024

Abstract

Tensor decomposition is an essential tool for multiway signal processing. At present, large-scale high-order tensor data require fast and efficient decomposing algorithms. In this paper, we propose accelerated regularized tensor decomposition algorithms using the alternating direction method of multipliers with multiple Nesterov's extrapolations in the block coordinate descent framework. We implement the acceleration in three cases: only in the inner loop, only in the outer loop, and in both the inner and outer loops. Adaptive safeguard strategies are developed following the acceleration to guarantee monotonic convergence. Afterwards, we utilize the proposed algorithms to accelerate two types of conventional decomposition: nonnegative CANDECOMP/PARAFAC (NCP) and sparse CANDECOMP/PARAFAC (SCP). The experimental results on synthetic and real-world tensors demonstrate that the proposed algorithms achieve significant acceleration effects and outperform state-of-the-art algorithms. The accelerated algorithm with extrapolations in both the inner and outer loops has the fastest convergence speed and takes almost one-third of the running time of typical algorithms.

• Accelerate regularized tensor decomposition using ADMM with multiple extrapolations.
• Extrapolations are utilized in three cases: inner loop, outer loop, and both loops.
• Adaptive safeguard strategies are developed to guarantee monotonic convergence.
• The accelerated algorithms converge in almost one-third of conventional running time.

[ABSTRACT FROM AUTHOR]
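The abstract outlines a general pattern: block-coordinate factor updates solved via ADMM, Nesterov extrapolation applied in the inner loop, the outer loop, or both, and an adaptive safeguard that rejects extrapolated steps so the objective stays monotonically non-increasing. The sketch below is a minimal, generic illustration of that extrapolation-plus-safeguard skeleton only; the callables `update_block` and `loss`, the restart rule, and all parameter choices are illustrative assumptions, not the authors' NCP/SCP implementations.

```python
import numpy as np

def extrapolated_bcd(update_block, loss, factors, n_iters=200, tol=1e-8):
    """Block-coordinate descent with Nesterov-style extrapolation and a
    safeguard restart. A generic sketch only; the paper applies this idea
    around regularized ADMM subproblem solves (inner loop, outer loop, or both)."""
    factors = [f.copy() for f in factors]
    prev_factors = [f.copy() for f in factors]
    prev_loss = loss(factors)
    t = 1.0
    for _ in range(n_iters):
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        w = (t - 1.0) / t_next                 # Nesterov extrapolation weight
        for n in range(len(factors)):
            # extrapolate block n from its two most recent iterates
            y = factors[n] + w * (factors[n] - prev_factors[n])
            prev_factors[n] = factors[n].copy()
            # update block n (e.g. one regularized ADMM solve) at the
            # extrapolated point y, keeping the other blocks fixed
            factors[n] = update_block(n, y, factors)
        cur_loss = loss(factors)
        if cur_loss > prev_loss:
            # safeguard: the extrapolated step increased the objective,
            # so roll back and restart the momentum (w becomes 0 next pass)
            factors = [f.copy() for f in prev_factors]
            t = 1.0
            continue
        if prev_loss - cur_loss < tol * max(prev_loss, 1.0):
            break
        prev_loss, t = cur_loss, t_next
    return factors
```

In the paper's setting, `update_block` would correspond to a nonnegative (NCP) or sparse (SCP) CP factor solve by ADMM, and the abstract indicates that the extrapolation can additionally be applied inside those ADMM iterations (the "inner loop" case), with the fastest variant using both inner- and outer-loop extrapolations.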

Details

Language :
English
ISSN :
0165-1684
Volume :
222
Database :
Academic Search Index
Journal :
Signal Processing
Publication Type :
Academic Journal
Accession number :
177652565
Full Text :
https://doi.org/10.1016/j.sigpro.2024.109532