
Accelerated Log-Regularized Convolutional Transform Learning and Its Convergence Guarantee

Authors :
Haoli Zhao
Shengli Xie
Zuyuan Yang
Yongcheng Guo
Zhenni Li
Source :
IEEE Transactions on Cybernetics. 52:10785-10799
Publication Year :
2022
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2022.

Abstract

Convolutional transform learning (CTL), which learns filters by minimizing a data-fidelity loss in an unsupervised way, is becoming pervasive because it combines the best of both worlds: the benefits of unsupervised learning and the success of convolutional neural networks. There has been growing interest in developing efficient CTL algorithms. However, designing a CTL algorithm that is both convergent and accelerated while yielding accurate representations with proper sparsity remains an open problem. This article presents a new CTL framework with a log regularizer that obtains accurate representations and, at the same time, yields strong sparsity. To efficiently solve the resulting nonconvex composite optimization problem, we employ the proximal difference-of-convex algorithm (PDCA), which decomposes the nonconvex regularizer into the difference of two convex parts and then solves the resulting convex subproblems. Furthermore, we introduce an extrapolation technique to accelerate the algorithm, leading to a fast and efficient CTL algorithm. In particular, we provide a rigorous convergence analysis for the proposed algorithm under the accelerated PDCA. Experimental results demonstrate that the proposed algorithm converges more stably to desirable solutions with lower approximation error and stronger sparsity, and thus learns filters efficiently, while converging faster than existing CTL algorithms.
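To illustrate the kind of iteration the abstract describes, the following is a minimal sketch (not the authors' code) of a proximal difference-of-convex step with extrapolation, applied here to a generic log-regularized least-squares surrogate rather than the paper's convolutional transform model. All names (pdca_extrapolated, lam, eps) and the FISTA-style extrapolation schedule are illustrative assumptions; the paper's specific filter constraints, step-size rules, and restart schemes are omitted.

# Sketch: proximal DC algorithm (PDCA) with extrapolation for
#   min_x 0.5*||A x - b||^2 + lam * sum_i log(1 + |x_i| / eps).
# The log penalty is split as a difference of two convex functions,
#   log(1 + |t|/eps) = |t|/eps - (|t|/eps - log(1 + |t|/eps)),
# so each iteration reduces to a soft-thresholding (L1) proximal step.
import numpy as np


def pdca_extrapolated(A, b, lam=0.1, eps=0.1, n_iter=200):
    """Illustrative PDCA iteration with FISTA-style extrapolation."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth data term
    x_prev = x = np.zeros(n)
    t_prev = t = 1.0                          # extrapolation parameters
    for _ in range(n_iter):
        # Subgradient of the subtracted convex part
        # h(x) = lam * (|x|/eps - log(1 + |x|/eps)), evaluated at the current iterate.
        xi = lam * np.sign(x) * np.abs(x) / (eps * (eps + np.abs(x)))
        # Extrapolated point.
        beta = (t_prev - 1.0) / t
        y = x + beta * (x - x_prev)
        # Gradient of the smooth data-fidelity term at the extrapolated point.
        grad = A.T @ (A @ y - b)
        # Soft-thresholding step for the remaining convex L1 part g(x) = (lam/eps)*||x||_1.
        z = y - (grad - xi) / L
        thr = lam / (eps * L)
        x_prev, x = x, np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((64, 256))
    x_true = np.zeros(256)
    x_true[rng.choice(256, 8, replace=False)] = rng.standard_normal(8)
    b = A @ x_true + 0.01 * rng.standard_normal(64)
    x_hat = pdca_extrapolated(A, b)
    print("nonzeros:", np.sum(np.abs(x_hat) > 1e-6))

The point of the sketch is the structure of the update: a subgradient of the concave (subtracted) part is computed at the current iterate, the smooth gradient is taken at an extrapolated point, and the remaining convex piece is handled by a cheap proximal operator, which is the mechanism the paper's accelerated PDCA relies on.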

Details

ISSN :
2168-2275 and 2168-2267
Volume :
52
Database :
OpenAIRE
Journal :
IEEE Transactions on Cybernetics
Accession number :
edsair.doi.dedup.....fdc8e245ca6203582fb854caf430b5e5