
Convergence rate analysis of proximal gradient methods with applications to composite minimization problems.

Authors :
Sahu, D. R.
Yao, J. C.
Verma, M.
Shukla, K. K.
Source :
Optimization. Jan 2021, Vol. 70, Issue 1, p75-100. 26p.
Publication Year :
2021

Abstract

First-order methods such as the proximal gradient method, which use forward–backward splitting techniques, have proved very effective for solving nonsmooth convex minimization problems that arise in practical applications across fields such as machine learning and image processing. In this paper, we propose a few new forward–backward splitting algorithms that require fewer iterations to converge to an optimum. In addition, we derive convergence rates for the proposed formulations and show that these algorithms converge significantly faster than the traditional forward–backward algorithm. To demonstrate their practical applicability, we apply them to two real-world problems in machine learning and image processing: regression on high-dimensional datasets and image deblurring. Numerical experiments conducted on several publicly available real datasets verify the theoretical results. The results demonstrate the superiority of our algorithms over classical first-order methods in terms of accuracy, the number of iterations required to converge, and the rate of convergence. [ABSTRACT FROM AUTHOR]
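The record does not reproduce the paper's proposed algorithms, but the abstract's baseline, the traditional forward–backward (proximal gradient) method, can be illustrated on a standard composite problem. Below is a minimal sketch, assuming the lasso objective min_x 0.5*||Ax - b||^2 + lam*||x||_1 as the composite minimization problem; the names soft_threshold, forward_backward, lam, and step are illustrative, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def forward_backward(A, b, lam, step, n_iter=500):
    """Classical forward-backward splitting for the lasso:
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
    Each iteration takes a forward (gradient) step on the smooth
    term, then a backward (proximal) step on the nonsmooth term.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                          # forward step
        x = soft_threshold(x - step * grad, step * lam)   # backward step
    return x

# Usage on synthetic data; step <= 1/L with L = ||A||_2^2,
# the Lipschitz constant of the gradient of the smooth term.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = rng.standard_normal(20) * (rng.random(20) < 0.3)
b = A @ x_true + 0.01 * rng.standard_normal(50)
step = 1.0 / np.linalg.norm(A, 2) ** 2
x_hat = forward_backward(A, b, lam=0.1, step=step)
```

With this step-size choice, the classical iteration is known to achieve an O(1/k) rate on the objective values; the paper's contribution is accelerated variants with provably faster convergence.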

Details

Language :
English
ISSN :
0233-1934
Volume :
70
Issue :
1
Database :
Academic Search Index
Journal :
Optimization
Publication Type :
Academic Journal
Accession number :
147951250
Full Text :
https://doi.org/10.1080/02331934.2019.1702040