
Block coordinate proximal gradient methods with variable Bregman functions for nonsmooth separable optimization.

Authors :
Hua, Xiaoqin
Yamashita, Nobuo
Source :
Mathematical Programming; Nov 2016, Vol. 160, Issue 1/2, p1-32, 32p
Publication Year :
2016

Abstract

In this paper, we propose a class of block coordinate proximal gradient (BCPG) methods for solving large-scale nonsmooth separable optimization problems. The proposed BCPG methods are based on Bregman functions, which may vary at each iteration. These methods include many well-known optimization methods as special cases, such as the quasi-Newton method, the block coordinate descent method, and the proximal point method. For the proposed methods, we establish global convergence when the blocks are selected by the Gauss-Seidel rule. Further, under additional appropriate assumptions, we show that the convergence rate of the proposed methods is R-linear. We also present numerical results for a new BCPG method with variable kernels on a convex problem with separable simplex constraints. [ABSTRACT FROM AUTHOR]
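The flavor of a BCPG iteration can be illustrated with a minimal sketch (this is not the paper's algorithm, and the function names `bcpg_lasso` and `soft_threshold` are illustrative): fixing the Bregman kernel to the squared Euclidean norm, the block proximal step reduces to a gradient step followed by soft-thresholding for an l1 term, and blocks are swept in Gauss-Seidel (cyclic) order on a lasso-type problem.

```python
import numpy as np

def soft_threshold(v, t):
    # Prox of t*||.||_1 under the Euclidean Bregman kernel.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bcpg_lasso(A, b, lam, blocks, n_epochs=200):
    """Block coordinate proximal gradient sketch for
    min 0.5*||Ax - b||^2 + lam*||x||_1.

    Uses a fixed squared-Euclidean Bregman kernel, so each block
    update is a gradient step on that block followed by the prox
    (soft-thresholding); blocks are visited in Gauss-Seidel order.
    """
    x = np.zeros(A.shape[1])
    for _ in range(n_epochs):
        for idx in blocks:  # Gauss-Seidel sweep over the blocks
            A_i = A[:, idx]
            # Block Lipschitz constant of the gradient (spectral norm squared).
            L_i = np.linalg.norm(A_i, 2) ** 2
            grad_i = A_i.T @ (A @ x - b)
            x[idx] = soft_threshold(x[idx] - grad_i / L_i, lam / L_i)
    return x
```

Choosing a different (possibly iteration-dependent) kernel changes the metric of each block step; for instance, a kernel built from a Hessian approximation yields quasi-Newton-like block updates, which is the kind of flexibility the abstract describes.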

Details

Language :
English
ISSN :
0025-5610
Volume :
160
Issue :
1/2
Database :
Complementary Index
Journal :
Mathematical Programming
Publication Type :
Academic Journal
Accession number :
118668986
Full Text :
https://doi.org/10.1007/s10107-015-0969-z