
Random Coordinate Descent Methods for Nonseparable Composite Optimization

Authors:
Chorobura, Flavia
Necoara, Ion
Source:
SIAM Journal on Optimization. 2023, Vol. 33, Issue 3, p2160-2190. 31p.
Publication Year:
2023

Abstract

In this paper we consider large-scale composite optimization problems whose objective function is formed as a sum of two terms (possibly nonconvex): one has a (block) coordinatewise Lipschitz continuous gradient and the other is differentiable but nonseparable. Under these general settings we derive and analyze two new coordinate descent methods. The first algorithm, referred to as the coordinate proximal gradient method, exploits the composite form of the objective function, while the other algorithm disregards the composite form and uses the partial gradient of the full objective, yielding a coordinate gradient descent scheme with novel adaptive stepsize rules. We prove that these new stepsize rules make the coordinate gradient scheme a descent method, provided that additional assumptions hold for the second term in the objective function. We present a complete worst-case complexity analysis for these two new methods in both convex and nonconvex settings, provided that the (block) coordinates are chosen randomly or cyclically. Preliminary numerical results also confirm the efficiency of our two algorithms on practical problems. [ABSTRACT FROM AUTHOR]
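To illustrate the general idea behind such schemes, the following is a minimal sketch of a classical random coordinate gradient descent method with the standard stepsize 1/L_i derived from coordinatewise Lipschitz constants. It is a generic illustration only, not the paper's algorithms or its adaptive stepsize rules; the function names and the quadratic test problem are assumptions for demonstration.

```python
import numpy as np

def random_coordinate_descent(grad, x0, lipschitz, iters=1000, seed=0):
    """Minimize a smooth f by updating one randomly chosen coordinate
    per iteration.

    grad(x, i) returns the i-th partial derivative of f at x;
    lipschitz[i] is a Lipschitz constant for that partial derivative,
    giving the classical coordinate stepsize 1 / L_i.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    n = x.size
    for _ in range(iters):
        i = rng.integers(n)                # sample a coordinate uniformly
        x[i] -= grad(x, i) / lipschitz[i]  # coordinate gradient step
    return x

# Toy example: f(x) = 0.5 * x^T A x with a positive definite A,
# whose unique minimizer is x = 0.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x, i: A[i] @ x   # i-th partial derivative of f
L = np.diag(A)                 # L_i = A_ii for this quadratic
x_star = random_coordinate_descent(grad, [5.0, -3.0], L)
```

For this quadratic, the 1/A_ii step is an exact minimization along coordinate i, so the iterates converge linearly to the minimizer; a cyclic variant would simply replace the random index with `i = k % n`.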

Details

Language:
English
ISSN:
1052-6234
Volume:
33
Issue:
3
Database:
Academic Search Index
Journal:
SIAM Journal on Optimization
Publication Type:
Academic Journal
Accession Number:
173676840
Full Text:
https://doi.org/10.1137/22M148700X