
An accelerated proximal gradient method for multiobjective optimization.

Authors :
Tanabe, Hiroki
Fukuda, Ellen H.
Yamashita, Nobuo
Source :
Computational Optimization & Applications; Nov2023, Vol. 86 Issue 2, p421-455, 35p
Publication Year :
2023

Abstract

This paper presents an accelerated proximal gradient method for multiobjective optimization, in which each objective function is the sum of a continuously differentiable convex function and a closed, proper, convex function. Extending first-order methods to multiobjective problems without scalarization has been widely studied, but providing accelerated methods with rigorous proofs of convergence rates remains an open problem. The proposed method is a multiobjective generalization of the accelerated proximal gradient method, also known as the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA), for scalar optimization. The key to this extension is solving a subproblem with terms exclusive to the multiobjective case. This approach allows us to establish the global convergence rate O(1/k^2) of the proposed method, using a merit function to measure the complexity. Furthermore, we present an efficient way to solve the subproblem via its dual representation, and we confirm the validity of the proposed method through numerical experiments. [ABSTRACT FROM AUTHOR]
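For context, the scalar FISTA that the paper generalizes iterates a proximal gradient step on an extrapolated point with a momentum update. A minimal sketch, assuming the standard least-squares-plus-ℓ1 instance (the classical FISTA setting, not the paper's multiobjective subproblem; all names and the step-size choice here are illustrative):

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista(A, b, lam, step, iters=500):
    # Scalar FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # 'step' should be <= 1/L, where L is the largest eigenvalue of A^T A,
    # to guarantee the O(1/k^2) rate in objective values.
    x = np.zeros(A.shape[1])
    y = x.copy()          # extrapolated point
    t = 1.0               # momentum parameter
    for _ in range(iters):
        grad = A.T @ (A @ y - b)                             # gradient of the smooth part at y
        x_new = soft_threshold(y - step * grad, step * lam)  # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0     # momentum update
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)        # extrapolation
        x, t = x_new, t_new
    return x
```

In the multiobjective method of the paper, the soft-thresholding step is replaced by a subproblem involving all objectives at once, which the authors solve efficiently through its dual representation.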

Details

Language :
English
ISSN :
0926-6003
Volume :
86
Issue :
2
Database :
Complementary Index
Journal :
Computational Optimization & Applications
Publication Type :
Academic Journal
Accession number :
172361296
Full Text :
https://doi.org/10.1007/s10589-023-00497-w