
Convergence rates analysis of Interior Bregman Gradient Method for Vector Optimization Problems

Authors:
Chen, Jian
Tang, Liping
Yang, Xinmin
Publication Year:
2022

Abstract

In recent years, by using the Bregman distance, the assumptions of Lipschitz gradient continuity and strong convexity have been relaxed to relative smoothness and relative strong convexity. Under mild assumptions, gradient methods with Bregman regularity were shown to converge linearly for single-objective optimization problems (SOPs). In this paper, we extend relative smoothness and relative strong convexity to vector-valued functions and analyze the convergence of an interior Bregman gradient method for vector optimization problems (VOPs). Specifically, the global convergence rates are $\mathcal{O}(\frac{1}{k})$ for convex VOPs and $\mathcal{O}(r^{k})$, with $0<r<1$, for relatively strongly convex VOPs. Moreover, the proposed method converges linearly for VOPs that satisfy a vector Bregman-PL inequality.
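As background, the following is a sketch of the standard scalar-case definitions the abstract builds on; the vector-valued extensions and the exact update rule of the proposed method are given in the paper itself, so the step below is only an illustrative single-objective analogue. Given a differentiable convex kernel $h$, the Bregman distance is
$$D_h(x,y) = h(x) - h(y) - \langle \nabla h(y),\, x - y \rangle.$$
A function $f$ is $L$-smooth relative to $h$ if $Lh - f$ is convex, equivalently
$$f(x) \le f(y) + \langle \nabla f(y),\, x - y \rangle + L\, D_h(x,y),$$
and $\mu$-strongly convex relative to $h$ if $f - \mu h$ is convex. The corresponding Bregman gradient step with step size $\lambda > 0$ then reads
$$x^{k+1} = \operatorname*{arg\,min}_{x} \Big\{ \langle \nabla f(x^k),\, x \rangle + \tfrac{1}{\lambda}\, D_h(x, x^k) \Big\},$$
which recovers the ordinary gradient step when $h(x) = \tfrac{1}{2}\|x\|^2$.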

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2206.10070
Document Type:
Working Paper