Restarting Frank–Wolfe: Faster Rates under Hölderian Error Bounds.

Authors :
Kerdreux, Thomas
d'Aspremont, Alexandre
Pokutta, Sebastian
Source :
Journal of Optimization Theory & Applications; Mar 2022, Vol. 192 Issue 3, p799-829, 31p
Publication Year :
2022

Abstract

Conditional gradient algorithms (aka Frank–Wolfe algorithms) form a classical set of methods for constrained smooth convex minimization due to their simplicity, the absence of projection steps, and competitive numerical performance. While the vanilla Frank–Wolfe algorithm only ensures a worst-case rate of O(1/ϵ), various recent results have shown that for strongly convex functions on polytopes, the method can be slightly modified to achieve linear convergence. However, this still leaves a huge gap between sublinear O(1/ϵ) convergence and linear O(log(1/ϵ)) convergence to reach an ϵ-approximate solution. Here, we present a new variant of conditional gradient algorithms that can dynamically adapt to the function's geometric properties using restarts and that smoothly interpolates between the sublinear and linear regimes. These interpolated convergence rates are obtained when the optimization problem satisfies a new type of error bound, which we call strong Wolfe primal bounds. They combine geometric information on the constraint set with Hölderian error bounds on the objective function. [ABSTRACT FROM AUTHOR]
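For readers unfamiliar with the vanilla method the paper builds on, the following is a minimal sketch of the classical Frank–Wolfe (conditional gradient) iteration, not the restarted variant proposed in the article. The problem data (a least-squares objective over the probability simplex), the linear minimization oracle, and the step size 2/(t+2) are standard textbook choices introduced here for illustration only.

```python
import numpy as np

def frank_wolfe(grad, lmo, x0, iters=500):
    """Vanilla Frank-Wolfe with the classical step size 2 / (t + 2).

    Each iteration calls a linear minimization oracle (lmo) over the
    constraint set instead of projecting, which is the method's hallmark.
    """
    x = x0.copy()
    for t in range(iters):
        g = grad(x)
        v = lmo(g)                  # argmin over the feasible set of <g, v>
        gamma = 2.0 / (t + 2.0)    # open-loop step size; no projection needed
        x = (1 - gamma) * x + gamma * v
    return x

# Illustrative smooth convex problem: f(x) = 0.5 * ||A x - b||^2
# over the probability simplex, with the optimum placed inside the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5)) / 4.0
b = A @ (np.ones(5) / 5)

grad = lambda x: A.T @ (A @ x - b)

def simplex_lmo(g):
    # Over the probability simplex, the LMO returns the vertex
    # (coordinate) with the smallest gradient entry.
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0
    return v

x0 = np.zeros(5)
x0[0] = 1.0                         # start at a vertex of the simplex
x = frank_wolfe(grad, simplex_lmo, x0)
f_val = 0.5 * np.linalg.norm(A @ x - b) ** 2
print(f_val)                        # small, reflecting the O(1/t) rate
```

The iterate stays feasible by construction, since each update is a convex combination of feasible points; this is what lets the method avoid projection steps entirely, as the abstract notes.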

Details

Language :
English
ISSN :
00223239
Volume :
192
Issue :
3
Database :
Complementary Index
Journal :
Journal of Optimization Theory & Applications
Publication Type :
Academic Journal
Accession number :
155719886
Full Text :
https://doi.org/10.1007/s10957-021-01989-7