
On the Interplay Between Stepsize Tuning and Progressive Sharpening

Authors:
Roulet, Vincent
Agarwala, Atish
Pedregosa, Fabian
Publication Year:
2023

Abstract

Recent empirical work has revealed an intriguing property of deep learning models: the sharpness (largest eigenvalue of the Hessian) increases throughout optimization until it stabilizes around a critical value at which the optimizer operates at the edge of stability, given a fixed stepsize (Cohen et al., 2022). We investigate empirically how the sharpness evolves when using stepsize tuners, the Armijo linesearch and Polyak stepsizes, which adapt the stepsize along the iterations to local quantities such as, implicitly, the sharpness itself. We find that the surprisingly poor performance of a classical Armijo linesearch in the deterministic setting may be well explained by its tendency to ever-increase the sharpness of the objective. On the other hand, we observe that Polyak stepsizes generally operate at the edge of stability or even slightly beyond, outperforming their Armijo and constant-stepsize counterparts in the deterministic setting. We conclude with an analysis suggesting that unlocking stepsize tuners requires an understanding of the joint dynamics of the stepsize and the sharpness.

Comment: Presented at the NeurIPS 2023 OPT Workshop
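The two stepsize tuners discussed in the abstract can be sketched in a few lines. The following is a minimal toy illustration, not the paper's experimental setup: the objective, constants, and iteration counts are assumptions chosen so the sharpness (largest Hessian eigenvalue) varies along the trajectory.

```python
import numpy as np

# Toy objective f(x) = 0.25 * sum(x^4) + 0.5 * x^T A x, whose Hessian
# (and hence sharpness) changes along the optimization trajectory.

def f(x, A):
    return 0.25 * np.sum(x ** 4) + 0.5 * x @ A @ x

def grad(x, A):
    return x ** 3 + A @ x

def sharpness(x, A):
    # Sharpness = largest eigenvalue of the Hessian at x.
    H = np.diag(3.0 * x ** 2) + A
    return np.max(np.linalg.eigvalsh(H))

def armijo_step(x, A, eta0=1.0, c=1e-4, shrink=0.5):
    # Backtrack from eta0 until the Armijo sufficient-decrease condition
    # f(x - eta g) <= f(x) - c * eta * ||g||^2 holds.
    g = grad(x, A)
    eta = eta0
    while f(x - eta * g, A) > f(x, A) - c * eta * (g @ g):
        eta *= shrink
    return eta

def polyak_step(x, A, f_star=0.0):
    # Polyak stepsize (f(x) - f*) / ||grad f(x)||^2; here the optimal
    # value f* = 0 is known for this toy objective.
    g = grad(x, A)
    return (f(x, A) - f_star) / (g @ g)

A = np.diag([1.0, 10.0])        # anisotropic quadratic part; sharpness >= 10
x = np.array([2.0, 1.0])
for _ in range(200):
    eta = armijo_step(x, A)     # swap in polyak_step(x, A) for the Polyak variant
    x = x - eta * grad(x, A)

final_loss = f(x, A)
# At stepsize eta, gradient descent on a quadratic is stable only when
# sharpness <= 2 / eta -- the "edge of stability" threshold in the abstract.
```

Comparing the tuned stepsize `eta` against `2 / sharpness(x, A)` along the run is one way to see whether a tuner stays below, at, or beyond the edge of stability, which is the kind of diagnostic the abstract describes.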

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2312.00209
Document Type:
Working Paper