No learning rates needed: Introducing SALSA -- Stable Armijo Line Search Adaptation

Authors :
Kenneweg, Philip
Kenneweg, Tristan
Fumagalli, Fabian
Hammer, Barbara
Publication Year :
2024

Abstract

In recent studies, line search methods have been shown to significantly enhance the performance of conventional stochastic gradient descent techniques across various datasets and architectures, while rendering the otherwise critical choice of a learning rate schedule superfluous. In this paper, we identify problems with current state-of-the-art line search methods, propose enhancements, and rigorously assess their effectiveness. Furthermore, we evaluate these methods on datasets orders of magnitude larger and on more complex data domains than previously considered. More specifically, we enhance the Armijo line search method by speeding up its computation and by incorporating a momentum term into the Armijo criterion, making it better suited for stochastic mini-batching. Our optimization approach outperforms both the previous Armijo implementation and a tuned learning rate schedule for the Adam and SGD optimizers. Our evaluation covers a diverse range of architectures, such as Transformers, CNNs, and MLPs, as well as data domains, including NLP and image data. Our work is publicly available as a Python package, which provides a simple PyTorch optimizer.

Comment: published in IJCNN 2024. arXiv admin note: text overlap with arXiv:2403.18519
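To make the idea concrete, below is a minimal sketch of an Armijo backtracking line search step for SGD in which the reference loss in the Armijo condition is smoothed with an exponential moving average of past losses. This EMA rule is a hypothetical stand-in for the momentum term described in the abstract, and `armijo_sgd_step` with all its parameters is an illustrative assumption, not the actual API of the authors' package.

```python
import torch

def armijo_sgd_step(params, closure, init_lr=1.0, c=0.1, beta=0.9,
                    decay=0.5, max_backtracks=10, loss_ema=None):
    """One SGD step governed by a backtracking Armijo line search.

    `loss_ema` carries an exponential moving average of past losses;
    using it as the reference value in the Armijo condition (a
    hypothetical stand-in for the paper's momentum term) smooths the
    criterion across noisy mini-batches.
    """
    loss = closure()  # forward pass on the current mini-batch
    grads = torch.autograd.grad(loss, params)
    grad_sq = sum((g * g).sum() for g in grads).item()

    # Momentum-smoothed reference loss for the Armijo criterion.
    ema = loss.item() if loss_ema is None else beta * loss_ema + (1 - beta) * loss.item()

    step = init_lr
    with torch.no_grad():
        for _ in range(max_backtracks):
            for p, g in zip(params, grads):
                p.sub_(step * g)  # trial step along the negative gradient
            trial_loss = closure().item()
            # Armijo condition checked against the smoothed reference loss.
            if trial_loss <= ema - c * step * grad_sq:
                return trial_loss, ema  # sufficient decrease: accept the step
            for p, g in zip(params, grads):
                p.add_(step * g)  # undo the trial step
            step *= decay  # backtrack to a smaller step size
    return loss.item(), ema  # no step accepted; parameters left unchanged
```

A training loop would call this once per mini-batch with a closure that recomputes the loss on the current batch, threading the returned EMA back into the next call. Note that the abstract also mentions speeding up the line search computation itself; this sketch makes no attempt at that and only illustrates the momentum-modified criterion.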

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2407.20650
Document Type :
Working Paper