
Probabilistic learning rate scheduler with provable convergence

Authors:
Devapriya, Dahlia
Tholeti, Thulasi
Suresh, Janani
Kalyani, Sheetal
Publication Year:
2024

Abstract

Learning rate schedulers have shown great success in speeding up the convergence of learning algorithms in practice. However, their convergence to a minimum has not been proven theoretically. The difficulty mainly arises from the fact that, while traditional convergence analysis prescribes monotonically decreasing (or constant) learning rates, schedulers opt for rates that often increase and decrease across training epochs. In this work, we aim to bridge the gap by proposing a probabilistic learning rate scheduler (PLRS), which does not conform to the monotone-decrease condition yet comes with provable convergence guarantees. In addition to providing detailed convergence proofs, we also show experimental results where the proposed PLRS performs competitively with other state-of-the-art learning rate schedulers across a variety of datasets and architectures.
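The formal definition of PLRS is in the full paper; as a purely illustrative sketch (not the authors' construction), the Python snippet below shows one way a learning rate could be drawn at random around a decaying envelope, so that the rate may rise or fall between consecutive epochs even though it shrinks in expectation. The function name, the uniform sampling distribution, and all parameter values are assumptions made for illustration.

```python
import random

def probabilistic_lr(epoch, base_lr=0.1, decay=0.95, spread=0.5, rng=random):
    """Hypothetical probabilistic learning rate (illustration only).

    Samples the rate uniformly within a band around a geometrically
    decaying envelope, so individual epochs can see the rate increase
    while its expected value still decreases over training.
    """
    envelope = base_lr * decay ** epoch   # decaying mean learning rate
    low = envelope * (1.0 - spread)       # lower end of the sampling band
    high = envelope * (1.0 + spread)      # upper end of the sampling band
    return rng.uniform(low, high)         # non-monotone sample per epoch

if __name__ == "__main__":
    random.seed(0)
    for epoch in range(5):
        print(epoch, round(probabilistic_lr(epoch), 5))
```

Such a sampled rate violates the monotone-decrease assumption of classical convergence analyses epoch by epoch, which is exactly the kind of behavior the abstract says the paper's guarantees must accommodate.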

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2407.07613
Document Type:
Working Paper