
AdaXod: a new adaptive and momental bound algorithm for training deep neural networks.

Authors :
Liu, Yuanxuan
Li, Dequan
Source :
Journal of Supercomputing; Oct2023, Vol. 79 Issue 15, p17691-17715, 25p
Publication Year :
2023

Abstract

Adaptive algorithms are widely used in deep learning because of their fast convergence, and Adam is the most widely used among them. However, studies have shown that Adam's generalization ability is weak. AdaX, a variant of Adam, introduces a novel second-order momentum that modifies Adam's second-order moment estimate and achieves better generalization. Still, these algorithms may fail to converge because of instability and extreme learning rates during training. In this paper, we propose a new adaptive and momental bound algorithm, called AdaXod, which exponentially averages the learning rate and is particularly useful for training deep neural networks. By imposing an adaptive bound on the learning rate in the AdaX algorithm, the resultant AdaXod effectively eliminates excessively large learning rates in the later stages of neural network training and thus trains stably. We conduct extensive experiments on different datasets and verify the advantages of AdaXod by comparing it with other state-of-the-art adaptive optimization algorithms. AdaXod eliminates large learning rates during neural network training and outperforms other optimizers, especially on neural networks with complex structures, such as DenseNet. [ABSTRACT FROM AUTHOR]
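The abstract suggests AdaXod combines AdaX's modified second-order moment with an AdaMod-style exponential average and bound on the per-parameter learning rate. Below is a minimal NumPy sketch of one update step under that reading; the exact update rules, hyperparameter names (beta2, beta3), and default values are assumptions for illustration, not the authors' published pseudocode.

```python
# Hypothetical sketch of an AdaXod-style update, assembled from the abstract:
# AdaX's accumulating second moment plus AdaMod-style exponential averaging
# and clipping of the per-parameter learning rate. Details are assumed.
import numpy as np

def adaxod_step(theta, grad, state, lr=1e-3,
                beta1=0.9, beta2=1e-4, beta3=0.999, eps=1e-8):
    """One parameter update. `state` holds m, v, s, and step count t."""
    state["t"] += 1
    t = state["t"]

    # First moment: standard exponential moving average of the gradient.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad

    # Second moment in AdaX's accumulating form (assumed):
    #   v_t = (1 + beta2) * v_{t-1} + beta2 * g_t^2,
    # with bias correction (1 + beta2)^t - 1.
    state["v"] = (1 + beta2) * state["v"] + beta2 * grad**2
    v_hat = state["v"] / ((1 + beta2) ** t - 1)

    # Raw per-parameter learning rate, as in Adam-family methods.
    eta = lr / (np.sqrt(v_hat) + eps)

    # AdaMod-style bound: exponentially average past learning rates and
    # clip the current one against that average, suppressing the extreme
    # rates the abstract attributes to late-stage training.
    state["s"] = beta3 * state["s"] + (1 - beta3) * eta
    eta_hat = np.minimum(eta, state["s"])

    return theta - eta_hat * state["m"]

# Usage: initialize state to zeros with the parameter's shape.
theta = np.zeros(4)
state = {"m": np.zeros(4), "v": np.zeros(4), "s": np.zeros(4), "t": 0}
theta = adaxod_step(theta, grad=np.ones(4), state=state)
```

The clipping against a running average `s` is what makes the effective step size bounded and smooth over time, which matches the paper's claim of stable training without extreme learning rates.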

Details

Language :
English
ISSN :
0920-8542
Volume :
79
Issue :
15
Database :
Complementary Index
Journal :
Journal of Supercomputing
Publication Type :
Academic Journal
Accession number :
171101315
Full Text :
https://doi.org/10.1007/s11227-023-05338-5