
A hybrid training algorithm based on gradient descent and evolutionary computation.

Authors:
Xue, Yu
Tong, Yiling
Neri, Ferrante
Source:
Applied Intelligence; Sep2023, Vol. 53 Issue 18, p21465-21482, 18p
Publication Year:
2023

Abstract

Back propagation (BP) is widely used for parameter search in the fully-connected layers of many neural networks. Although BP can converge to a solution quickly, its gradient-based nature makes it prone to falling into local optima. Metaheuristics such as evolutionary computation (EC) techniques, being gradient-free, can offer strong global search capability thanks to their stochastic nature, but they tend to converge more slowly than BP. In this paper, a hybrid gradient descent search algorithm (HGDSA) is proposed for training the parameters of fully-connected neural networks. HGDSA first explores the search space extensively with an ensemble of gradient descent strategies and then uses BP as an exploitative local search operator. Moreover, a self-adaptive method that selects strategies and updates their learning rates has been designed and embedded in the global search operators to prevent stagnation in local optima. To verify the effectiveness of HGDSA, experiments were performed on eleven classification datasets. The results demonstrate that HGDSA possesses both powerful global and powerful local search abilities. Furthermore, the approach also appears promising on high-dimensional datasets. [ABSTRACT FROM AUTHOR]
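
To make the two-phase idea in the abstract concrete, the following minimal NumPy sketch illustrates it. This is not the authors' implementation: the toy logistic model, the particular strategy set (plain, momentum, and sign gradient descent), the population size, and the success-based adaptation of selection probabilities and learning rates are illustrative assumptions. Only the overall structure follows the abstract: an exploratory phase driven by an ensemble of gradient descent strategies with self-adaptive selection, followed by BP-style local refinement of the best candidate.

    # Illustrative sketch of the hybrid scheme described in the abstract,
    # not the paper's reference implementation. All specifics are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy binary-classification data and a single fully-connected layer.
    X = rng.normal(size=(200, 10))
    true_w = rng.normal(size=10)
    y = (X @ true_w > 0).astype(float)

    def loss_and_grad(w):
        """Logistic loss and its gradient for weight vector w."""
        z = np.clip(X @ w, -30, 30)
        p = 1.0 / (1.0 + np.exp(-z))
        loss = -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))
        grad = X.T @ (p - y) / len(y)
        return loss, grad

    # Ensemble of gradient-descent "strategies" (assumed set for illustration).
    def plain(w, g, lr, state):
        return w - lr * g, state

    def momentum(w, g, lr, state):
        v = 0.9 * state.get("v", 0.0) - lr * g
        state["v"] = v
        return w + v, state

    def sign_gd(w, g, lr, state):
        return w - lr * np.sign(g), state

    strategies = [plain, momentum, sign_gd]
    lrs = np.full(len(strategies), 0.1)        # per-strategy learning rates
    probs = np.full(len(strategies), 1.0 / 3)  # self-adaptive selection probabilities

    # Global phase: a small population explored with randomly selected strategies.
    pop = [rng.normal(scale=0.5, size=10) for _ in range(5)]
    states = [{} for _ in pop]
    for epoch in range(100):
        for i, w in enumerate(pop):
            k = rng.choice(len(strategies), p=probs)
            old_loss, g = loss_and_grad(w)
            w_new, states[i] = strategies[k](w, g, lrs[k], states[i])
            new_loss, _ = loss_and_grad(w_new)
            if new_loss < old_loss:
                # Reward a successful strategy: keep the move, select it more
                # often, and allow it a slightly larger step (capped).
                pop[i] = w_new
                probs[k] += 0.01
                lrs[k] = min(lrs[k] * 1.05, 1.0)
            else:
                # Penalise an unsuccessful strategy by shrinking its step size.
                lrs[k] *= 0.7
            probs = probs / probs.sum()

    # Local phase: refine the best individual with plain gradient descent (BP-style).
    best = min(pop, key=lambda w: loss_and_grad(w)[0])
    for _ in range(500):
        _, g = loss_and_grad(best)
        best = best - 0.1 * g

    print("final loss:", loss_and_grad(best)[0])

The adaptation step mirrors the abstract's self-adaptive idea: strategies that reduce the loss are selected more often and take larger steps, while unsuccessful ones have their learning rates shrunk, which is intended to keep the global phase from stagnating in a single local optimum.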

Details

Language:
English
ISSN:
0924-669X
Volume:
53
Issue:
18
Database:
Complementary Index
Journal:
Applied Intelligence
Publication Type:
Academic Journal
Accession Number:
172020497
Full Text:
https://doi.org/10.1007/s10489-023-04595-4