
Accelerated gradient algorithm for RBF neural network.

Authors:
Han, Hong-Gui
Ma, Miao-Li
Qiao, Jun-Fei
Source:
Neurocomputing. Jun 2021, Vol. 441, p237-247. 11p.
Publication Year:
2021

Abstract

Gradient-based algorithms are commonly used for training radial basis function neural networks (RBFNNs). However, one of the challenges in the training process is avoiding the vanishing gradient. To solve this problem, an accelerated gradient algorithm (AGA) is designed in this paper to improve the learning performance of RBFNN. First, an indirect detection mechanism, based on the instantaneous gradient decay rate (IGDR) and the instantaneous convergence rate (ICR), is developed to identify the vanishing gradient in the learning process. Second, an amplification gradient strategy (AGS), which increases the gradient value of the learning parameters, is designed to accelerate the learning speed of RBFNN. Third, an analysis of the AGA-based RBFNN (AGA-RBFNN) is given to guarantee its successful application. Finally, benchmark and real-world problems are used to illustrate the effectiveness of AGA-RBFNN; the results demonstrate its improved learning performance. [ABSTRACT FROM AUTHOR]
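The abstract's idea (detect a vanishing gradient via decay/convergence indicators, then amplify the gradient) can be sketched as follows. This is a minimal NumPy illustration, not the paper's method: the RBF centers are fixed, only output weights are trained, and the IGDR/ICR stand-ins, thresholds, and amplification factor are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task
X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = np.sin(X).ravel()

H = 8                                   # number of RBF hidden units
centers = np.linspace(-3, 3, H)         # fixed centers (simplification)
width = 1.0
w = rng.normal(scale=0.1, size=H)       # output weights (only trained params here)

def phi(X):
    # Gaussian radial basis activations, shape (n_samples, H)
    return np.exp(-((X - centers) ** 2) / (2 * width ** 2))

initial_loss = 0.5 * np.mean((phi(X) @ w - y) ** 2)

lr = 0.05
prev_grad_norm, prev_loss = None, None
for epoch in range(200):
    Phi = phi(X)
    err = Phi @ w - y
    loss = 0.5 * np.mean(err ** 2)
    grad = Phi.T @ err / len(X)         # gradient of the loss w.r.t. w
    gnorm = np.linalg.norm(grad)

    if prev_grad_norm is not None:
        igdr = gnorm / (prev_grad_norm + 1e-12)          # stand-in for IGDR
        icr = (prev_loss - loss) / (prev_loss + 1e-12)   # stand-in for ICR
        # If the gradient shrinks while the loss barely improves, treat it
        # as a vanishing gradient and amplify (illustrative rule, not the paper's).
        if igdr < 0.9 and icr < 1e-3:
            grad = grad * 2.0           # amplification factor is an assumption

    w -= lr * grad
    prev_grad_norm, prev_loss = gnorm, loss

final_loss = 0.5 * np.mean((phi(X) @ w - y) ** 2)
```

The amplification step only fires when both indicators stall, so ordinary training is untouched; the paper's actual detection mechanism and analysis are more involved than this sketch.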

Details

Language:
English
ISSN:
0925-2312
Volume:
441
Database:
Academic Search Index
Journal:
Neurocomputing
Publication Type:
Academic Journal
Accession Number:
149967748
Full Text:
https://doi.org/10.1016/j.neucom.2021.02.009