Fast Learning Algorithms for Feedforward Neural Networks.

Authors :
Minghu Jiang
Georges Gielen
Bo Zhang
Zhensheng Luo
Source :
Applied Intelligence; Jan/Feb2003, Vol. 18 Issue 1, p37, 18p, 4 Charts, 6 Graphs
Publication Year :
2003

Abstract

To improve the training speed of multilayer feedforward neural networks (MLFNN), we propose and explore two new fast backpropagation (BP) algorithms obtained: (1) by changing the error function, using the exponential attenuation (or bell impulse) function and the Fourier kernel function as alternatives; and (2) by introducing a hybrid conjugate-gradient algorithm with global optimization for a dynamic learning rate, to overcome the problems of conventional BP learning: getting stuck in local minima and slow convergence. Our experimental results demonstrate the effectiveness of the modified error functions, since training is faster than with existing fast methods. In addition, on real speech data our hybrid algorithm achieves a higher recognition rate than the Polak-Ribière conjugate-gradient and conventional BP algorithms, and requires less training time, is less complex, and is more robust than the Fletcher-Reeves conjugate-gradient and conventional BP algorithms. [ABSTRACT FROM AUTHOR]
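The abstract's first idea, speeding up BP by changing the error function, can be illustrated with a minimal sketch. The exact error functions from the paper are not given in this record, so the code below uses a hypothetical stand-in: a pluggable output-layer delta in a tiny 2-3-1 sigmoid network trained on XOR. The conventional squared-error delta carries a sigma'(net) factor that attenuates the gradient at saturated units; one common way a modified error function speeds training is by removing that factor. All names and formulas here are assumptions for illustration, not the paper's method.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Conventional BP output delta for squared error E = 0.5*(t - o)^2:
# includes the sigma'(net) = o*(1 - o) attenuation factor.
def mse_delta(t, o):
    return (t - o) * o * (1 - o)

# Hypothetical modified delta that drops the sigma'(net) factor
# (equivalent to a cross-entropy-style error function), avoiding
# flat-spot attenuation at saturated outputs.
def modified_delta(t, o):
    return t - o

def train(error_delta, epochs=5000, lr=0.5, seed=0):
    """Train a 2-3-1 sigmoid MLP on XOR; return final mean squared error."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)
    W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)          # hidden activations
        O = sigmoid(H @ W2 + b2)          # network outputs
        dO = error_delta(T, O)            # output-layer delta (pluggable)
        dH = (dO @ W2.T) * H * (1 - H)    # backpropagated hidden delta
        W2 += lr * H.T @ dO; b2 += lr * dO.sum(axis=0)
        W1 += lr * X.T @ dH; b1 += lr * dH.sum(axis=0)
    O = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((T - O) ** 2))
```

Swapping `mse_delta` for `modified_delta` changes only the output-layer delta; the rest of the backward pass is unchanged, which is what makes error-function variants cheap to adopt in an existing BP implementation.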

Details

Language :
English
ISSN :
0924-669X
Volume :
18
Issue :
1
Database :
Complementary Index
Journal :
Applied Intelligence
Publication Type :
Academic Journal
Accession number :
9964598
Full Text :
https://doi.org/10.1023/A:1020922701312