
A New Correntropy-Based Conjugate Gradient Backpropagation Algorithm for Improving Training in Neural Networks.

Authors :
Heravi, Ahmad Reza
Abed Hodtani, Ghosheh
Source :
IEEE Transactions on Neural Networks & Learning Systems; Dec 2018, Vol. 29, Issue 12, p6252-6263, 12p
Publication Year :
2018

Abstract

Mean square error (MSE) is the most prominent criterion for training neural networks and has been employed in numerous learning problems. In this paper, we propose a group of novel, robust, information-theoretic backpropagation (BP) methods, termed correntropy-based conjugate gradient BP (CCG-BP). CCG-BP algorithms converge faster than common correntropy-based BP algorithms and outperform common MSE-based CG-BP algorithms, especially in non-Gaussian environments and in cases with impulsive noise or heavy-tailed noise distributions. In addition, a convergence analysis of this new type of method is provided. Numerical results for several examples of function approximation, synthetic function estimation, and chaotic time series prediction illustrate that the new BP method is more robust than the MSE-based method in the presence of impulsive noise, especially when the SNR is low. [ABSTRACT FROM AUTHOR]
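To illustrate the core idea described in the abstract, the following is a minimal sketch, not the authors' implementation, of a maximum-correntropy-style loss that could replace MSE in a BP training criterion. The Gaussian kernel width sigma, the toy error data, and the function names are illustrative assumptions; the paper's actual CCG-BP algorithms combine this kind of criterion with conjugate gradient weight updates.

```python
# Minimal sketch (illustrative, not the authors' CCG-BP code) of a
# correntropy-based loss versus MSE. The kernel width sigma and the
# synthetic error samples below are assumptions for demonstration only.
import numpy as np

def correntropy_loss(errors, sigma=1.0):
    """Negative mean Gaussian-kernel correntropy of the error samples.

    Maximizing correntropy is equivalent to minimizing this quantity.
    Large (impulsive) errors saturate the kernel, so outliers are
    down-weighted instead of dominating the objective as they do in MSE.
    """
    return -np.mean(np.exp(-errors**2 / (2.0 * sigma**2)))

def correntropy_grad(errors, sigma=1.0):
    """Gradient of the correntropy-based loss w.r.t. each error term."""
    k = np.exp(-errors**2 / (2.0 * sigma**2))
    return (errors / sigma**2) * k / errors.size

# Compare behaviour under impulsive noise: a single outlier blows up the
# MSE but barely changes the correntropy-based loss.
rng = np.random.default_rng(0)
e_clean = rng.normal(scale=0.1, size=100)
e_impulsive = e_clean.copy()
e_impulsive[0] = 50.0  # one impulsive error sample

print("MSE:      ", np.mean(e_clean**2), "->", np.mean(e_impulsive**2))
print("MCC loss: ", correntropy_loss(e_clean), "->", correntropy_loss(e_impulsive))
```

In a conjugate gradient BP scheme of the kind the paper describes, a gradient such as `correntropy_grad` (backpropagated through the network weights) would drive the search directions, with the kernel width sigma acting as a tunable robustness parameter.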

Details

Language :
English
ISSN :
2162-237X
Volume :
29
Issue :
12
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
133211394
Full Text :
https://doi.org/10.1109/TNNLS.2018.2827778