
Universal Approximation by Using the Correntropy Objective Function

Authors :
Nayyeri, Mojtaba
Sadoghi Yazdi, Hadi
Maskooki, Alaleh
Rouhani, Modjtaba
Source :
IEEE Transactions on Neural Networks and Learning Systems; September 2018, Vol. 29, Issue 9, pp. 4515-4521 (7 pp.)
Publication Year :
2018

Abstract

Several objective functions have been proposed in the literature to adjust the input parameters of a node in constructive networks, and many researchers have studied the universal approximation capability of networks trained with these objective functions. In this brief, we use a correntropy measure based on the sigmoid kernel as the objective function for adjusting the input parameters of a newly added node in a cascade network. The proposed network is shown to approximate any continuous nonlinear mapping with probability one on a compact input sample space; thus, convergence is guaranteed. The performance of our method was compared with that of eight different objective functions, as well as with an existing one-hidden-layer feedforward network, on several real regression data sets with and without impulsive noise. The experimental results indicate the benefits of the correntropy measure in reducing the root mean square error and increasing robustness to noise.
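The abstract only states that a sigmoid-kernel correntropy measure is used as the objective for setting the input parameters of each newly added node; it does not give the formulation. The sketch below is a minimal illustration under stated assumptions: it uses the standard empirical correntropy estimate V(e, h) = (1/N) Σ κ(e_i, h_i) with a tanh-based sigmoid kernel κ(u, v) = tanh(a·u·v + c), and adjusts a candidate node's input weights to maximize correntropy with the current residual error, in the spirit of constructive/cascade training. The kernel parameters, the finite-difference gradient ascent, and all function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid_kernel(u, v, a=1.0, c=0.0):
    # Assumed sigmoid (tanh) kernel; the paper's exact kernel parameters are not given.
    return np.tanh(a * u * v + c)

def empirical_correntropy(e, h, a=1.0, c=0.0):
    # Empirical correntropy between residual error e and candidate node output h:
    # V(e, h) ~= (1/N) * sum_i kappa(e_i, h_i)
    return np.mean(sigmoid_kernel(e, h, a, c))

def candidate_output(X, w, b):
    # Candidate hidden node with tanh activation; (w, b) are its input parameters.
    return np.tanh(X @ w + b)

# Toy usage on synthetic data (illustration only): adjust the new node's input
# parameters by simple numerical gradient ascent on the correntropy objective.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))     # input samples
e = rng.normal(size=100)          # residual error of the current network (stand-in)
w, b = rng.normal(size=3), 0.0
lr, eps = 0.1, 1e-5
for _ in range(200):
    h = candidate_output(X, w, b)
    base = empirical_correntropy(e, h)
    grad_w = np.zeros_like(w)
    for j in range(len(w)):
        wp = w.copy()
        wp[j] += eps
        grad_w[j] = (empirical_correntropy(e, candidate_output(X, wp, b)) - base) / eps
    grad_b = (empirical_correntropy(e, candidate_output(X, w, b + eps)) - base) / eps
    w += lr * grad_w
    b += lr * grad_b
```

In an actual constructive network, the trained candidate node would then be frozen and appended to the cascade, and its output weight fitted to the residual; the optimizer and stopping rule used by the authors may differ from this sketch.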

Details

Language :
English
ISSN :
2162-237X (print) and 2162-2388 (electronic)
Volume :
29
Issue :
9
Database :
Supplemental Index
Journal :
IEEE Transactions on Neural Networks and Learning Systems
Publication Type :
Periodical
Accession number :
ejs46463916
Full Text :
https://doi.org/10.1109/TNNLS.2017.2753725