
Quantized Minimum Error Entropy Criterion.

Authors :
Chen, Badong
Xing, Lei
Zheng, Nanning
Principe, Jose C.
Source :
IEEE Transactions on Neural Networks & Learning Systems. May 2019, Vol. 30, Issue 5, p. 1370-1380. 11p.
Publication Year :
2019

Abstract

Compared with traditional learning criteria such as the mean square error, the minimum error entropy (MEE) criterion is superior in nonlinear and non-Gaussian signal processing and machine learning. The argument of the logarithm in Rényi's entropy estimator, called the information potential (IP), is a popular MEE cost in information theoretic learning. The computational complexity of the IP is, however, quadratic in the number of samples due to a double summation, which creates a computational bottleneck, especially for large-scale data sets. To address this problem, in this paper we propose an efficient quantization approach that reduces the computational burden of the IP, decreasing the complexity from $O(N^{2})$ to $O(MN)$ with $M \ll N$. The new learning criterion is called the quantized MEE (QMEE). Some basic properties of QMEE are presented, and illustrative examples with linear-in-parameter models verify the excellent performance of QMEE.
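The complexity reduction described in the abstract can be sketched in a few lines: the plain IP estimator sums a kernel over all N² pairs of errors, while the quantized version replaces the inner sum over N samples with a weighted sum over a small codebook of M representative errors. Below is a minimal Python sketch under stated assumptions: a Gaussian kernel and a simple threshold-based online quantizer are used, and the helper names (gaussian_kernel, quantize_errors, quantized_ip) are illustrative, not taken from the paper itself.

    import numpy as np

    def gaussian_kernel(x, sigma):
        # Gaussian kernel used in the Parzen estimate of the error density.
        return np.exp(-(x ** 2) / (2.0 * sigma ** 2))

    def information_potential(errors, sigma):
        # Plain IP estimator: (1/N^2) * sum_i sum_j kappa(e_i - e_j).
        # The N x N difference matrix makes this O(N^2) in time and memory.
        diffs = errors[:, None] - errors[None, :]
        return float(gaussian_kernel(diffs, sigma).mean())

    def quantize_errors(errors, eps):
        # Simple online vector quantization (assumed here as a stand-in for
        # the paper's quantizer): merge each error into the nearest codeword
        # if it lies within eps, otherwise open a new codeword.
        # Returns the codebook and the count of samples mapped to each codeword.
        codebook, counts = [], []
        for e in errors:
            if codebook:
                dists = np.abs(np.asarray(codebook) - e)
                k = int(np.argmin(dists))
                if dists[k] <= eps:
                    counts[k] += 1
                    continue
            codebook.append(e)
            counts.append(1)
        return np.asarray(codebook), np.asarray(counts)

    def quantized_ip(errors, sigma, eps):
        # Quantized IP: (1/N^2) * sum_i sum_m counts[m] * kappa(e_i - q_m).
        # The inner sum now runs over M << N codewords, giving O(MN) cost.
        codebook, counts = quantize_errors(errors, eps)
        n = errors.shape[0]
        k = gaussian_kernel(errors[:, None] - codebook[None, :], sigma)
        return float((k * counts[None, :]).sum() / (n * n))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        e = rng.standard_normal(2000)
        print(information_potential(e, sigma=1.0))  # exact, O(N^2)
        print(quantized_ip(e, sigma=1.0, eps=0.1))  # approximation, O(MN)

With a small quantization threshold eps, the codebook stays close to the empirical error distribution and the quantized estimate tracks the exact IP closely, while the per-evaluation cost scales with the codebook size M rather than the sample count N.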

Details

Language :
English
ISSN :
2162-237X
Volume :
30
Issue :
5
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
136117569
Full Text :
https://doi.org/10.1109/TNNLS.2018.2868812