1. Multikernel Correntropy for Robust Learning
- Authors
Pengju Ren, Zejian Yuan, Yuqing Xie, Badong Chen, Xin Wang, and Jing Qin
- Subjects
Signal processing, Computer science, Gaussian kernel, Multikernel, Pattern recognition, Similarity measure, Machine learning, Outlier, Linear combination, Random variable, Algorithms
- Abstract
As a novel similarity measure that is defined as the expectation of a kernel function between two random variables, correntropy has been successfully applied in robust machine learning and signal processing to combat large outliers. The kernel function in correntropy is usually a zero-mean Gaussian kernel. In a recent work, the concept of mixture correntropy (MC) was proposed to improve the learning performance, where the kernel function is a mixture Gaussian kernel, namely, a linear combination of several zero-mean Gaussian kernels with different widths. In both correntropy and MC, the center of the kernel function is, however, always located at zero. In the present work, to further improve the learning performance, we propose the concept of multikernel correntropy (MKC), in which each component of the mixture Gaussian kernel can be centered at a different location. The properties of the MKC are investigated and an efficient approach is proposed to determine the free parameters in MKC. Experimental results show that the learning algorithms under the maximum MKC criterion (MMKCC) can outperform those under the original maximum correntropy criterion (MCC) and the maximum MC criterion (MMCC).
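The idea in the abstract can be sketched numerically: correntropy is the expectation of a kernel of the error, and MKC generalizes the kernel to a weighted sum of Gaussian components whose centers need not be zero. The following is a minimal illustrative sketch (the function name, weights, centers, and widths are assumptions for the example, not the paper's parameter-selection procedure).

```python
import numpy as np

def multikernel_correntropy(e, alphas, centers, sigmas):
    """Empirical multikernel correntropy (MKC) of an error sample e = x - y.

    The kernel is a linear combination of Gaussian components, each with
    its own center c_m and width sigma_m. Correntropy uses one
    zero-centered Gaussian; mixture correntropy uses several zero-centered
    Gaussians with different widths; MKC additionally lets each component
    sit at a nonzero center.
    """
    e = np.asarray(e, dtype=float)[:, None]        # (N, 1) error samples
    a = np.asarray(alphas, dtype=float)[None, :]   # (1, M) mixture weights
    c = np.asarray(centers, dtype=float)[None, :]  # (1, M) component centers
    s = np.asarray(sigmas, dtype=float)[None, :]   # (1, M) component widths
    k = a * np.exp(-(e - c) ** 2 / (2.0 * s ** 2))  # per-sample, per-component
    return k.sum(axis=1).mean()                     # sample mean approximates the expectation

# Illustration: errors with a constant bias are matched better by a
# component centered near that bias than by a zero-centered kernel.
rng = np.random.default_rng(0)
err = 0.5 + 0.1 * rng.standard_normal(1000)
v_zero = multikernel_correntropy(err, [1.0], [0.0], [0.2])
v_shift = multikernel_correntropy(err, [1.0], [0.5], [0.2])
print(v_shift > v_zero)  # shifting the center toward the bias raises the similarity
```

Maximizing such a quantity over model parameters is the essence of the MMKCC learning criterion described above; the extra centers give the kernel freedom to track non-zero-mean error distributions.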
- Published
- 2022