
Optimized Kernel Entropy Components.

Authors :
Izquierdo-Verdiguier, Emma
Laparra, Valero
Jenssen, Robert
Gomez-Chova, Luis
Camps-Valls, Gustau
Source :
IEEE Transactions on Neural Networks & Learning Systems; Jun 2017, Vol. 28, Issue 6, p1466-1472, 7p
Publication Year :
2017

Abstract

This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to sorting kernel eigenvectors by importance in terms of entropy instead of variance, as in kernel principal component analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular, it is based on the independent component analysis framework and introduces an extra rotation to the eigendecomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both methods is the selection of the kernel parameter, since it strongly affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. Both methods are illustrated on different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, that the most successful rule for estimating the kernel parameter is based on maximum likelihood, and that OKECA is more robust to the selection of the length-scale parameter in kernel density estimation. [ABSTRACT FROM PUBLISHER]
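The abstract describes the decomposition concretely enough that a small numerical sketch may help fix ideas. The following Python sketch is not the authors' code: the function names are hypothetical, the grid-search maximum-likelihood rule is one common reading of ML-based length-scale selection, and the OKECA step is simplified to optimizing a single unit direction by projected gradient ascent rather than a full rotation matrix. It assumes a Gaussian kernel and the Renyi quadratic entropy estimate V = (1/N^2) 1^T K 1.

```python
import numpy as np
from scipy.spatial.distance import cdist

def gaussian_kernel(X, Y, sigma):
    """Gaussian (RBF) kernel matrix between rows of X and Y."""
    return np.exp(-cdist(X, Y, "sqeuclidean") / (2.0 * sigma ** 2))

def ml_sigma(X, grid):
    """Pick the length-scale by maximum likelihood: the sigma in `grid`
    maximizing the leave-one-out log-likelihood of a Gaussian kernel
    density estimate (an illustrative reading of the ML rule)."""
    N, d = X.shape
    best, best_ll = None, -np.inf
    for s in grid:
        K = gaussian_kernel(X, X, s)
        np.fill_diagonal(K, 0.0)  # leave-one-out: drop self-contribution
        dens = K.sum(axis=1) / ((N - 1) * (np.sqrt(2 * np.pi) * s) ** d)
        ll = np.sum(np.log(dens + 1e-300))
        if ll > best_ll:
            best, best_ll = s, ll
    return best

def keca(X, sigma, n_components):
    """KECA: keep the kernel eigenpairs with the largest contribution
    to V = (1/N^2) 1^T K 1; the i-th term is (sqrt(l_i) e_i^T 1)^2 / N^2."""
    N = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    lam, E = np.linalg.eigh(K)          # ascending eigenvalues
    lam = np.clip(lam, 0.0, None)
    contrib = (np.sqrt(lam) * (E.T @ np.ones(N))) ** 2 / N ** 2
    idx = np.argsort(contrib)[::-1][:n_components]  # by entropy, not variance
    return E[:, idx] * np.sqrt(lam[idx])            # projections, shape (N, m)

def okeca_first_feature(X, sigma, m=10, lr=0.1, n_iter=200):
    """Toy OKECA-style step: rotate within the span of the top-m
    eigenfeatures so a single direction w captures as much of the
    entropy estimate as possible, via projected gradient ascent."""
    Phi = keca(X, sigma, m)             # N x m entropy eigenfeatures
    N = Phi.shape[0]
    s = Phi.T @ np.ones(N) / N          # V(w) = (s^T w)^2 up to constants
    w = np.random.default_rng(0).standard_normal(m)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        w += lr * 2.0 * (s @ w) * s     # gradient of (s^T w)^2
        w /= np.linalg.norm(w)          # renormalize: stay a rotation direction
    return Phi @ w                      # single entropy-compacting feature
```

A hypothetical usage: `sigma = ml_sigma(X, np.logspace(-1, 1, 20))` followed by `f = okeca_first_feature(X, sigma)` would yield one feature concentrating the bulk of the entropy estimate, consistent with the claim that OKECA compacts the information into one or two features. In this simplified one-direction case the optimum has a closed form (w proportional to s); the paper's gradient-ascent search applies to the full rotation it optimizes.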

Details

Language :
English
ISSN :
2162-237X
Volume :
28
Issue :
6
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
123183863
Full Text :
https://doi.org/10.1109/TNNLS.2016.2530403