
A One-Class Kernel Fisher Criterion for Outlier Detection.

Authors :
Dufrenois, Franck
Source :
IEEE Transactions on Neural Networks & Learning Systems; May 2015, Vol. 26, Issue 5, p982-994, 13p
Publication Year :
2015

Abstract

Recently, Dufrenois and Noyer proposed a one-class Fisher linear discriminant to isolate normal data from outliers. In this paper, a kernelized version of their criterion is presented. Their criterion was originally optimized by an iterative process alternating between subspace selection and clustering; I show here that it admits an upper bound that makes these two problems independent. In particular, the estimation of the label vector is formulated as an unconstrained binary linear problem (UBLP), which can be solved by an iterative perturbation method. Once the label vector is estimated, an optimal projection subspace is obtained by solving a generalized eigenvalue problem. Like many other kernel methods, the performance of the proposed approach depends on the choice of the kernel. Using a Gaussian kernel, I show that the proposed contrast measure is an efficient indicator for selecting an optimal kernel width. This property simplifies the model selection problem, which is typically solved by costly (generalized) cross-validation procedures. Initialization, convergence analysis, and computational complexity are also discussed. Lastly, the proposed algorithm is compared with recent novelty detectors on synthetic and real data sets. [ABSTRACT FROM PUBLISHER]
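The two-stage pipeline the abstract describes — fix a binary label vector separating normal data from outliers, then obtain the projection by solving a generalized eigenvalue problem on kernel scatter matrices — can be sketched generically. The sketch below uses the standard kernel Fisher discriminant construction with a Gaussian kernel; it is illustrative only, not the paper's exact criterion, and the function names, regularization, and toy data are assumptions.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def gaussian_kernel(X, sigma):
    # Gaussian (RBF) kernel matrix: K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    D2 = cdist(X, X, "sqeuclidean")
    return np.exp(-D2 / (2.0 * sigma**2))

def kernel_fisher_direction(K, y, reg=1e-6):
    # Generic kernel Fisher discriminant (illustrative, not the paper's
    # exact one-class criterion): maximize between-class over within-class
    # scatter in the kernel-induced feature space.
    n = len(y)
    idx0, idx1 = np.where(y == 0)[0], np.where(y == 1)[0]
    m0 = K[:, idx0].mean(axis=1)
    m1 = K[:, idx1].mean(axis=1)
    M = np.outer(m1 - m0, m1 - m0)          # between-class scatter
    N = np.zeros((n, n))
    for idx in (idx0, idx1):
        Kc = K[:, idx]
        nc = len(idx)
        H = np.eye(nc) - np.full((nc, nc), 1.0 / nc)  # centering matrix
        N += Kc @ H @ Kc.T                  # within-class scatter
    N += reg * np.eye(n)                    # regularize so N is invertible
    # Generalized eigenproblem M a = lambda N a; keep the leading eigenvector.
    _, V = eigh(M, N)
    return V[:, -1]

# Toy data: a "normal" cluster plus a small outlier-like cluster.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 2)),
               rng.normal(5, 1, (10, 2))])
y = np.array([0] * 40 + [1] * 10)           # label vector assumed already estimated
K = gaussian_kernel(X, sigma=1.0)
alpha = kernel_fisher_direction(K, y)
proj = K @ alpha                            # 1-D projections of all samples
```

In the paper's setting the label vector is itself estimated (via the UBLP), whereas here it is taken as given; the sketch only shows the eigenvalue stage that follows.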

Details

Language :
English
ISSN :
2162-237X
Volume :
26
Issue :
5
Database :
Complementary Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
102120285
Full Text :
https://doi.org/10.1109/TNNLS.2014.2329534