
Effective Discriminative Feature Selection With Nontrivial Solution.

Authors :
Tao, Hong
Hou, Chenping
Nie, Feiping
Jiao, Yuanyuan
Yi, Dongyun
Source :
IEEE Transactions on Neural Networks & Learning Systems. Apr2016, Vol. 27 Issue 4, p796-808. 13p.
Publication Year :
2016

Abstract

Feature selection and feature transformation, the two main ways to reduce dimensionality, are often presented separately. In this paper, a feature selection method is proposed by combining the popular transformation-based dimensionality reduction method linear discriminant analysis (LDA) with sparsity regularization. We impose row sparsity on the transformation matrix of LDA through ℓ2,1-norm regularization to achieve feature selection, and the resultant formulation optimizes for selecting the most discriminative features and removing the redundant ones simultaneously. The formulation is extended to the ℓ2,p-norm regularized case, which is more likely to offer better sparsity when 0 &lt; p &lt; 1. Thus, the formulation is a better approximation to the feature selection problem. An efficient algorithm is developed to solve the ℓ2,p-norm-based optimization problem, and it is proved that the algorithm converges when 0 &lt; p ≤ 2. Systematic experiments are conducted to understand the behavior of the proposed method. Promising experimental results on various types of real-world data sets demonstrate the effectiveness of our algorithm. [ABSTRACT FROM PUBLISHER]
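The core idea in the abstract, row sparsity via an ℓ2,p regularizer on the transformation matrix, can be illustrated with a short sketch. This is not the paper's algorithm; it is a minimal example (with a hypothetical matrix `W` and helper names) showing how the ℓ2,p row-norm regularizer is computed and how features would be ranked by the ℓ2 norms of their corresponding rows once such a matrix has been learned.

```python
import numpy as np

def l2p_row_norm(W, p=1.0):
    """Compute the l_{2,p} regularizer: the sum over rows of ||w_i||_2^p.

    For p = 1 this is the l_{2,1} norm, which encourages entire rows of W
    to shrink toward zero (row sparsity); 0 < p < 1 promotes stronger sparsity.
    """
    row_norms = np.linalg.norm(W, axis=1)  # l2 norm of each row
    return np.sum(row_norms ** p)

def select_features(W, k):
    """Rank features by the l2 norm of their rows of W and return the
    indices of the k largest (the most discriminative features)."""
    row_norms = np.linalg.norm(W, axis=1)
    return np.argsort(row_norms)[::-1][:k]

# Toy example: 5 features projected to 2 dimensions.
W = np.array([[0.0, 0.0],
              [3.0, 4.0],   # row norm 5.0 -> strongly retained
              [0.1, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
print(l2p_row_norm(W, p=1.0))   # l_{2,1} norm: 5.0 + 0.1 + 2.0 + sqrt(2)
print(select_features(W, k=2))  # indices of the two strongest rows
```

In the paper's setting, `W` would be the LDA transformation matrix learned under this regularizer, so rows driven to zero correspond to discarded features.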

Details

Language :
English
ISSN :
2162-237X
Volume :
27
Issue :
4
Database :
Academic Search Index
Journal :
IEEE Transactions on Neural Networks & Learning Systems
Publication Type :
Periodical
Accession number :
113872494
Full Text :
https://doi.org/10.1109/TNNLS.2015.2424721