
Sparse Graph Embedding Unsupervised Feature Selection.

Authors :
Wang, Shiping
Zhu, William
Source :
IEEE Transactions on Systems, Man, and Cybernetics: Systems; Mar 2018, Vol. 48, Issue 3, p329-341, 13p
Publication Year :
2018

Abstract

High dimensionality is commonly encountered in data mining problems, so dimensionality reduction becomes an important task for improving the efficiency of learning algorithms. As a widely used dimensionality reduction technique, feature selection chooses a feature subset guided by a certain criterion. In this paper, three unsupervised feature selection algorithms are proposed and addressed from the viewpoint of sparse graph embedding learning. First, using the self-characterization of the given data, we treat the data themselves as a dictionary, conduct sparse coding, and propose the sparsity preserving feature selection (SPFS) algorithm. Second, considering the locality preservation of neighborhoods in the data, we study a special case of the SPFS problem, namely the neighborhood preserving feature selection problem, and develop a suitable algorithm. Third, we incorporate sparse coding and feature selection into one unified framework and propose a neighborhood embedding feature selection (NEFS) criterion; drawing support from nonnegative matrix factorization, the corresponding algorithm for NEFS is presented and its convergence is proved. Finally, the three proposed algorithms are validated on eight publicly available real-world datasets from a machine learning repository. Extensive experimental results demonstrate the superiority of the proposed algorithms over four compared state-of-the-art unsupervised feature selection methods. [ABSTRACT FROM PUBLISHER]
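To make the sparse graph embedding idea concrete, below is a minimal Python sketch of a sparsity-preserving feature scoring pipeline in the spirit of SPFS. It is an illustration only, not the paper's algorithm: it assumes the sparse graph is built by Lasso-based self-representation (each sample coded by the remaining samples, i.e., the data used as their own dictionary) and that features are ranked by how small their reconstruction residual is under that graph; the function names, the regularization parameter, and the scoring rule are assumptions.

```python
# Hypothetical sketch of sparsity-preserving feature scoring.
# Assumptions (not from the paper): Lasso self-representation builds the
# sparse graph; features with the smallest residual under (I - S) are kept.
import numpy as np
from sklearn.linear_model import Lasso

def sparse_self_representation(X, alpha=0.05):
    """Build a sparse coefficient matrix S where each sample is coded
    by the remaining samples (data used as their own dictionary)."""
    n = X.shape[0]
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.delete(np.arange(n), i)              # exclude the sample itself
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(X[idx].T, X[i])                     # x_i ~= sum_j w_j x_j, w sparse
        S[i, idx] = lasso.coef_
    return S

def sparsity_preserving_scores(X, S):
    """Score each feature by its residual under the sparse graph:
    a lower score means the feature better preserves the sparse structure."""
    residual = X - S @ X                              # (I - S) X, evaluated column-wise
    return np.linalg.norm(residual, axis=0) / (np.std(X, axis=0) + 1e-12)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((60, 20))                 # 60 samples, 20 features
    S = sparse_self_representation(X)
    scores = sparsity_preserving_scores(X, S)
    selected = np.argsort(scores)[:5]                 # keep the 5 lowest-residual features
    print("selected feature indices:", selected)
```

In this sketch the feature ranking reuses a single shared sparse graph, which mirrors the abstract's separation of sparse coding and feature selection; the paper's NEFS criterion instead couples the two in one objective solved with nonnegative matrix factorization updates, which is not shown here.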

Subjects

Subjects :
SPARSE graphs
MACHINE learning

Details

Language :
English
ISSN :
2168-2216
Volume :
48
Issue :
3
Database :
Complementary Index
Journal :
IEEE Transactions on Systems, Man, and Cybernetics: Systems
Publication Type :
Academic Journal
Accession number :
128054416
Full Text :
https://doi.org/10.1109/TSMC.2016.2605132