Unsupervised Feature Selection With Extended OLSDA via Embedding Nonnegative Manifold Structure.
- Source :
- IEEE Transactions on Neural Networks & Learning Systems. May 2022, Vol. 33 Issue 5, p2274-2280. 7p.
- Publication Year :
- 2022
Abstract
- In unsupervised learning, most discriminative information is encoded in the cluster labels. Unsupervised feature selection methods usually obtain these pseudo labels via spectral clustering. Nonetheless, this introduces two related disadvantages: 1) the performance of feature selection depends heavily on the constructed Laplacian matrix and 2) the pseudo labels are obtained with mixed signs, whereas the real labels should be nonnegative. To address these problems, a novel approach for unsupervised feature selection is proposed by extending orthogonal least square discriminant analysis (OLSDA) to the unsupervised case, such that nonnegative pseudo labels can be achieved. Additionally, an orthogonal constraint is imposed on the class indicator to preserve the manifold structure. Furthermore, $\ell _{2,1}$ regularization is imposed to ensure that the projection matrix is row sparse for efficient feature selection, and it is proved to be equivalent to $\ell _{2,0}$ regularization. Finally, extensive experiments on nine benchmark data sets are conducted to demonstrate the effectiveness of the proposed approach. [ABSTRACT FROM AUTHOR]
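The abstract's use of $\ell_{2,1}$ regularization to make the projection matrix row sparse is the standard mechanism for ranking features in this family of methods: each row of the learned projection matrix corresponds to one original feature, so near-zero rows mark discarded features. The following is a minimal sketch of that ranking step only, assuming a projection matrix `W` has already been learned; it does not reproduce the paper's actual optimization with orthogonal and nonnegativity constraints.

```python
import numpy as np

def l21_norm(W):
    """l2,1 norm: sum of the l2 norms of the rows of W."""
    return float(np.sum(np.linalg.norm(W, axis=1)))

def rank_features(W):
    """Rank features by the l2 norm of each row of W.

    Rows of W correspond to original features; a near-zero row means
    the l2,1 penalty has effectively dropped that feature.
    Returns feature indices, most important first.
    """
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1]

# Toy row-sparse projection matrix: 4 features mapped to 2 dimensions,
# with rows 1 and 3 driven near zero by the regularizer.
W = np.array([[0.9, 0.4],
              [1e-6, 0.0],
              [0.5, 0.7],
              [0.0, 1e-6]])
selected = rank_features(W)[:2]  # keep the two strongest features
```

Here `rank_features` and the toy matrix `W` are illustrative inventions; in the paper the row-sparsity pattern emerges from solving the regularized objective, after which the top-ranked rows identify the selected features.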
Details
- Language :
- English
- ISSN :
- 2162-237X
- Volume :
- 33
- Issue :
- 5
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Neural Networks & Learning Systems
- Publication Type :
- Periodical
- Accession number :
- 156718264
- Full Text :
- https://doi.org/10.1109/TNNLS.2020.3045053