1. Unsupervised Feature Selection With Extended OLSDA via Embedding Nonnegative Manifold Structure.
- Authors
- Zhang, Rui; Zhang, Hongyuan; Li, Xuelong; Yang, Sheng
- Subjects
- Laplacian matrices; Feature selection; Discriminant analysis; Least squares; Sparse matrices; Linear programming
- Abstract
In unsupervised learning, most discriminative information is encoded in the cluster labels. Unsupervised feature selection methods usually generate pseudo labels via spectral clustering, which brings two related disadvantages: 1) the performance of feature selection depends heavily on the constructed Laplacian matrix and 2) the pseudo labels are obtained with mixed signs, whereas the real ones should be nonnegative. To address this problem, a novel approach for unsupervised feature selection is proposed by extending orthogonal least square discriminant analysis (OLSDA) to the unsupervised case, such that nonnegative pseudo labels can be achieved. Additionally, an orthogonal constraint is imposed on the class indicator to preserve the manifold structure. Furthermore, $\ell _{2,1}$ regularization is imposed to ensure that the projection matrix is row sparse for efficient feature selection, and it is proved to be equivalent to $\ell _{2,0}$ regularization. Finally, extensive experiments on nine benchmark data sets demonstrate the effectiveness of the proposed approach. [ABSTRACT FROM AUTHOR]
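The abstract's key mechanism is that $\ell_{2,1}$ regularization drives whole rows of the projection matrix toward zero, so features whose rows survive can be ranked and kept. A minimal sketch of that idea (not the paper's actual optimization; the matrix `W` and the helper names are illustrative assumptions):

```python
import numpy as np

def l21_norm(W):
    # ||W||_{2,1} = sum over rows i of the l2 norm of row w_i;
    # penalizing this sum encourages entire rows to shrink to zero
    return np.sqrt((W ** 2).sum(axis=1)).sum()

def select_features(W, k):
    # rank features by the l2 norm of their row in the learned
    # projection matrix; rows suppressed by the l2,1 penalty score ~0
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# toy projection matrix (hypothetical): rows 0 and 2 carry signal,
# row 1 has been driven to (near) zero by the regularizer
W = np.array([[0.9,  0.1],
              [1e-6, 0.0],
              [0.5,  0.8]])
print(select_features(W, 2))  # indices of the two strongest features
```

This row-norm ranking is the standard way an $\ell_{2,1}$-regularized projection is turned into a feature subset; the paper's contribution is in how `W` and the nonnegative pseudo labels are learned jointly, which this sketch does not reproduce.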
- Published
- 2022