1. Unsupervised feature selection algorithm based on redundancy learning and sparse regression.
- Author
- Kong, Guoping; Ma, Yingcang; Xing, Zhiwei; Xin, Xiaolong
- Subjects
- *FEATURE selection, *ALGORITHMS, *REGRESSION analysis, *DATA analysis
- Abstract
In recent years, feature selection methods based on sparse regression have attracted much attention from researchers, and selecting more representative features is the key point. In this paper, an unsupervised feature selection method based on redundancy learning and sparse regression (RSUFS) is proposed. Firstly, to make the model robust to outliers, the ℓ2,1-norm regression model is used as the loss function to learn the feature weight matrix. Secondly, to obtain exactly the top k features, an ℓ2,0-norm constraint is introduced. At the same time, the cosine similarity between features is taken into account to select more valuable features by reducing the redundancy between features. Finally, an efficient algorithm based on the Augmented Lagrangian method is derived to solve the resulting optimization problem. Comparison experiments on several benchmark datasets against seven well-known unsupervised feature selection algorithms show that the proposed algorithm is effective. • The ℓ2,1-norm regression is used as the loss function to effectively avoid the influence of outliers. • Pseudo-labels are constructed through spectral analysis to preserve the data geometry. • The original sparse problem is dealt with directly by the ℓ2,0-norm constraint. [ABSTRACT FROM AUTHOR]
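The three building blocks named in the abstract (the ℓ2,1-norm loss, the cosine similarity between features, and ℓ2,0-style hard selection of the top k features) can be illustrated with a minimal NumPy sketch. This is not the authors' RSUFS implementation; the function names and the simple score-based selection are illustrative assumptions.

```python
import numpy as np

def l21_norm(W):
    # ℓ2,1-norm: sum of the ℓ2-norms of the rows of W
    # (used in the paper as a loss robust to outliers)
    return np.sum(np.linalg.norm(W, axis=1))

def feature_cosine_similarity(X):
    # Cosine similarity between the feature columns of the
    # data matrix X (n samples × d features); high values
    # indicate redundant feature pairs.
    Xn = X / (np.linalg.norm(X, axis=0, keepdims=True) + 1e-12)
    return Xn.T @ Xn

def top_k_features(W, k):
    # ℓ2,0-style hard selection: keep the indices of the k rows
    # of the weight matrix W with the largest ℓ2-norm.
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]
```

In the actual method these pieces are coupled inside one objective and solved jointly with an Augmented Lagrangian scheme, rather than applied as independent post-processing steps as here.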
- Published
- 2023