
Feature selection via Non-convex constraint and latent representation learning with Laplacian embedding.

Authors :
Shang, Ronghua
Kong, Jiarui
Feng, Jie
Jiao, Licheng
Source :
Expert Systems with Applications. Dec 2022, Vol. 208.
Publication Year :
2022

Abstract

• Calculate the similarity between pseudo-labels to keep complete sample information.
• The latent representation space is learned in the latent feature space and the data space.
• The latent representation contains the correlation of the pseudo-labels.
• The latent feature space provides discriminative information to guide feature selection.
• Impose non-negative and ℓ2,1-2-norm non-convex constraints on Q.

In unsupervised feature selection, the relationship between pseudo-labels is often ignored, and the interconnection information between the data is not fully utilized. To address these problems, this paper proposes a feature selection method via non-convex constraint and latent representation learning with Laplacian embedding (NLRL-LE). NLRL-LE preserves the correlation between pseudo-labels to bring them closer to the true labels, and combines this with the interconnection information between data to learn a latent representation matrix that guides feature selection. Specifically, NLRL-LE first regards each pseudo-label as a latent feature of the sample, constructs a latent feature graph, and retains the inherent attributes of the pseudo-labels. Second, latent representation learning is performed in the space composed of the latent feature space and the data space. Since the latent feature graph retains the correlation between pseudo-labels and latent representation learning considers the interconnection information between data, the information contained in the latent representation space is more complete. In addition, to make full use of the pseudo-labels, the learned latent representation matrix is used as pseudo-label information that provides cluster labels in the latent representation space to guide feature selection. Finally, non-negative and ℓ2,1-2-norm non-convex constraints are applied to the feature transformation matrix Q. Compared with a convex constraint, the combination of the non-negative constraint and the non-convex constraint ensures the row sparsity of the feature transformation matrix, selects low-redundancy features, and improves the feature selection effect. Experimental results show that NLRL-LE achieves better ACC and NMI than seven compared algorithms on twelve datasets. [ABSTRACT FROM AUTHOR]
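The abstract names the ℓ2,1-2-norm constraint but does not spell out the formula. In the sparse feature-selection literature, this regularizer is usually defined as the difference between the ℓ2,1 norm and the Frobenius norm of the feature transformation matrix Q (d features × c pseudo-labels); the LaTeX below gives that standard definition, which may differ in detail from the authors' exact formulation:

\|Q\|_{2,1-2} = \|Q\|_{2,1} - \|Q\|_{F} = \sum_{i=1}^{d} \Big( \sum_{j=1}^{c} Q_{ij}^{2} \Big)^{1/2} - \Big( \sum_{i=1}^{d} \sum_{j=1}^{c} Q_{ij}^{2} \Big)^{1/2}

And a minimal NumPy sketch of how a row-sparse Q is typically turned into a feature ranking (score each feature by the ℓ2 norm of its row of Q, keep the top-k); the function names and the random toy matrix are illustrative assumptions, not code from the paper:

    import numpy as np

    def l21_minus_l2(Q):
        # ||Q||_{2,1} - ||Q||_F: sum of row l2 norms minus the Frobenius norm
        return np.linalg.norm(Q, axis=1).sum() - np.linalg.norm(Q)

    def select_features(Q, k):
        # rank features (rows of Q) by row l2 norm; return indices of the top-k
        scores = np.linalg.norm(Q, axis=1)
        return np.argsort(scores)[::-1][:k]

    # toy usage: 5 features, 3 pseudo-labels; non-negative, as the constraint requires
    Q = np.abs(np.random.randn(5, 3))
    print(l21_minus_l2(Q), select_features(Q, 2))

Rows of Q driven toward zero by the row-sparse penalty correspond to features that are dropped; the surviving high-norm rows are the selected, low-redundancy features.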

Details

Language :
English
ISSN :
09574174
Volume :
208
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
158911452
Full Text :
https://doi.org/10.1016/j.eswa.2022.118179