Feature selection and cascade dimensionality reduction for self-supervised visual representation learning.
- Source :
- Computers & Electrical Engineering. Mar 2023, Vol. 106. 1p.
- Publication Year :
- 2023
Abstract
- Self-supervised visual representation learning (SSL) focuses on capturing comprehensive features by exploiting unlabeled datasets. However, existing contrastive-learning-based SSL frameworks suffer from high computational cost and unsatisfactory performance. To address these issues, we present APNet, a novel single-branch SSL method that incorporates an adaptive feature selection and activation module and a progressive cascade dimensionality reduction module. Specifically, our method first fully exploits unlabeled datasets and extracts intra- and inter-image information by introducing montage images. In addition, a novel adaptive feature selection and activation module is designed to generate the most comprehensive features. Furthermore, a progressive cascade dimensionality reduction module is proposed to capture the most representative features from latent vectors through cascaded dimensionality increasing-decreasing operations. Extensive experiments demonstrate the robustness and effectiveness of APNet: it exceeds MoCo-v3 by 3.1% on the ImageNet-100 dataset while consuming only half the computation. Code is available at https://github.com/AI-TYQ/APNet.
- • We embed feature selection and dimensionality reduction modules into an SSL framework.
- • We design an attention module that selects and activates representative features.
- • We introduce a dimensionality reduction module to retain discriminative features.
- • We demonstrate the advantages of the above contributions through experiments.
- [ABSTRACT FROM AUTHOR]
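The abstract mentions that latent vectors pass through cascaded dimensionality increasing-decreasing operations before the most representative features are retained. The paper's actual layer widths, activations, and training details are not given in this record, so the following is only a minimal NumPy sketch of such an increase-then-decrease projection cascade; the function names, dimensions, and random initialization are all hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def project(x, w, b):
    """One cascade stage: affine projection followed by ReLU."""
    return np.maximum(w @ x + b, 0.0)

def cascade_head(z, widths, rng):
    """Pass a latent vector z through a cascade of projections whose
    widths first increase and then progressively decrease.
    `widths` lists the (hypothetical) output size of each stage."""
    x = z
    d_in = z.shape[0]
    for d_out in widths:
        # Random weights stand in for learned parameters in this sketch.
        w = rng.standard_normal((d_out, d_in)) / np.sqrt(d_in)
        b = np.zeros(d_out)
        x = project(x, w, b)
        d_in = d_out
    return x

rng = np.random.default_rng(0)
z = rng.standard_normal(512)              # latent vector from an encoder
# Increase 512 -> 1024, then decrease 1024 -> 512 -> 256 (assumed sizes).
out = cascade_head(z, [1024, 512, 256], rng)
print(out.shape)  # (256,)
```

In a trained model the weights of each stage would be learned end to end; the sketch only illustrates the shape of the increase-then-decrease cascade applied to a latent vector.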
Details
- Language :
- English
- ISSN :
- 0045-7906
- Volume :
- 106
- Database :
- Academic Search Index
- Journal :
- Computers & Electrical Engineering
- Publication Type :
- Academic Journal
- Accession number :
- 161844121
- Full Text :
- https://doi.org/10.1016/j.compeleceng.2022.108570