Joint Discriminative Latent Subspace Learning for Image Classification.
- Source :
- IEEE Transactions on Circuits & Systems for Video Technology, Jul 2022, Vol. 32, Issue 7, p4653-4666. 14p.
- Publication Year :
- 2022
Abstract
- Latent subspace learning aims to produce a latent representation of high-dimensional data that supports better reconstruction and classification by exploiting the optimal subspace. Current latent subspace learning methods commonly suffer from three problems: 1) the discriminative property is ignored when learning the latent subspace; 2) redundancy exists between the latent subspace and the prediction space; 3) there is no unified latent subspace that jointly exploits knowledge from the raw space, latent subspace, and label space. In this paper, we formulate the Joint Discriminative Latent Subspace Learning (JDLSL) problem to address these issues, and provide its optimization solution. JDLSL learns image representations from two aspects: a) the joint learning of latent subspaces for data reconstruction and prediction, and b) the joint learning of the label space and latent subspace for data reconstruction. To integrate knowledge from the joint learning, we organize the sparsity-induced latent subspace, where row sparsity and column sparsity are simultaneously imposed. We provide a theoretical proof of the discriminative learning ability of the sparsity-induced latent subspace. Extensive experiments and comparisons with the state of the art show that the proposed method performs better. JDLSL achieves competitive performance with deep features compared to deep learning architectures, reflecting its potential for integration with deep learning. [ABSTRACT FROM AUTHOR]
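The simultaneous row and column sparsity described in the abstract is commonly realized with ℓ2,1-norm regularizers, where the ℓ2,1-norm of a matrix sums the ℓ2 norms of its rows (and its transpose sums the column norms). The following is a minimal illustrative sketch of those two norms, not the paper's exact formulation, which is not reproduced in this record:

```python
import numpy as np

def row_l21_norm(W):
    """||W||_{2,1}: sum of l2 norms of the rows of W.
    Penalizing this drives entire rows to zero (row sparsity)."""
    return float(np.sum(np.linalg.norm(W, axis=1)))

def col_l21_norm(W):
    """||W^T||_{2,1}: sum of l2 norms of the columns of W.
    Penalizing this drives entire columns to zero (column sparsity)."""
    return float(np.sum(np.linalg.norm(W, axis=0)))

# Toy latent-projection matrix: one active row, both columns active.
W = np.array([[3.0, 4.0],
              [0.0, 0.0]])
print(row_l21_norm(W))  # row norms are 5 and 0 -> 5.0
print(col_l21_norm(W))  # column norms are 3 and 4 -> 7.0
```

Imposing both penalties simultaneously, as the abstract describes, encourages a latent subspace in which both redundant samples (rows) and redundant dimensions (columns) are suppressed.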
Details
- Language :
- English
- ISSN :
- 1051-8215
- Volume :
- 32
- Issue :
- 7
- Database :
- Academic Search Index
- Journal :
- IEEE Transactions on Circuits & Systems for Video Technology
- Publication Type :
- Academic Journal
- Accession number :
- 157765786
- Full Text :
- https://doi.org/10.1109/TCSVT.2021.3135316