Towards prior gap and representation gap for long-tailed recognition.
- Source :
- Pattern Recognition, Jan. 2023, Vol. 133.
- Publication Year :
- 2023
Abstract
- Highlights:
• A unified theoretical framework for long-tailed recognition is established.
• Corresponding mitigation solutions for the prior gap and the representation gap are proposed.
• Existing and proposed methods are theoretically analyzed in terms of their impact on the two gaps.
• The proposed methods yield superior performance on five long-tailed benchmarks.

Most deep learning models are carefully designed for balanced datasets, so they inevitably suffer performance degradation on practical long-tailed recognition tasks, especially for the minority classes. Two crucial issues arise when learning from imbalanced datasets: a skewed decision boundary and an unrepresentative feature space. In this work, we establish a theoretical framework to analyze the sources of these two issues from a Bayesian perspective, and find that they are closely related to the prior gap and the representation gap, respectively. Under this framework, we show that existing long-tailed recognition methods manage to remove either the prior gap or the representation gap. In contrast, we propose to remove the two gaps simultaneously to achieve more accurate long-tailed recognition. Specifically, we propose a prior calibration strategy to remove the prior gap and introduce three strategies (representative feature extraction, optimization strategy adjustment, and effective sample modeling) to mitigate the representation gap. Extensive experiments on five benchmark datasets validate the superiority of our method against state-of-the-art competitors.
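The abstract does not include the authors' implementation of prior calibration. As a rough illustration of the idea of removing a prior gap, here is a minimal sketch assuming the common logit-adjustment form, where the log of the empirical training prior p(y) is subtracted from the classifier logits so the decision boundary is no longer skewed toward head classes. The function name and the exact adjustment are assumptions for illustration, not the paper's code.

```python
# Hypothetical sketch: prior calibration via logit adjustment.
# Assumption: the training-label prior p(y) is estimated from class
# counts, and subtracting log p(y) from the raw logits removes the
# prior's skew on the decision boundary. This specific form is an
# assumption, not necessarily the paper's exact strategy.
import torch

def calibrate_logits(logits: torch.Tensor, class_counts: torch.Tensor) -> torch.Tensor:
    """Remove the training prior from raw logits.

    logits:       (batch, num_classes) raw model outputs
    class_counts: (num_classes,) number of training samples per class
    """
    prior = class_counts.float() / class_counts.sum()  # empirical p(y)
    return logits - torch.log(prior)                   # de-skewed scores

# Toy usage: 3 classes with a 100:10:1 imbalance.
counts = torch.tensor([1000, 100, 10])
raw = torch.tensor([[2.0, 1.9, 1.8]])               # head class barely wins
print(raw.argmax(dim=1))                             # tensor([0])
print(calibrate_logits(raw, counts).argmax(dim=1))   # tensor([2]): tail class wins
```

After calibration the rare class's score receives the largest boost (its log-prior is the most negative), which is exactly the effect a prior-gap correction is meant to have on near-tied predictions.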
- Subjects :
- DEEP learning
- FEATURE extraction
- IMAGE recognition (Computer vision)
Details
- Language :
- English
- ISSN :
- 0031-3203
- Volume :
- 133
- Database :
- Academic Search Index
- Journal :
- Pattern Recognition
- Publication Type :
- Academic Journal
- Accession number :
- 159570121
- Full Text :
- https://doi.org/10.1016/j.patcog.2022.109012