
Few-Shot Class-Incremental Learning via Training-Free Prototype Calibration

Authors :
Wang, Qi-Wei
Zhou, Da-Wei
Zhang, Yi-Kai
Zhan, De-Chuan
Ye, Han-Jia
Publication Year :
2023

Abstract

Real-world scenarios are usually accompanied by continuously appearing classes with scarce labeled samples, which require a machine learning model to incrementally learn new classes while maintaining the knowledge of base classes. In this Few-Shot Class-Incremental Learning (FSCIL) scenario, existing methods either introduce extra learnable components or rely on a frozen feature extractor to mitigate catastrophic forgetting and overfitting. However, we find that existing methods tend to misclassify samples of new classes into base classes, which leads to poor performance on the new classes. In other words, the strong discriminability of the base classes distracts the classification of the new classes. To investigate this intriguing phenomenon, we observe that although the feature extractor is trained only on base classes, it can surprisingly represent the semantic similarity between the base classes and unseen new classes. Building on these analyses, we propose a simple yet effective Training-frEE calibratioN (TEEN) strategy to enhance the discriminability of new classes by fusing the new prototypes (i.e., mean features of a class) with weighted base prototypes. In addition to standard FSCIL benchmarks, TEEN demonstrates remarkable performance and consistent improvements over baseline methods in the few-shot learning scenario. Code is available at: https://github.com/wangkiw/TEEN

Comment: Accepted to NeurIPS 2023.
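The abstract describes the calibration only at a high level: each new-class prototype is fused with base prototypes weighted by semantic similarity. The following is a minimal NumPy sketch of how such a training-free calibration could look; the blending weight `alpha`, the softmax temperature `tau`, and the function name are illustrative assumptions rather than the paper's exact formulation (see the linked repository for the official code).

```python
import numpy as np

def calibrate_prototype(new_proto, base_protos, alpha=0.5, tau=16.0):
    """Training-free calibration of a new-class prototype (illustrative sketch).

    Fuses the few-shot prototype with base prototypes, weighted by cosine
    similarity to the new prototype. `alpha` and `tau` are hypothetical
    hyperparameters, not the paper's published values.
    """
    # Cosine similarity between the new prototype and every base prototype.
    new_unit = new_proto / np.linalg.norm(new_proto)
    base_unit = base_protos / np.linalg.norm(base_protos, axis=1, keepdims=True)
    sims = base_unit @ new_unit                        # shape: (num_base,)

    # Softmax over base classes: semantically closer base classes receive
    # larger weights in the calibration term.
    w = np.exp(tau * sims)
    w /= w.sum()

    # Convex combination of the (biased) few-shot prototype and the
    # similarity-weighted average of base prototypes.
    return alpha * new_proto + (1.0 - alpha) * (w @ base_protos)

# Usage: prototypes are class-mean features from the frozen extractor.
rng = np.random.default_rng(0)
base_protos = rng.normal(size=(60, 512))               # e.g., 60 base classes
new_proto = rng.normal(size=512)                       # mean of a few shots
calibrated = calibrate_prototype(new_proto, base_protos)
```

Because the feature extractor stays frozen and the correction is a closed-form weighted average, this calibration adds no trainable parameters and no extra training passes, which is what makes the strategy "training-free".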

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2312.05229
Document Type :
Working Paper