
Few-Shot Class-Incremental Learning with Prior Knowledge

Authors:
Jiang, Wenhao
Li, Duo
Hu, Menghan
Zhai, Guangtao
Yang, Xiaokang
Zhang, Xiao-Ping
Publication Year:
2024

Abstract

To tackle the issues of catastrophic forgetting and overfitting in few-shot class-incremental learning (FSCIL), previous work has primarily concentrated on preserving the memory of old knowledge during the incremental phase. The role of the pre-trained model in shaping the effectiveness of incremental learning is frequently underestimated in these studies. Therefore, to enhance the generalization ability of the pre-trained model, we propose Learning with Prior Knowledge (LwPK), which introduces nearly free prior knowledge drawn from a small amount of unlabeled data of the subsequent incremental classes. We cluster the unlabeled incremental-class samples to produce pseudo-labels, then jointly train these pseudo-labeled samples with labeled base-class samples, effectively allocating embedding space for both old and new class data. Experimental results indicate that LwPK effectively enhances the model's resilience against catastrophic forgetting, and a theoretical analysis based on empirical risk minimization and class distance measurement corroborates its operational principles. The source code of LwPK is publicly available at: https://github.com/StevenJ308/LwPK.
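To make the described pipeline concrete, below is a minimal PyTorch sketch of the pre-training step the abstract outlines: unlabeled incremental-class samples are clustered into pseudo-labels and then trained jointly with labeled base-class samples. The names (encoder, head, base_x, unlabeled_x) and the choice of k-means as the clustering method are illustrative assumptions, not the released implementation; see the repository linked above for the authors' actual code.

```python
# Sketch of LwPK-style joint pre-training with pseudo-labeled incremental data.
# Assumptions: `encoder` maps inputs to feature vectors, `head` is a linear
# classifier over (num_base_classes + num_new_classes) outputs, and the data
# fits in memory for a simple full-batch loop.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans


def pseudo_label(encoder, unlabeled_x, num_new_classes, num_base_classes):
    """Cluster embeddings of unlabeled samples (k-means here, as one choice)
    and offset cluster ids so they occupy label space after the base classes."""
    encoder.eval()
    with torch.no_grad():
        feats = encoder(unlabeled_x).cpu().numpy()
    cluster_ids = KMeans(n_clusters=num_new_classes, n_init=10).fit_predict(feats)
    return torch.as_tensor(cluster_ids, dtype=torch.long) + num_base_classes


def joint_pretrain(encoder, head, base_x, base_y, unlabeled_x,
                   num_new_classes, num_base_classes, epochs=10, lr=1e-3):
    """Jointly minimize cross-entropy over labeled base samples and
    pseudo-labeled incremental samples, so the embedding space reserves
    room for both old and new classes before the incremental phase."""
    pseudo_y = pseudo_label(encoder, unlabeled_x, num_new_classes, num_base_classes)
    x = torch.cat([base_x, unlabeled_x])
    y = torch.cat([base_y.long(), pseudo_y])
    opt = torch.optim.SGD(list(encoder.parameters()) + list(head.parameters()), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    encoder.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(head(encoder(x)), y)
        loss.backward()
        opt.step()
```

In this sketch the pseudo-labels are computed once from the frozen encoder and then held fixed during joint training; whether the labels are refreshed as the encoder improves is a design choice the abstract does not specify.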

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2402.01201
Document Type:
Working Paper