
Class Incremental Learning with Self-Supervised Pre-Training and Prototype Learning

Authors :
Liu, Wenzhuo
Wu, Xinjian
Zhu, Fei
Yu, Mingming
Wang, Chuang
Liu, Cheng-Lin
Publication Year :
2023

Abstract

Deep Neural Networks (DNNs) have achieved great success on datasets with a closed class set. However, new classes, such as new categories of social media topics, continuously emerge in the real world, making it necessary to learn incrementally. This is hard for DNNs because they tend to focus on fitting new classes while ignoring old ones, a phenomenon known as catastrophic forgetting. State-of-the-art methods rely on knowledge distillation and data replay techniques but still have limitations. In this work, we analyze the causes of catastrophic forgetting in class incremental learning (CIL) and attribute it to three factors: representation drift, representation confusion, and classifier distortion. Based on this view, we propose a two-stage learning framework with a fixed encoder and an incrementally updated prototype classifier. The encoder is trained with self-supervised learning to generate a feature space of high intrinsic dimensionality, thus improving its transferability and generality. The classifier incrementally learns new prototypes while retaining the prototypes of previously learned classes, which is crucial for preserving the decision boundary. Our method does not rely on preserved samples of old classes and is thus a non-exemplar-based CIL method. Experiments on public datasets show that, under the 10-phase incremental setting, our method significantly outperforms state-of-the-art exemplar-based methods that reserve 5 exemplars per class, by 18.24% on CIFAR-100 and 9.37% on ImageNet100.
Comment: This paper has been under review by a journal since 19-Apr-2023
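
To make the prototype-classifier idea in the abstract concrete, the sketch below shows a minimal nearest-class-mean classifier built on top of a frozen encoder: each new phase adds one prototype (mean feature) per new class, while prototypes of earlier classes are left untouched. This is only an illustrative assumption of how such a classifier can be implemented, not the authors' released code; the names PrototypeClassifier, add_classes, and encoder are hypothetical.

    import numpy as np

    class PrototypeClassifier:
        """Nearest-class-mean classifier: one prototype (mean feature) per class.
        Old prototypes stay fixed when new classes arrive, so the decision
        boundaries learned in earlier phases are preserved."""

        def __init__(self):
            self.prototypes = {}  # class id -> mean feature vector

        def add_classes(self, features, labels):
            # features: (N, D) outputs of the frozen encoder for the new phase
            # labels:   (N,) integer ids of classes not seen before
            for c in np.unique(labels):
                self.prototypes[int(c)] = features[labels == c].mean(axis=0)

        def predict(self, features):
            classes = sorted(self.prototypes)
            protos = np.stack([self.prototypes[c] for c in classes])  # (C, D)
            # assign each sample to the nearest prototype (Euclidean distance)
            dists = np.linalg.norm(features[:, None, :] - protos[None, :, :], axis=-1)
            return np.asarray(classes)[dists.argmin(axis=1)]

    # usage per incremental phase (encoder is frozen, hypothetical name):
    #   clf.add_classes(encoder(x_phase), y_phase)
    #   predictions = clf.predict(encoder(x_test))

Because no exemplars of old classes are stored and the encoder never changes, only the prototype dictionary grows across phases, which is the non-exemplar-based aspect the abstract emphasizes.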

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2308.02346
Document Type :
Working Paper