
Supervised Contrastive Replay: Revisiting the Nearest Class Mean Classifier in Online Class-Incremental Continual Learning

Authors:
Zheda Mai
Hyunwoo Kim
Ruiwen Li
Scott Sanner
Source:
CVPR Workshops
Publication Year:
2021
Publisher:
arXiv, 2021.

Abstract

Online class-incremental continual learning (CL) studies the problem of learning new classes continually from an online non-stationary data stream, with the aim of adapting to new data while mitigating catastrophic forgetting. While memory replay has shown promising results, the recency bias in online learning caused by the commonly used Softmax classifier remains an unsolved challenge. Although the Nearest-Class-Mean (NCM) classifier is significantly undervalued in the CL community, we demonstrate that it is a simple yet effective substitute for the Softmax classifier. It addresses the recency bias and avoids structural changes to the fully-connected layer when new classes arrive. Moreover, we observe considerable and consistent performance gains when replacing the Softmax classifier with the NCM classifier for several state-of-the-art replay methods. To leverage the NCM classifier most effectively, data embeddings belonging to the same class should be tightly clustered and well separated from those with a different class label. To this end, we contribute Supervised Contrastive Replay (SCR), which explicitly encourages samples from the same class to cluster tightly in embedding space while pushing those of different classes further apart during replay-based training. Overall, we observe that our proposed SCR substantially reduces catastrophic forgetting and outperforms state-of-the-art CL methods by a significant margin on a variety of datasets.

Comment: In Workshop on Continual Learning in Computer Vision at CVPR 2021
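The two mechanisms the abstract describes can be illustrated concretely: an NCM classifier labels a sample by its nearest per-class mean in embedding space, and a supervised contrastive loss rewards exactly the geometry NCM benefits from (tight same-class clusters, well-separated classes). Below is a minimal NumPy sketch of both ideas; the function names (`ncm_fit`, `ncm_predict`, `supcon_loss`) and the toy data are illustrative assumptions, not the paper's implementation, and the loss follows the general supervised contrastive form rather than SCR's exact training recipe.

```python
import numpy as np

def ncm_fit(embeddings, labels):
    """Compute one mean embedding per class (hypothetical helper)."""
    classes = np.unique(labels)
    means = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])
    return classes, means

def ncm_predict(embeddings, classes, means):
    """Assign each sample the label of its nearest class mean (Euclidean)."""
    dists = np.linalg.norm(embeddings[:, None, :] - means[None, :, :], axis=-1)
    return classes[np.argmin(dists, axis=1)]

def supcon_loss(z, labels, tau=0.1):
    """Supervised contrastive loss: for each anchor, pull same-label
    samples together and push different-label samples apart."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)      # unit-normalize
    logits = z @ z.T / tau                                # scaled cosine sims
    n = len(labels)
    eye = np.eye(n, dtype=bool)
    pos = (labels[:, None] == labels[None, :]) & ~eye     # same label, not self
    logits = np.where(eye, -np.inf, logits)               # drop self-similarity
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    per_anchor = -np.where(pos, log_prob, 0.0).sum(axis=1) \
                 / np.maximum(pos.sum(axis=1), 1)
    return per_anchor[pos.any(axis=1)].mean()

# Toy 2-D embeddings: two tight, well-separated clusters.
emb = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
lab = np.array([0, 0, 1, 1])

classes, means = ncm_fit(emb, lab)
print(ncm_predict(np.array([[0.95, 0.05], [0.05, 0.95]]), classes, means))  # -> [0 1]

# The contrastive loss is far lower when same-class embeddings cluster
# than when the class assignment is mixed up.
print(supcon_loss(emb, lab) < supcon_loss(emb, np.array([0, 1, 0, 1])))  # -> True
```

Note that because NCM keeps only per-class means, adding a new class requires no change to any fully-connected output layer, which is the structural advantage the abstract mentions.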

Details

Database:
OpenAIRE
Journal:
CVPR Workshops
Accession number:
edsair.doi.dedup.....d24056469813826562b804f8c7d9183c
Full Text:
https://doi.org/10.48550/arxiv.2103.13885