
PLOP: Learning without Forgetting for Continual Semantic Segmentation

Authors :
Arnaud Dapogny
Yifu Chen
Arthur Douillard
Matthieu Cord
Heuritech
Machine Learning and Information Access (MLIA)
LIP6
Sorbonne Université (SU) - Centre National de la Recherche Scientifique (CNRS)
Datakalab
Source :
CVPR, Jun 2021, Nashville, United States
Publication Year :
2021
Publisher :
HAL CCSD, 2021.

Abstract

Deep learning approaches are nowadays ubiquitously used to tackle computer vision tasks such as semantic segmentation, requiring large datasets and substantial computational power. Continual learning for semantic segmentation (CSS) is an emerging trend that consists in updating an old model by sequentially adding new classes. However, continual learning methods are usually prone to catastrophic forgetting. This issue is further aggravated in CSS where, at each step, old classes from previous iterations are collapsed into the background. In this paper, we propose Local POD, a multi-scale pooling distillation scheme that preserves long- and short-range spatial relationships at feature level. Furthermore, we design an entropy-based pseudo-labelling of the background w.r.t. classes predicted by the old model to deal with background shift and avoid catastrophic forgetting of the old classes. Our approach, called PLOP, significantly outperforms state-of-the-art methods in existing CSS scenarios, as well as in newly proposed challenging benchmarks.

Comment: Accepted at CVPR 2021, code: https://github.com/arthurdouillard/CVPR2021_PLOP
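The two mechanisms named in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation (which is at the GitHub link above): the function names `local_pod_loss` and `entropy_pseudo_labels`, the `scales` tuple, the mean-pooling choice, and the entropy `threshold` are all assumptions made here for clarity. Local POD pools a feature map along width and along height, over local regions at several scales, and penalizes the distance between the old and new models' pooled statistics; the pseudo-labelling step keeps an old-model prediction for a background pixel only when that prediction has low entropy.

```python
import numpy as np

def pod_embedding(feat):
    """Width- and height-pooled statistics of a C x H x W feature map."""
    return np.concatenate([feat.mean(axis=2).ravel(),   # pool over width  -> C*H values
                           feat.mean(axis=1).ravel()])  # pool over height -> C*W values

def local_pod_loss(old_feat, new_feat, scales=(1, 2, 4)):
    """Hedged sketch of a multi-scale pooling distillation loss:
    average L2 distance between pooled embeddings of matching local
    regions of the old and new feature maps, over several grid scales."""
    C, H, W = old_feat.shape
    total, count = 0.0, 0
    for s in scales:                      # s x s grid of local regions
        h, w = H // s, W // s
        for i in range(s):
            for j in range(s):
                o = old_feat[:, i*h:(i+1)*h, j*w:(j+1)*w]
                n = new_feat[:, i*h:(i+1)*h, j*w:(j+1)*w]
                total += np.linalg.norm(pod_embedding(o) - pod_embedding(n))
                count += 1
    return total / count

def entropy_pseudo_labels(old_probs, threshold):
    """Hedged sketch of entropy-based pseudo-labelling: for each pixel,
    keep the old model's argmax class only if its prediction entropy is
    below `threshold`; otherwise mark the pixel as ignored (-1)."""
    entropy = -(old_probs * np.log(old_probs + 1e-8)).sum(axis=0)  # per-pixel entropy
    labels = old_probs.argmax(axis=0)
    labels[entropy > threshold] = -1     # uncertain pixels are ignored
    return labels
```

Distilling identical features gives zero loss, and a uniformly uncertain old-model prediction yields no pseudo-label; in the actual method these pseudo-labels replace the background class for pixels the old model recognizes, countering background shift.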

Details

Language :
English
Database :
OpenAIRE
Journal :
CVPR, Jun 2021, Nashville, United States
Accession number :
edsair.doi.dedup.....455e808538af4df1c53354b3b0258d71