
Light-weight shadow detection via GCN-based annotation strategy and knowledge distillation.

Authors :
Wu, Wen
Zhou, Kai
Chen, Xiao-Diao
Yong, Jun-Hai
Source :
Computer Vision & Image Understanding; Feb 2022, Vol. 216
Publication Year :
2022

Abstract

This paper addresses the shadow detection problem and proposes a light-weight network that achieves both accurate detection results and high computational efficiency. First, we present a compact network for real-time shadow detection. Second, to improve the performance of this light-weight network, we propose two complementary and necessary strategies, i.e., the use of extra training data and knowledge distillation. Collecting a large amount of extra data raises a challenge: shadow scenes are diverse, and annotating such complex scenarios is time-consuming and expensive, sometimes even requiring expert help. To address this, we first introduce a novel shadow annotation strategy based on graph convolutional networks, namely Anno-GCN, which produces a complete shadow mask from only a few annotation scribbles and thereby provides extra training pairs. We then combine knowledge distillation with these GCN-labeled training data to further improve the performance of the light-weight network. Extensive experiments demonstrate that our method achieves state-of-the-art inference accuracy, computational efficiency, and generalizability with only about 2.97 M parameters.

• A light-weight network is proposed to obtain a shadow mask in real time.
• We present a novel annotation strategy that can generate a complete shadow mask from only a few scribbles.
• By using extra GCN-labeled data and knowledge distillation, we can improve the performance of our light-weight model.

[ABSTRACT FROM AUTHOR]
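The distillation step described in the abstract can be illustrated with a minimal sketch, assuming a standard pixel-wise formulation: a heavy teacher network's shadow predictions supervise the light-weight student on GCN-labeled extra images. The names (student, teacher, gcn_masks), the binary cross-entropy losses, and the weighting alpha and temperature T are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

def distillation_step(student, teacher, images, gcn_masks, alpha=0.5, T=2.0):
    """One training step: hard loss on Anno-GCN-generated masks plus
    soft loss that matches the teacher's shadow probability maps."""
    teacher.eval()
    with torch.no_grad():
        t_logits = teacher(images)      # teacher shadow logits, shape (B, 1, H, W)
    s_logits = student(images)          # light-weight student logits, same shape

    # Hard supervision from the GCN-labeled shadow masks (values in [0, 1]).
    hard_loss = F.binary_cross_entropy_with_logits(s_logits, gcn_masks)

    # Soft supervision: match the temperature-softened teacher probabilities.
    soft_loss = F.binary_cross_entropy_with_logits(s_logits / T,
                                                   torch.sigmoid(t_logits / T))
    return alpha * hard_loss + (1.0 - alpha) * soft_loss

In practice, such a step would be called inside an ordinary training loop and the returned loss back-propagated through the student only, with the teacher kept frozen.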

Details

Language :
English
ISSN :
1077-3142
Volume :
216
Database :
Supplemental Index
Journal :
Computer Vision & Image Understanding
Publication Type :
Academic Journal
Accession number :
155056497
Full Text :
https://doi.org/10.1016/j.cviu.2021.103341