1. Structured Knowledge Distillation for Dense Prediction
- Authors
Yifan Liu, Chunhua Shen, Changyong Shu, and Jingdong Wang
- Subjects
Computer Science - Computer Vision and Pattern Recognition (cs.CV), Artificial Intelligence, Structured Prediction, Distillation, Knowledge Engineering, Contextual Image Classification, Image Segmentation, Object Detection, Graph (Abstract Data Type), Applied Mathematics, Computational Theory and Mathematics, Software
- Abstract
In this work, we consider transferring the structure information from large networks to compact ones for dense prediction tasks in computer vision. Previous knowledge distillation strategies used for dense prediction tasks often directly borrow the distillation scheme for image classification and perform knowledge distillation for each pixel separately, leading to sub-optimal performance. Here we propose to distill structured knowledge from large networks to compact networks, taking into account the fact that dense prediction is a structured prediction problem. Specifically, we study two structured distillation schemes: i) pair-wise distillation, which distills pair-wise similarities by building a static graph; and ii) holistic distillation, which uses adversarial training to distill holistic knowledge. The effectiveness of our knowledge distillation approaches is demonstrated by experiments on three dense prediction tasks: semantic segmentation, depth estimation, and object detection. Code is available at: https://git.io/StructKD
- Comment
v1: 10 pages, accepted to CVPR 2019; v2: 15 pages, journal version; code is available at https://github.com/irfanICMLL/structure_knowledge_distillation; typos fixed.
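As a rough illustration of the pair-wise distillation scheme described in the abstract, the sketch below matches pair-wise feature similarities between teacher and student feature maps in PyTorch. The function name, tensor shapes, and choice of cosine affinity with a squared-error penalty are assumptions for illustration only, not the authors' released implementation.

```python
# Hypothetical sketch of pair-wise distillation: treat spatial locations of a
# feature map as nodes of a static, fully connected graph and match the
# teacher's and student's pair-wise affinity matrices.
import torch
import torch.nn.functional as F

def pairwise_distillation_loss(feat_s: torch.Tensor, feat_t: torch.Tensor) -> torch.Tensor:
    """feat_s, feat_t: student/teacher feature maps of shape (B, C, H, W),
    assumed to have matching spatial resolution."""
    b, c, h, w = feat_s.shape
    # Flatten spatial positions into graph nodes: (B, C, N) with N = H * W.
    s = feat_s.view(b, c, h * w)
    t = feat_t.view(b, c, h * w)
    # L2-normalise channel vectors so each affinity is a cosine similarity.
    s = F.normalize(s, dim=1)
    t = F.normalize(t, dim=1)
    # Pair-wise similarity (affinity) matrices of shape (B, N, N).
    sim_s = torch.bmm(s.transpose(1, 2), s)
    sim_t = torch.bmm(t.transpose(1, 2), t)
    # Penalise the discrepancy between the two affinity matrices.
    return F.mse_loss(sim_s, sim_t)
```

In practice such a term would be added to the ordinary task loss (e.g. per-pixel cross-entropy for segmentation) with a weighting coefficient; the holistic scheme additionally trains a discriminator adversarially on whole prediction maps.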
- Published
2023