A Unified Asymmetric Knowledge Distillation Framework for Image Classification.
- Source :
- Neural Processing Letters; Jun 2024, Vol. 56 Issue 3, p1-25, 25p
- Publication Year :
- 2024
Abstract
- Knowledge distillation is a model compression technique that transfers knowledge learned by a teacher network to a student network. Existing knowledge distillation methods have greatly expanded the forms of transferred knowledge, but they have also made distillation models complex, and most remain symmetric. Few studies have explored the commonalities among these methods. In this study, we propose a concise distillation framework that unifies these methods, together with a method for constructing asymmetric knowledge distillation within the framework. Asymmetric distillation aims to enable differentiated knowledge transfer between different distillation objects. We design a multi-stage shallow-wide branch bifurcation method to distill different knowledge representations, and a grouping ensemble strategy that supervises the network to teach and learn selectively. We evaluate the proposed method on image classification benchmarks. Experimental results show that our implementation achieves considerable improvements over existing methods, demonstrating the effectiveness of the method and the potential of the framework.
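- For context, the sketch below illustrates the standard (symmetric) knowledge distillation objective that such frameworks generalize: the student is trained on a mix of temperature-softened teacher outputs and hard labels. This is the classic formulation attributed to Hinton et al., not the asymmetric losses of this paper, which the abstract does not specify; the temperature T and mixing weight alpha are conventional hyperparameters chosen for illustration.

```python
# Illustrative sketch of the standard knowledge distillation loss.
# NOT the paper's asymmetric method; T and alpha are assumed values.
import torch
import torch.nn.functional as F

def kd_loss(student_logits: torch.Tensor,
            teacher_logits: torch.Tensor,
            labels: torch.Tensor,
            T: float = 4.0,
            alpha: float = 0.9) -> torch.Tensor:
    """Blend soft-target distillation with the usual hard-label loss."""
    # Temperature-softened distributions; T*T rescales the gradient
    # magnitude so the soft term stays comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Example usage with random tensors (batch of 8, 10 classes):
if __name__ == "__main__":
    s = torch.randn(8, 10)
    t = torch.randn(8, 10)
    y = torch.randint(0, 10, (8,))
    print(kd_loss(s, t, y))
```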
Details
- Language :
- English
- ISSN :
- 1370-4621
- Volume :
- 56
- Issue :
- 3
- Database :
- Complementary Index
- Journal :
- Neural Processing Letters
- Publication Type :
- Academic Journal
- Accession number :
- 177211792
- Full Text :
- https://doi.org/10.1007/s11063-024-11606-z