Real-Time Object Detection Method Based on Improved Attention Transfer (基于改进注意力迁移的实时目标检测方法)
- Author
- 张弛 and 刘宏哲
- Subjects
- Deep learning, distillation, machine learning, memory, classification
- Abstract
Deep neural networks increasingly need to be deployed on devices with limited memory and computing resources, which calls for efficient, compact network architectures. This paper proposes a model compression method (KE) based on improved attention transfer for designing compact neural networks: a wide residual teacher network (WRN) guides a compact student network (KENet) by transferring both spatial and channel-wise attention to improve its performance, and the method is then applied to real-time object detection. Image classification experiments on CIFAR verify that knowledge distillation with improved attention transfer improves the performance of the compact model. Object detection experiments on VOC verify that the resulting model, KEDet, achieves good accuracy (72.7 mAP) and speed (86 fps). The experimental results show that the object detection model based on improved attention transfer offers both good accuracy and real-time performance. [ABSTRACT FROM AUTHOR]
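The abstract names the mechanism, transferring both spatial and channel-wise attention from the WRN teacher to the compact KENet student, but does not spell out the loss. Below is a minimal PyTorch sketch of that idea, assuming the activation-based attention-transfer formulation of Zagoruyko and Komodakis plus an analogous channel-wise term; the function names, the squared-activation pooling, the `beta` weight, and the requirement that paired feature maps share the same shape are illustrative assumptions, not the paper's exact KE method.

```python
import torch
import torch.nn.functional as F

def spatial_attention(feat):
    # feat: (N, C, H, W) -> (N, H*W). Channel-wise mean of squared
    # activations gives a spatial attention map, L2-normalised per sample.
    a = feat.pow(2).mean(dim=1).flatten(1)
    return F.normalize(a, dim=1)

def channel_attention(feat):
    # feat: (N, C, H, W) -> (N, C). Spatial mean of squared activations
    # gives a channel attention vector, L2-normalised per sample.
    a = feat.pow(2).mean(dim=(2, 3))
    return F.normalize(a, dim=1)

def attention_transfer_loss(student_feats, teacher_feats, beta=1.0):
    # Sum of spatial and channel attention-matching terms over paired layers.
    # Assumes each student/teacher feature pair has identical (C, H, W);
    # mismatched channel widths would need a 1x1 conv adapter on the student.
    loss = 0.0
    for fs, ft in zip(student_feats, teacher_feats):
        loss = loss + F.mse_loss(spatial_attention(fs), spatial_attention(ft))
        loss = loss + beta * F.mse_loss(channel_attention(fs), channel_attention(ft))
    return loss
```

In training, such a distillation term would typically be added, with a weighting factor, to the student's ordinary classification or detection loss.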
- Published
- 2021