Semi-supervised transformable architecture search for feature distillation.
- Source :
- Pattern Analysis & Applications. May 2023, Vol. 26 Issue 2, p669-677. 9p.
- Publication Year :
- 2023
Abstract
- The proposed method aims to perform image classification efficiently and accurately. Traditional CNN-based image classification methods are strongly affected by the number of labels and the depth of the network: although a deeper network can improve accuracy, its training is usually time-consuming and laborious. We show how to use only a small number of labels, design a more flexible network architecture, and combine it with a feature distillation method to improve model efficiency while ensuring high accuracy. Specifically, we decompose different network structures into independent individuals, making the use of network structures more flexible. Building on knowledge distillation, we extract channel features and establish a feature-distillation connection from the teacher network to the student network. The effectiveness of the method is demonstrated by comparing experimental results with other popular related methods on commonly used data sets. The code can be found at https://github.com/ZhangXinba/Semi_FD. [ABSTRACT FROM AUTHOR]
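The abstract describes distilling channel features from a teacher network into a student network. A minimal sketch of that idea is shown below; the global-average-pooling of each channel and the mean-squared-error objective are illustrative assumptions, not the authors' exact formulation from the paper.

```python
import numpy as np

def channel_distillation_loss(teacher_feat, student_feat):
    """Toy channel-wise feature distillation loss (illustrative sketch).

    teacher_feat, student_feat: arrays of shape (C, H, W).
    Each channel is collapsed to one descriptor by global average
    pooling; the loss is the mean squared distance between the
    teacher's and student's channel descriptors.
    """
    t = teacher_feat.mean(axis=(1, 2))  # (C,) teacher channel descriptors
    s = student_feat.mean(axis=(1, 2))  # (C,) student channel descriptors
    return float(np.mean((t - s) ** 2))
```

In a full training loop this term would be added, with a weighting factor, to the student's usual classification loss on the labeled subset of the data.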
- Subjects :
- *IMAGE recognition (Computer vision)
*FLEXIBLE structures
Details
- Language :
- English
- ISSN :
- 1433-7541
- Volume :
- 26
- Issue :
- 2
- Database :
- Academic Search Index
- Journal :
- Pattern Analysis & Applications
- Publication Type :
- Academic Journal
- Accession number :
- 163122007
- Full Text :
- https://doi.org/10.1007/s10044-022-01122-y