1. Siamese labels auxiliary learning.
- Authors
- Gan, Wenrui; Liu, Zhulin; Chen, C. L. Philip; Zhang, Tong
- Subjects
- DEEP learning; PERFORMANCE standards; LEARNING; CLASSIFICATION
- Abstract
In deep learning, auxiliary modules for model training, such as Deep Mutual Learning (DML) and Multi-Scale Dense Convolutional Networks (MSDNet), have become increasingly popular because they can improve model performance without increasing the amount of test-time computation. Nevertheless, current research does not fully exploit the knowledge shared between different auxiliary modules. This paper proposes a new model training technique, Siamese Labels Auxiliary (SiLA) Learning, in which the SiLA module concatenates the outputs of the auxiliary modules to obtain auxiliary information, so that the knowledge learned by different modules can be fully exploited. Our experiments show that SiLA Learning effectively improves the performance of standard models and achieves convincing results on image classification tasks without increasing the amount of test-time computation. Moreover, the experiments show that SiLA Learning can easily be combined with methods such as DML and MSDNet to exploit their knowledge fully and achieve the best performance. [ABSTRACT FROM AUTHOR]
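The core mechanism the abstract names, concatenating the outputs of auxiliary modules to form auxiliary information that feeds an extra training loss, might be sketched as below. This is a minimal NumPy illustration under stated assumptions: the branch names, feature sizes, auxiliary classifier head, and the 0.5 loss weight are all illustrative choices, not details from the paper.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the class axis.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(logits, labels):
    p = softmax(logits)
    return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()

rng = np.random.default_rng(0)
batch, feat, classes = 4, 8, 3

# Stand-ins for the outputs (features) of two auxiliary modules
# attached to the same backbone network.
branch_a = rng.normal(size=(batch, feat))
branch_b = rng.normal(size=(batch, feat))

# SiLA-style step as described in the abstract: concatenate the module
# outputs so the auxiliary head sees knowledge from both modules.
fused = np.concatenate([branch_a, branch_b], axis=1)  # shape (batch, 2*feat)

# Hypothetical linear auxiliary classifier on the fused features.
W_aux = rng.normal(size=(2 * feat, classes))
aux_logits = fused @ W_aux

labels = rng.integers(0, classes, size=batch)
aux_loss = cross_entropy(aux_logits, labels)

# Training loss = main-head loss + weighted auxiliary loss; the auxiliary
# head is dropped at test time, so inference cost is unchanged.
main_logits = rng.normal(size=(batch, classes))
total_loss = cross_entropy(main_logits, labels) + 0.5 * aux_loss
print(fused.shape, float(total_loss))
```

Because the auxiliary head only contributes a loss term during training, discarding it at inference matches the abstract's claim that test-time computation does not increase.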
- Published
- 2023