Siamese Labels Auxiliary Learning

Authors:
Gan, Wenrui
Liu, Zhulin
Chen, C. L. Philip
Zhang, Tong
Publication Year: 2021

Abstract

In deep learning, auxiliary training has been widely used to assist the training of models. During the training phase, auxiliary modules assist training and can improve model performance; during the testing phase, the auxiliary modules can be removed, so the number of test-time parameters does not increase. In this paper, we propose a novel auxiliary training method, Siamese Labels Auxiliary Learning (SiLa). Unlike Deep Mutual Learning (DML), SiLa emphasizes auxiliary learning and can be easily combined with DML. The main contributions of this paper are: (1) we propose SiLa Learning, which improves the performance of common models without increasing test parameters; (2) we compare SiLa with DML and show that SiLa improves the generalization of the model; (3) we apply SiLa to Dynamic Neural Networks and show that SiLa can be used with various types of network structures.
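
The abstract only describes the general auxiliary-training pattern (an extra module supervises training and is discarded at test time); the specifics of the siamese-label loss are not given here. The PyTorch sketch below illustrates that generic pattern only, under assumed names (BackboneWithAuxHead, aux_head) and an assumed auxiliary loss weight of 0.4; it is not the paper's SiLa method.

# Minimal sketch of the generic auxiliary-training pattern described in the
# abstract. Names, architecture, and the 0.4 loss weight are assumptions,
# not the paper's SiLa formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BackboneWithAuxHead(nn.Module):
    """Small classifier with an auxiliary head that is only used in training."""
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.main_head = nn.Linear(16, num_classes)
        # Auxiliary head: contributes an extra loss during training only.
        # After training it can be deleted, so the deployed model carries
        # no additional test-time parameters.
        self.aux_head = nn.Linear(16, num_classes)

    def forward(self, x):
        feats = self.features(x)
        if self.training:
            return self.main_head(feats), self.aux_head(feats)
        return self.main_head(feats)  # auxiliary head unused at test time

model = BackboneWithAuxHead()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(8, 3, 32, 32)          # dummy batch of images
y = torch.randint(0, 10, (8,))         # dummy labels

# Training step: main loss plus weighted auxiliary loss.
model.train()
opt.zero_grad()
main_logits, aux_logits = model(x)
loss = F.cross_entropy(main_logits, y) + 0.4 * F.cross_entropy(aux_logits, y)
loss.backward()
opt.step()

# Inference: only the main head is evaluated.
model.eval()
with torch.no_grad():
    preds = model(x).argmax(dim=1)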

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2103.00200
Document Type: Working Paper