
Learning metric space with distillation for large-scale multi-label text classification.

Authors :
Qin, Shaowei
Wu, Hao
Zhou, Lihua
Li, Jiahui
Du, Guowang
Source :
Neural Computing & Applications; May2023, Vol. 35 Issue 15, p11445-11458, 14p
Publication Year :
2023

Abstract

Deep neural network-based methods have achieved outstanding results in text classification. However, most existing methods do not thoroughly investigate the text–label and label–label relationships, and they incur excessive computational and memory overhead for large-scale classification. To address these challenges, we propose a novel framework combining metric learning and knowledge distillation. We first project texts and labels into the same embedding space by applying symmetric metric learning to both text-centric and label-centric relationships. A distillation component is then introduced to learn text representation features with a deep module. Finally, we use this distilled module to encode new texts and make predictions against the label embeddings in the metric space. Experimental results on four real datasets show that our model achieves highly competitive prediction accuracy while improving training and prediction efficiency. [ABSTRACT FROM AUTHOR]
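The prediction step the abstract describes — encode a text with the distilled module, then score it against label embeddings in the shared metric space — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the toy vectors, the cosine-similarity scoring, and the top-k thresholding are all assumptions standing in for the learned encoder and metric.

```python
import numpy as np

def cosine_scores(text_emb, label_embs):
    """Score one text against every label by cosine similarity in the shared space."""
    t = text_emb / np.linalg.norm(text_emb)
    L = label_embs / np.linalg.norm(label_embs, axis=1, keepdims=True)
    return L @ t  # one similarity score per label

# Toy shared embedding space (d = 4): each row is a learned label embedding.
label_embs = np.array([
    [1.0, 0.0, 0.0, 0.0],  # label 0
    [0.0, 1.0, 0.0, 0.0],  # label 1
    [0.0, 0.7, 0.7, 0.0],  # label 2
])

# A distilled text encoder would map raw text into this space; here a fixed
# vector stands in for one encoded document (hypothetical values).
text_emb = np.array([0.1, 0.9, 0.3, 0.0])

scores = cosine_scores(text_emb, label_embs)
# Multi-label prediction: keep the top-k labels by similarity (k = 2 here).
top2 = np.argsort(-scores)[:2]
```

Because every label is just a point in the metric space, prediction reduces to a nearest-neighbor search over label embeddings, which is what makes inference cheap at large label counts.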

Details

Language :
English
ISSN :
0941-0643
Volume :
35
Issue :
15
Database :
Complementary Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
163294503
Full Text :
https://doi.org/10.1007/s00521-023-08308-3