TAE: Topic-aware encoder for large-scale multi-label text classification.

Authors:
Qin, Shaowei
Wu, Hao
Zhou, Lihua
Zhao, Yiji
Zhang, Lei
Source:
Applied Intelligence; Apr2024, Vol. 54 Issue 8, p6269-6284, 16p
Publication Year:
2024

Abstract

Convolutional neural networks, recurrent neural networks, and transformers have excelled in representation learning for large-scale multi-label text classification. However, few works have incorporated topic information when encoding textual sequential semantics, partly because the text's topics must be modeled separately. To address this, we introduce the latent topic-aware encoder (TAE), designed for large-scale multi-label text classification. The TAE has two key components: a latent topic attention module that correlates latent topic vectors with word hidden vectors, and a topic-fused channel attention module that combines the resulting topic-specific text representations into a refined final text representation. Our experiments demonstrate that TAE integrates seamlessly with existing deep models, significantly improving their classification accuracy and convergence speed across various datasets. [ABSTRACT FROM AUTHOR]
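The abstract's two-stage design can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function names, dimensions, and the simple mean-pooled gating in the channel-attention step are assumptions made only to show the data flow: topics attend over word hidden vectors to produce topic-specific representations, which are then weighted and fused into one final text vector.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def latent_topic_attention(H, T):
    # H: (L, d) word hidden vectors; T: (K, d) latent topic vectors.
    # Each topic attends over the L words, yielding one
    # topic-specific text representation per topic: (K, d).
    scores = T @ H.T / np.sqrt(H.shape[1])   # (K, L) similarity scores
    return softmax(scores, axis=1) @ H       # (K, d)

def channel_attention(S):
    # S: (K, d) topic-specific representations, treated as K channels.
    # A simple gate (assumed here: softmax over per-channel means)
    # weights the channels and sums them into one vector: (d,).
    gate = softmax(S.mean(axis=1))           # (K,) channel weights
    return gate @ S                          # (d,) final representation

rng = np.random.default_rng(0)
H = rng.standard_normal((12, 16))  # 12 words, hidden size 16
T = rng.standard_normal((4, 16))   # 4 latent topics
S = latent_topic_attention(H, T)
z = channel_attention(S)
assert S.shape == (4, 16) and z.shape == (16,)
```

In the paper's setting, `z` would be fed to a (multi-label) classification head; the encoder producing `H` could be any CNN, RNN, or transformer, which is how TAE plugs into existing deep models.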

Details

Language:
English
ISSN:
0924-669X
Volume:
54
Issue:
8
Database:
Complementary Index
Journal:
Applied Intelligence
Publication Type:
Academic Journal
Accession Number:
177897422
Full Text:
https://doi.org/10.1007/s10489-024-05485-z