
Position-aware hierarchical transfer model for aspect-level sentiment classification.

Authors:
Zhou, Jie
Chen, Qin
Huang, Jimmy Xiangji
Hu, Qinmin Vivian
He, Liang
Source:
Information Sciences, Mar 2020, Vol. 513, p. 1-16. 16p.
Publication Year:
2020

Abstract

Recently, attention-based neural networks (NNs) have been widely used for aspect-level sentiment classification (ASC). Most neural models focus on incorporating the aspect representation into attention; however, the position information of each aspect has not been well studied. Furthermore, existing ASC datasets are relatively small owing to labor-intensive labeling, which largely limits the performance of NNs. In this paper, we propose a position-aware hierarchical transfer (PAHT) model that models position information at multiple levels and enhances ASC performance by transferring hierarchical knowledge from a resource-rich sentence-level sentiment classification (SSC) dataset. We first present aspect-based positional attention at the word and segment levels to capture more salient information toward a given aspect. To make up for the limited data for ASC, we devise three sampling strategies to select related instances from the large-scale SSC dataset for pre-training, and we transfer the learned knowledge into ASC at four levels: embedding, word, segment and classifier. Extensive experiments on four benchmark datasets demonstrate that the proposed model is effective in improving the performance of ASC. In particular, our model outperforms the state-of-the-art approaches in terms of accuracy on all the datasets considered. [ABSTRACT FROM AUTHOR]
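The aspect-based positional attention summarized above can be illustrated with a minimal sketch. This is an assumed formulation for illustration only, not the paper's exact equations: each word's attention score toward the aspect is scaled by a position weight that decays with distance from the aspect term, so nearby words contribute more to the pooled representation.

```python
import numpy as np

def positional_attention(hidden, aspect_vec, aspect_pos):
    """Toy position-aware attention over word hidden states (illustrative sketch).

    hidden:     (seq_len, dim) array of word hidden states
    aspect_vec: (dim,) aspect representation
    aspect_pos: index of the aspect term in the sentence
    """
    seq_len = hidden.shape[0]
    # Position weight: decays linearly with distance from the aspect term.
    dist = np.abs(np.arange(seq_len) - aspect_pos)
    pos_weight = 1.0 - dist / seq_len
    # Relevance score between each word state and the aspect vector.
    scores = hidden @ aspect_vec
    # Combine relevance with position, then normalize via softmax.
    weighted = scores * pos_weight
    alpha = np.exp(weighted - weighted.max())
    alpha /= alpha.sum()
    # Attention-pooled sentence representation toward the given aspect.
    return alpha @ hidden
```

In the paper this weighting is applied hierarchically, at both the word and segment levels; the sketch shows only the word level with a linear decay, while the actual decay function and combination are the authors' design choices.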

Details

Language:
English
ISSN:
0020-0255
Volume:
513
Database:
Academic Search Index
Journal:
Information Sciences
Publication Type:
Periodical
Accession Number:
141111710
Full Text:
https://doi.org/10.1016/j.ins.2019.11.048