
Enhancing BERT Representation With Context-Aware Embedding for Aspect-Based Sentiment Analysis

Authors:
Xinlong Li
Xingyu Fu
Guangluan Xu
Yang Yang
Jiuniu Wang
Li Jin
Qing Liu
Tianyuan Xiang
Source:
IEEE Access, Vol. 8, pp. 46868-46876 (2020)
Publication Year:
2020
Publisher:
IEEE, 2020.

Abstract

Aspect-based sentiment analysis, which aims to predict the sentiment polarity for a given aspect or target, is a broad and challenging research area. Recently, pre-trained models such as BERT have been applied to aspect-based sentiment analysis. This fine-grained task needs auxiliary information to distinguish each aspect, but the input to BERT is only a word sequence, which cannot provide such extra contextual information. To address this problem, we introduce a new method named GBCN, which uses a gating mechanism with context-aware aspect embeddings to enhance and control the BERT representation for aspect-based sentiment analysis. First, the input texts are fed into BERT and a context-aware embedding layer to generate the BERT representation and refined context-aware embeddings separately; these refined embeddings contain the most correlated information selected from the context. Then, a gating mechanism controls the propagation of sentiment features from the BERT output using the context-aware embeddings. Our model obtains new state-of-the-art results on the SentiHood and SemEval-2014 datasets, achieving test F1 scores of 88.0 and 92.9, respectively.
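
The gating step described above can be pictured with a minimal PyTorch sketch. This is not the authors' exact GBCN implementation; the hidden size, the concatenation-based gate, and the convex combination of the two streams are assumptions made for illustration only.

```python
import torch
import torch.nn as nn


class GatedFusion(nn.Module):
    """Hypothetical sketch of a gating layer that lets context-aware aspect
    embeddings control how sentiment features from BERT propagate onward."""

    def __init__(self, hidden_size: int = 768):  # 768 assumed (BERT-base)
        super().__init__()
        # Gate is computed from the concatenation of both representations
        # (one plausible design; the paper may parameterize it differently).
        self.gate = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, bert_hidden: torch.Tensor, ctx_embed: torch.Tensor) -> torch.Tensor:
        # bert_hidden, ctx_embed: (batch, seq_len, hidden_size)
        g = torch.sigmoid(self.gate(torch.cat([bert_hidden, ctx_embed], dim=-1)))
        # The gate blends the two streams element-wise before classification.
        return g * bert_hidden + (1.0 - g) * ctx_embed


# Usage example with random tensors standing in for real model outputs.
fusion = GatedFusion()
bert_out = torch.randn(2, 16, 768)      # would come from a BERT encoder
ctx_emb = torch.randn(2, 16, 768)       # would come from the context-aware embedding layer
fused = fusion(bert_out, ctx_emb)       # (2, 16, 768), fed to a sentiment classifier
```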

Details

Language:
English
ISSN:
2169-3536
Volume:
8
Database:
Directory of Open Access Journals
Journal:
IEEE Access
Publication Type:
Academic Journal
Accession number:
edsdoj.2fffd76169c546eabc8122fa27910f23
Document Type:
Article
Full Text:
https://doi.org/10.1109/ACCESS.2020.2978511