
Maximal activation weighted memory for aspect based sentiment analysis.

Authors :
Mokhosi, Refuoe
Shikali, Casper
Qin, Zhiguang
Liu, Qiao
Source :
Computer Speech & Language. Nov 2022, Vol. 76.
Publication Year :
2022

Abstract

The vast diffusion of social networks has made an unprecedented amount of user-generated data available, increasing the importance of Aspect Based Sentiment Analysis (ABSA) for extracting sentiment polarity. Although recent research efforts favor self-attention networks for the ABSA task, they still struggle to extract long-distance relations between non-adjacent words, especially when a sentence contains more than one aspect. We propose the BERT-MAM model, which approaches the ABSA task as a memory activation process regulated by memory decay and word similarity, implying that the importance of a word decays over time until it is reactivated by a similarity boost. We base our experiments on the less commonly used Bidirectional Encoder Representations from Transformers (BERT) and achieve competitive results on the Laptop and Restaurant datasets.

• Aspect Based Sentiment Analysis.
• Activation weighted memory.
• Bidirectional Encoder Representations from Transformers.
• Memory decay.

[ABSTRACT FROM AUTHOR]
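The decay-and-reactivation idea in the abstract can be illustrated with a minimal sketch: each word's activation decays geometrically over time steps but is reset upward whenever its similarity to the aspect exceeds the decayed value, and the resulting activations are normalized into attention-style weights. The function name, the `max`-based recurrence, and the decay constant below are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def activation_weights(sim, decay=0.8):
    """Illustrative sketch (not the paper's exact model): per-token
    activation decays by `decay` each step, but is reactivated whenever
    the token's similarity to the aspect exceeds the decayed value."""
    sim = np.asarray(sim, dtype=float)
    a = np.zeros_like(sim)
    prev = 0.0
    for t, s in enumerate(sim):
        prev = max(decay * prev, s)  # decay, or reactivate on a similarity boost
        a[t] = prev
    return a / (a.sum() + 1e-9)  # normalize to attention-style weights

# Example: the fourth token's high similarity (0.9) overrides the decayed
# activation carried over from the first token.
w = activation_weights([1.0, 0.2, 0.1, 0.9], decay=0.8)
```

With the example input, the unnormalized activations are [1.0, 0.8, 0.64, 0.9]: the second and third tokens inherit decayed importance from the first, while the fourth is reactivated by its own similarity.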

Details

Language :
English
ISSN :
0885-2308
Volume :
76
Database :
Academic Search Index
Journal :
Computer Speech & Language
Publication Type :
Academic Journal
Accession number :
157301347
Full Text :
https://doi.org/10.1016/j.csl.2022.101402