1. Span-Level Dual-Encoder Model for Aspect Sentiment Triplet Extraction
- Author
ZHANG Yunqi, LI Songda, LAN Yuquan, LI Dongxu, ZHAO Hui
- Subjects
sentiment analysis, aspect sentiment triplet extraction (ASTE), pipeline model, span, independent encoders, Electronic computers. Computer science, QA75.5-76.95
- Abstract
Aspect sentiment triplet extraction (ASTE) is a subtask of aspect-based sentiment analysis that aims to identify all aspect terms, their corresponding opinion terms, and the associated sentiment polarities in a sentence. Current approaches adopt either pipeline or end-to-end models. The former cannot handle aspect terms that overlap across triplets and ignores the dependency between opinion terms and sentiment polarities. The latter divides the ASTE task into two subtasks, aspect-opinion extraction and sentiment-polarity classification, and applies multi-task learning through a shared encoder; however, this setting does not distinguish the features required by the two subtasks, leading to feature confusion. SD-ASTE (span-level dual-encoder model for ASTE), a pipeline model with two modules, is proposed to address these problems. The first module extracts aspect terms and opinion terms based on spans; the span feature representation incorporates span head, tail, and length information to capture the boundary information of aspect and opinion terms. The second module judges the sentiment polarity expressed by each aspect-opinion span pair; the span-pair feature representation is based on levitated markers to capture the dependencies among triplet elements. The model uses two independent encoders to extract distinct features for the two modules. Comparative experiments on multiple datasets show that SD-ASTE outperforms state-of-the-art pipeline and end-to-end models, and further experiments confirm the effectiveness of the span feature representation, the span-pair feature representation, and the two independent encoders.
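The span feature representation described above can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, array shapes, and the use of a random length-embedding table are illustrative assumptions; it only shows how head, tail, and length information might be combined into one span vector.

```python
import numpy as np

def span_representation(token_states, head, tail, len_emb):
    # token_states: (seq_len, hidden) contextual states from one of the
    # two independent encoders (hypothetical input here).
    # len_emb: (max_len + 1, hidden) learned length-embedding table,
    # standing in for the abstract's "length information".
    length = tail - head + 1
    # Concatenate head state, tail state, and the length embedding,
    # so the span vector focuses on boundary information.
    return np.concatenate([token_states[head],
                           token_states[tail],
                           len_emb[length]])

# Toy usage: 10 tokens with hidden size 8, span covering tokens 2..4.
rng = np.random.default_rng(0)
states = rng.normal(size=(10, 8))
len_table = rng.normal(size=(9, 8))
span_vec = span_representation(states, head=2, tail=4, len_emb=len_table)
print(span_vec.shape)  # (24,)
```

A span-pair representation for the second module could, under the same assumptions, concatenate two such span vectors (aspect and opinion) before sentiment classification; the paper's levitated-marker mechanism additionally inserts marker tokens into the encoder input, which this sketch does not model.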
- Published
- 2023