SSAG-Net: Syntactic and Semantic Attention-Guided Machine Reading Comprehension.
- Source :
- Intelligent Automation & Soft Computing; 2022, Vol. 34 Issue 3, p2023-2034, 12p
- Publication Year :
- 2022
Abstract
- Machine reading comprehension (MRC) is a natural language understanding task in which a machine reads a text and answers questions about it. Traditional attention methods typically focus on either syntax or semantics, or integrate the two through manual rules, so the model cannot fully exploit syntactic and semantic information for MRC tasks. To better capture syntactic and semantic information and improve machine reading comprehension, our study uses syntactic and semantic attention to model the text. Building on a BERT model with a Transformer encoder, we split the text representation into two branches: a syntactic branch and a semantic branch. In the syntactic branch, an attention model with explicit syntactic constraints is combined with a contextual self-attention model. In the semantic branch, after frame semantic parsing, a lexical-unit attention model processes the text. Finally, the vectors from the two branches are fused into a new vector, from which answers are predicted for different types of data. The result is a syntactic and semantic attention-guided machine reading comprehension network (SSAG-Net). To validate the model, we evaluated it on two MRC tasks, SQuAD 2.0 and MCTest, and SSAG-Net outperformed the baseline model on both. [ABSTRACT FROM AUTHOR]
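- The abstract describes a dual-branch design: a syntax-constrained attention branch and a frame-semantic lexical-unit attention branch over a shared BERT encoding, fused before answer prediction. The sketch below illustrates that fusion idea only under stated assumptions; the module names, dimensions, mask conventions, and the simple concatenation fusion are hypothetical choices for illustration, not the authors' published implementation.

```python
import torch
import torch.nn as nn

class DualBranchFusion(nn.Module):
    """Illustrative sketch of a two-branch attention fusion over a shared
    contextual encoding. Hypothetical; not the SSAG-Net implementation."""

    def __init__(self, hidden=768, heads=8):
        super().__init__()
        # Syntactic branch: self-attention whose mask could encode explicit
        # syntactic constraints (e.g., dependency-based reachability).
        self.syntactic_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Semantic branch: attention that can be restricted to lexical-unit
        # (frame-evoking) tokens via a key padding mask.
        self.semantic_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.fuse = nn.Linear(2 * hidden, hidden)

    def forward(self, enc, syntax_mask=None, lexical_unit_mask=None):
        # enc: (batch, seq, hidden) contextual encoding, e.g. BERT output.
        # syntax_mask: (seq, seq) boolean, True = attention disallowed.
        syn, _ = self.syntactic_attn(enc, enc, enc, attn_mask=syntax_mask)
        # lexical_unit_mask: (batch, seq) boolean, True = not a lexical unit,
        # so attention keys concentrate on frame-evoking tokens.
        sem, _ = self.semantic_attn(enc, enc, enc, key_padding_mask=lexical_unit_mask)
        # Concatenate the two branch outputs and project back to hidden size;
        # the fused vector would feed an answer-prediction head.
        return self.fuse(torch.cat([syn, sem], dim=-1))

if __name__ == "__main__":
    enc = torch.randn(2, 16, 768)   # stand-in for a BERT encoding
    model = DualBranchFusion()
    print(model(enc).shape)         # torch.Size([2, 16, 768])
```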
Details
- Language :
- English
- ISSN :
- 1079-8587
- Volume :
- 34
- Issue :
- 3
- Database :
- Complementary Index
- Journal :
- Intelligent Automation & Soft Computing
- Publication Type :
- Academic Journal
- Accession number :
- 157580882
- Full Text :
- https://doi.org/10.32604/iasc.2022.029447