Tree Framework With BERT Word Embedding for the Recognition of Chinese Implicit Discourse Relations
- Source :
- IEEE Access, Vol. 8, pp. 162004-162011 (2020)
- Publication Year :
- 2020
- Publisher :
- IEEE, 2020.
Abstract
- Implicit discourse relation recognition (DRR), in which relations are not explicitly marked with connectives, remains a challenging task. Traditional approaches to implicit DRR in Chinese have focused on exploring the concepts and features of words, but progress with these approaches has been slow. Moreover, the scarcity of labeled Chinese data makes it difficult to perform this task with high accuracy. To address this issue, we propose a novel hybrid DRR model that combines a pretrained language model, bidirectional encoder representations from transformers (BERT), with recurrent neural networks. We use BERT as the text representation and pretraining model. In addition, we apply a tree structure to Chinese implicit DRR to produce hierarchical classes. The 19-class F1 score of the proposed method reaches 74.47% on the HIT-CIR Chinese discourse relation corpus. The results show that combining BERT with the proposed tree structure yields a novel and precise method that can automatically recognize implicit relations in Chinese discourse.
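- The abstract describes the architecture only at a high level (BERT representations, a recurrent layer, and a tree-structured hierarchy of relation classes). The following is a minimal illustrative sketch of that kind of pipeline, not the authors' released code: the model name, class counts, pooling, and the coarse-to-fine wiring of the two heads are assumptions made here for illustration.

```python
# Illustrative sketch (not the paper's implementation): BERT encodes a pair of
# discourse arguments, a bidirectional GRU re-processes the token states, and
# two classification heads imitate a two-level "tree" of relation classes
# (a coarse head, then a fine 19-way head conditioned on the coarse logits).
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class TreeBertDRR(nn.Module):
    def __init__(self, num_coarse=4, num_fine=19, hidden=256):
        super().__init__()
        # "bert-base-chinese" is an assumed choice of Chinese BERT checkpoint.
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        self.rnn = nn.GRU(self.bert.config.hidden_size, hidden,
                          batch_first=True, bidirectional=True)
        # Coarse head predicts a top-level relation class; the fine head
        # conditions on the coarse logits, approximating a hierarchical tree.
        self.coarse_head = nn.Linear(2 * hidden, num_coarse)
        self.fine_head = nn.Linear(2 * hidden + num_coarse, num_fine)

    def forward(self, input_ids, attention_mask):
        token_states = self.bert(input_ids=input_ids,
                                 attention_mask=attention_mask).last_hidden_state
        rnn_out, _ = self.rnn(token_states)
        pooled = rnn_out.mean(dim=1)  # simple mean pooling over tokens
        coarse_logits = self.coarse_head(pooled)
        fine_logits = self.fine_head(torch.cat([pooled, coarse_logits], dim=-1))
        return coarse_logits, fine_logits


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
    # Two discourse arguments encoded as a sentence pair (toy example).
    batch = tokenizer("他没有赶上火车。", "他迟到了一个小时。",
                      return_tensors="pt", padding=True)
    model = TreeBertDRR()
    coarse, fine = model(batch["input_ids"], batch["attention_mask"])
    print(coarse.shape, fine.shape)  # torch.Size([1, 4]) torch.Size([1, 19])
```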
- Subjects :
- Discourse relation
Word embedding
General Computer Science
Computer science
General Engineering
deep neural network
Tree (data structure)
Discourse relation recognition
Recurrent neural network
Tree structure
General Materials Science
Language model
Artificial intelligence
lcsh:Electrical engineering. Electronics. Nuclear engineering
F1 score
lcsh:TK1-9971
Natural language processing
BERT
Details
- Language :
- English
- ISSN :
- 2169-3536
- Volume :
- 8
- Database :
- OpenAIRE
- Journal :
- IEEE Access
- Accession number :
- edsair.doi.dedup.....3e75bd267af091f05acc2bddd3553bde