
Chinese named entity recognition model based on BERT

Authors :
Yuanyuan Zheng
Hongshuai Liu
Ge Jun
Source :
MATEC Web of Conferences, Vol 336, p 06021 (2021)
Publication Year :
2021
Publisher :
EDP Sciences, 2021.

Abstract

Most current deep learning models ignore the characteristics of Chinese and global information when processing Chinese-language tasks. To address this problem, we constructed a BERT-BiLSTM-Attention-CRF model. In this model, we embedded a BERT pre-trained language model that adopts the Whole Word Mask strategy and added a document-level attention mechanism. Experimental results show that our method performs well on the MSRA corpus, reaching an F1 score of 95.00%.
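
The abstract only names the components of the architecture, so the following PyTorch sketch is purely illustrative and not the authors' implementation: the hfl/chinese-bert-wwm checkpoint, the pytorch-crf package, the 4-head self-attention layer, and all hyperparameters are assumptions, and the generic self-attention merely stands in for the paper's document-level attention, whose exact formulation is not given here.

import torch
import torch.nn as nn
from transformers import BertModel
from torchcrf import CRF  # pip install pytorch-crf


class BertBiLstmAttnCrf(nn.Module):
    """Illustrative sketch of a BERT-BiLSTM-Attention-CRF tagger for Chinese NER."""

    def __init__(self, num_labels: int,
                 bert_name: str = "hfl/chinese-bert-wwm",  # assumed Whole-Word-Mask checkpoint
                 lstm_hidden: int = 256):
        super().__init__()
        # Whole-Word-Mask Chinese BERT as the contextual encoder.
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        # BiLSTM re-encodes BERT outputs to capture sequential dependencies.
        self.bilstm = nn.LSTM(hidden, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Generic multi-head self-attention, standing in for the paper's
        # document-level attention (exact form not specified in the abstract).
        self.attn = nn.MultiheadAttention(2 * lstm_hidden, num_heads=4,
                                          batch_first=True)
        self.classifier = nn.Linear(2 * lstm_hidden, num_labels)
        # Linear-chain CRF decodes globally consistent tag sequences.
        self.crf = CRF(num_labels, batch_first=True)

    def forward(self, input_ids, attention_mask, labels=None):
        seq = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state
        seq, _ = self.bilstm(seq)
        pad = ~attention_mask.bool()  # True where the token is padding
        seq, _ = self.attn(seq, seq, seq, key_padding_mask=pad)
        emissions = self.classifier(seq)
        mask = attention_mask.bool()
        if labels is not None:
            # Training: negative log-likelihood of the gold tags under the CRF.
            return -self.crf(emissions, labels, mask=mask, reduction="mean")
        # Inference: Viterbi decoding of the best tag sequence per sentence.
        return self.crf.decode(emissions, mask=mask)

The CRF layer on top of the per-token emission scores enforces valid label transitions (for example, an I-PER tag cannot follow a B-LOC tag), which is why it is a standard final layer in BIO-style sequence labeling such as NER.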

Details

Language :
English
Volume :
336
Database :
OpenAIRE
Journal :
MATEC Web of Conferences
Accession number :
edsair.doi.dedup.....657402b1c5dd44dd12134258e1dc8f46