Chinese named entity recognition model based on BERT
- Source :
- MATEC Web of Conferences, Vol 336, p 06021 (2021)
- Publication Year :
- 2021
- Publisher :
- EDP Sciences, 2021.
Abstract
- Nowadays, most deep learning models ignore Chinese language habits and global, document-level information when processing Chinese tasks. To address this problem, we constructed a BERT-BiLSTM-Attention-CRF model. In this model, we embedded a BERT pre-trained language model that adopts the Whole Word Masking strategy, and added a document-level attention mechanism. Experimental results show that our method achieves good results on the MSRA corpus, with an F1 score of 95.00%.
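The abstract's architecture ends in a CRF layer, whose job at inference time is to pick the globally best tag sequence rather than the best tag per token. The paper provides no code, so the following is only an illustrative sketch of that decoding step (Viterbi search) over toy BIO tags; the emission and transition scores are made-up numbers standing in for the BiLSTM/attention outputs and the learned CRF transitions.

```python
# Illustrative Viterbi decoding, the inference step of a CRF tagging layer.
# Emission scores stand in for BiLSTM/attention outputs; all values are toy numbers.
def viterbi_decode(emissions, transitions):
    """Return the highest-scoring tag sequence.

    emissions: list of {tag: score} dicts, one per token.
    transitions: {(prev_tag, tag): score}; missing pairs default to 0.0.
    """
    tags = list(emissions[0].keys())
    score = dict(emissions[0])          # best score ending in each tag so far
    backpointers = []                   # per step: tag -> best previous tag
    for emission in emissions[1:]:
        new_score, bp = {}, {}
        for tag in tags:
            prev = max(tags, key=lambda p: score[p] + transitions.get((p, tag), 0.0))
            new_score[tag] = score[prev] + transitions.get((prev, tag), 0.0) + emission[tag]
            bp[tag] = prev
        score = new_score
        backpointers.append(bp)
    # Backtrack from the best final tag.
    best = max(tags, key=score.get)
    path = [best]
    for bp in reversed(backpointers):
        path.append(bp[path[-1]])
    return list(reversed(path))

# Toy 3-token sentence with BIO tags; a strong penalty keeps "I" from
# following "O", which is how a CRF enforces valid entity spans.
emissions = [
    {"B": 2.0, "I": 0.0, "O": 1.0},
    {"B": 0.0, "I": 2.0, "O": 1.0},
    {"B": 0.0, "I": 0.0, "O": 2.0},
]
transitions = {("O", "I"): -10.0, ("B", "I"): 1.0}
print(viterbi_decode(emissions, transitions))  # → ['B', 'I', 'O']
```

Per-token argmax could emit an "I" right after an "O"; the transition penalty above is what rules such sequences out, which is the usual motivation for putting a CRF on top of a BiLSTM for NER.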
- Subjects :
- Deep learning
Named-entity recognition
Language model
Natural language processing
Artificial intelligence
Global information
lcsh:TA1-2040
lcsh:Engineering (General). Civil engineering (General)
Details
- Language :
- English
- Volume :
- 336
- Database :
- OpenAIRE
- Journal :
- MATEC Web of Conferences
- Accession number :
- edsair.doi.dedup.....657402b1c5dd44dd12134258e1dc8f46