Domain specific BERT representation for Named Entity Recognition of lab protocol
- Publication Year :
- 2020
Abstract
- Supervised models trained to predict properties from representations have been achieving high accuracy on a variety of tasks. For instance, the BERT family works exceptionally well on downstream tasks, from NER tagging to a range of other linguistic tasks. However, the vocabulary used in the medical field contains many tokens found only in that domain, such as the names of diseases, devices, organisms, and medicines, which makes it difficult for the traditional BERT model to create contextualized embeddings. In this paper, we describe a system for named entity tagging based on Bio-BERT. Experimental results show that our model gives substantial improvements over the baseline: it stood fourth runner-up in terms of F1 score, just 2.21 F1 points behind the best system, and first runner-up in terms of recall.
Comment: EMNLP 2020 Workshop; 5 pages
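A minimal sketch of how a Bio-BERT-based tagger of this kind might be set up, using the Hugging Face transformers library; the checkpoint name, tag-set size, and example sentence are illustrative assumptions rather than details taken from the paper, and the classification head shown here would still need to be fine-tuned on the protocol NER data.

```python
# Sketch: BioBERT token classification for protocol NER (assumptions noted below).
# Assumes the Hugging Face `transformers` library and the public
# `dmis-lab/biobert-base-cased-v1.1` checkpoint; num_labels=13 and the
# example sentence are placeholders, not values from the paper.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

checkpoint = "dmis-lab/biobert-base-cased-v1.1"   # assumed BioBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(
    checkpoint, num_labels=13                     # placeholder protocol tag-set size
)

sentence = "Centrifuge the lysate at 14000 rpm for 10 minutes."
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits               # shape: (1, seq_len, num_labels)

pred_tag_ids = logits.argmax(dim=-1).squeeze(0)   # one tag id per wordpiece token
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, tag_id in zip(tokens, pred_tag_ids.tolist()):
    print(token, tag_id)                          # untrained head: ids are arbitrary until fine-tuned
```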
- Subjects :
- Computer Science - Computation and Language
- 68T50
- I.2.7
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2012.11145
- Document Type :
- Working Paper
- Full Text :
- https://doi.org/10.18653/v1/2020.wnut-1.34