
Protected Health Information Recognition by Fine-Tuning a Pre-training Transformer Model

Authors :
Seo Hyun Oh
Min Kang
Youngho Lee
Source :
Healthcare Informatics Research, Vol 28, Iss 1, Pp 16-24 (2022)
Publication Year :
2022
Publisher :
The Korean Society of Medical Informatics, 2022.

Abstract

Objectives: De-identifying protected health information (PHI) in medical documents is important, and a prerequisite to de-identification is the recognition of PHI entity names in clinical documents. This study aimed to compare the performance of three pre-trained models that have recently attracted significant attention and to determine which is most suitable for PHI recognition.

Methods: We compared the PHI recognition performance of deep learning models on the i2b2 2014 dataset. We used three pre-trained models, namely bidirectional encoder representations from transformers (BERT), the robustly optimized BERT pre-training approach (RoBERTa), and XLNet (a model built on Transformer-XL), to detect PHI. The dataset was tokenized, annotated with an inside-outside-beginning (IOB) tagging scheme, and WordPiece-tokenized before being fed into these models. PHI recognition performance was then evaluated for BERT, RoBERTa, and XLNet.

Results: Comparing the PHI recognition performance of the three models, XLNet achieved the best F1-score of 96.29%. In addition, in the per-entity performance evaluation, RoBERTa and XLNet showed a 30% improvement in performance compared to BERT.

Conclusions: Among the pre-trained models used in this study, XLNet exhibited superior performance because its word embeddings are well constructed through the two-stream self-attention mechanism. In addition, RoBERTa and XLNet outperformed BERT, indicating that they are more effective at capturing context.
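The preprocessing step described in the Methods (assigning word-level IOB tags and then aligning them with WordPiece subwords) can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the `toy_wordpiece` splitter below is a stand-in for a real WordPiece tokenizer, and the convention of labeling only the first subword while masking continuations with an ignore index is one common alignment strategy, assumed here for clarity.

```python
def toy_wordpiece(word):
    """Split a word into WordPiece-style pieces (toy rule: 4-char chunks,
    continuations prefixed with '##'). A real model learns its vocabulary."""
    if len(word) <= 4:
        return [word]
    pieces = [word[:4]]
    rest = word[4:]
    while rest:
        pieces.append("##" + rest[:4])
        rest = rest[4:]
    return pieces


def align_labels(words, iob_tags, ignore_index=-100):
    """Tokenize each word into subwords and keep the IOB tag only on the
    first piece; continuation pieces receive ignore_index so a token
    classification loss skips them."""
    tokens, labels = [], []
    for word, tag in zip(words, iob_tags):
        pieces = toy_wordpiece(word)
        tokens.extend(pieces)
        labels.append(tag)
        labels.extend([ignore_index] * (len(pieces) - 1))
    return tokens, labels


# Hypothetical clinical snippet with a PHI name entity (NAME label assumed).
words = ["Patient", "John", "Smith", "admitted"]
tags = ["O", "B-NAME", "I-NAME", "O"]
tokens, labels = align_labels(words, tags)
```

After this alignment, `tokens` and `labels` have equal length and can be mapped to input IDs and a label tensor for fine-tuning any of the three transformer models as a token classifier.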

Details

Language :
English
ISSN :
2093-3681 and 2093-369X
Volume :
28
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Healthcare Informatics Research
Publication Type :
Academic Journal
Accession number :
edsdoj.14c07e9a57674601991bed306a72f3bc
Document Type :
article
Full Text :
https://doi.org/10.4258/hir.2022.28.1.16