
medBERT.de: A comprehensive German BERT model for the medical domain.

Authors:
Bressem, Keno K.
Papaioannou, Jens-Michalis
Grundmann, Paul
Borchert, Florian
Adams, Lisa C.
Liu, Leonhard
Busch, Felix
Xu, Lina
Loyen, Jan P.
Niehues, Stefan M.
Augustin, Moritz
Grosser, Lennart
Makowski, Marcus R.
Aerts, Hugo J.W.L.
Löser, Alexander
Source:
Expert Systems with Applications, Mar 2024, Vol. 237, Part C.
Publication Year:
2024

Abstract

Highlights:
• medBERT.de: a German BERT model tailored for the medical domain.
• Trained on 4.7 million diverse medical documents.
• Achieves new state-of-the-art results on eight medical NLP benchmarks.
• Efficient tokenization plays only a minor role; domain-specific training data accounts for most of the gains.
• Pre-trained model weights and new benchmarks are released publicly for research.

This paper presents medBERT.de, a pre-trained German BERT model specifically designed for the German medical domain. The model has been trained on a large corpus of 4.7 million German medical documents and achieves new state-of-the-art performance on eight different medical benchmarks covering a wide range of disciplines and medical document types. In addition to evaluating the overall performance of the model, this paper also conducts a more in-depth analysis of its capabilities. We investigate the impact of data deduplication on the model's performance, as well as the potential benefits of more efficient tokenization methods. Our results indicate that domain-specific models such as medBERT.de are particularly useful for longer texts, and that deduplication of the training data does not necessarily lead to improved performance. Furthermore, we find that efficient tokenization plays only a minor role in improving model performance and attribute most of the improvement to the large amount of training data. To encourage further research, the pre-trained model weights and new benchmarks based on radiological data are made publicly available to the scientific community. [ABSTRACT FROM AUTHOR]
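
Since the abstract states that the pre-trained weights are released publicly, the following is a minimal sketch of how such a checkpoint could be loaded and probed with the Hugging Face transformers library. The repository identifier GerMedBERT/medbert-512, the example sentence, and all variable names are illustrative assumptions, not details taken from the paper.

# Minimal sketch: loading a released German medical BERT checkpoint with
# Hugging Face transformers. The model id below is a placeholder assumption;
# substitute the identifier actually published by the authors.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

MODEL_ID = "GerMedBERT/medbert-512"  # assumed/placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

# Probe the masked-language-model head with a German clinical sentence.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
sentence = f"Der Patient klagt über {tokenizer.mask_token} im Brustbereich."
for prediction in fill_mask(sentence):
    print(prediction["token_str"], round(prediction["score"], 3))

For the benchmark tasks mentioned in the abstract, such a checkpoint would typically be fine-tuned with a task-specific head (for example, AutoModelForSequenceClassification or AutoModelForTokenClassification) rather than used directly as a masked language model.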

Details

Language:
English
ISSN:
09574174
Volume:
237
Database:
Academic Search Index
Journal:
Expert Systems with Applications
Publication Type:
Academic Journal
Accession Number:
173631531
Full Text:
https://doi.org/10.1016/j.eswa.2023.121598