Medication event extraction in clinical notes: Contribution of the WisPerMed team to the n2c2 2022 challenge.
- Author
- Schäfer H, Idrissi-Yaghir A, Bewersdorff J, Frihat S, Friedrich CM, and Zesch T
- Subjects
- Natural Language Processing, Language
- Abstract
In this work, we describe the findings of the 'WisPerMed' team from their participation in Track 1 (Contextualized Medication Event Extraction) of the n2c2 2022 challenge. We tackle two tasks: (i) medication extraction, which involves extracting all mentions of medications from the clinical notes, and (ii) event classification, which involves classifying the medication mentions according to whether a change in the medication has been discussed. To address the long lengths of clinical texts, which often exceed the maximum token length that models based on the transformer architecture can handle, various approaches are employed, such as ClinicalBERT with a sliding window and Longformer-based models. In addition, domain adaptation through masked language modeling and preprocessing steps such as sentence splitting are used to improve model performance. Since both tasks were treated as named entity recognition (NER) problems, a sanity check was performed in the second release to eliminate possible weaknesses in the medication detection itself. This check used the known medication spans to remove false positive predictions and to assign missed tokens the disposition type with the highest softmax probability. The effectiveness of these approaches is evaluated through multiple submissions to the tasks, as well as through post-challenge results, with a focus on the DeBERTa v3 model and its disentangled attention mechanism. Results show that the DeBERTa v3 model performs well in both the NER task and the event classification task.
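The sliding window handling of notes longer than a transformer's 512-token limit can be sketched with the Hugging Face tokenizers' overflow support. A minimal sketch in Python; the checkpoint name, window size, and stride below are illustrative assumptions, not the team's reported configuration:

    from transformers import AutoTokenizer

    # Illustrative clinical checkpoint; the exact ClinicalBERT variant
    # used by the team may differ.
    tokenizer = AutoTokenizer.from_pretrained("emilyalsentzer/Bio_ClinicalBERT")

    note_text = "Patient was started on metformin 500 mg twice daily."

    # Split the note into overlapping 512-token windows; the 128-token
    # stride repeats context around window boundaries so medication
    # mentions are not cut in half.
    windows = tokenizer(
        note_text,
        max_length=512,
        stride=128,
        truncation=True,
        padding="max_length",
        return_overflowing_tokens=True,
        return_offsets_mapping=True,
        return_tensors="pt",
    )

    # One row per window; offset_mapping maps sub-tokens back to character
    # spans in the note, which is needed to merge per-window predictions.
    print(windows["input_ids"].shape)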
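The sanity check can likewise be read as a post-processing step over token-level logits. A hedged sketch, assuming per-token logits over an "O" label plus the three Track 1 disposition types; all function and variable names are hypothetical:

    import torch

    # "O" plus the Track 1 disposition types (label order is an assumption).
    LABELS = ["O", "NoDisposition", "Disposition", "Undetermined"]

    def sanity_check(predictions, logits, token_spans, medication_spans):
        """Drop predicted events outside any known medication span (false
        positives) and, for in-span tokens predicted as "O" (misses),
        assign the disposition type with the highest softmax probability."""
        probs = torch.softmax(logits, dim=-1)  # (num_tokens, len(LABELS))
        fixed = []
        for i, (start, end) in enumerate(token_spans):
            in_medication = any(s <= start and end <= e
                                for s, e in medication_spans)
            if not in_medication:
                fixed.append("O")                       # remove false positive
            elif predictions[i] == "O":
                best = 1 + int(probs[i, 1:].argmax())   # argmax over non-"O" labels
                fixed.append(LABELS[best])
            else:
                fixed.append(predictions[i])
        return fixed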
- Published
- 2023