
EMERGE: Integrating RAG for Improved Multimodal EHR Predictive Modeling

Authors:
Zhu, Yinghao
Ren, Changyu
Wang, Zixiang
Zheng, Xiaochen
Xie, Shiyun
Feng, Junlan
Zhu, Xi
Li, Zhoujun
Ma, Liantao
Pan, Chengwei
Publication Year:
2024

Abstract

The integration of multimodal Electronic Health Records (EHR) data has notably advanced clinical predictive capabilities. However, current models that utilize clinical notes and multivariate time-series EHR data often lack the medical context needed for precise clinical tasks, and previous approaches employing knowledge graphs (KGs) focus primarily on structured knowledge extraction. To address this, we propose EMERGE, a Retrieval-Augmented Generation (RAG) driven framework aimed at enhancing multimodal EHR predictive modeling. Our approach extracts entities from both time-series data and clinical notes by prompting Large Language Models (LLMs) and aligns them with the professional PrimeKG knowledge graph to ensure consistency. Beyond triplet relationships, we include entities' definitions and descriptions to provide richer semantics. The extracted knowledge is then used to generate task-relevant summaries of patients' health statuses. These summaries are fused with the other modalities through an adaptive multimodal fusion network with cross-attention. Extensive experiments on the MIMIC-III and MIMIC-IV datasets for in-hospital mortality and 30-day readmission tasks demonstrate the superior performance of the EMERGE framework compared to baseline models. Comprehensive ablation studies and analyses underscore the efficacy of each designed module and the framework's robustness to data sparsity. EMERGE significantly enhances the use of multimodal EHR data in healthcare, bridging the gap with the nuanced medical context crucial for informed clinical predictions.

Comment: arXiv admin note: text overlap with arXiv:2402.07016
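The abstract describes fusing RAG-generated patient summaries with time-series representations through an adaptive multimodal fusion network with cross-attention. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, not the authors' implementation; the module names, dimensions, and the gated-residual design are assumptions made only for illustration.

```python
# Hypothetical sketch (not the EMERGE code): cross-attention fusion of a
# time-series encoding with an LLM-generated summary encoding, followed by a
# simple prediction head. Shapes and names are assumptions.
import torch
import torch.nn as nn


class CrossAttentionFusion(nn.Module):
    """Attend from time-series tokens to summary tokens, then pool and predict."""

    def __init__(self, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        # Adaptive gate: learn how much of the attended summary to mix in.
        self.gate = nn.Sequential(nn.Linear(2 * d_model, d_model), nn.Sigmoid())
        self.head = nn.Linear(d_model, 1)  # e.g. in-hospital mortality logit

    def forward(self, ts_emb: torch.Tensor, summary_emb: torch.Tensor) -> torch.Tensor:
        # ts_emb:      (batch, T, d_model) encoded multivariate time series
        # summary_emb: (batch, S, d_model) encoded summary tokens
        attended, _ = self.cross_attn(query=ts_emb, key=summary_emb, value=summary_emb)
        gate = self.gate(torch.cat([ts_emb, attended], dim=-1))
        fused = self.norm(ts_emb + gate * attended)  # gated residual fusion
        pooled = fused.mean(dim=1)                   # mean pooling over time
        return self.head(pooled)                     # prediction logit


# Usage with random tensors standing in for upstream encoder outputs.
model = CrossAttentionFusion()
ts = torch.randn(2, 48, 128)       # 48 time steps per patient
summary = torch.randn(2, 32, 128)  # 32 summary tokens per patient
logits = model(ts, summary)        # shape: (2, 1)
```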

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2406.00036
Document Type:
Working Paper