
CODER: Knowledge infused cross-lingual medical term embedding for term normalization

Authors:
Yuan, Zheng
Zhao, Zhengyun
Sun, Haixia
Li, Jiao
Wang, Fei
Yu, Sheng
Publication Year:
2020

Abstract

This paper proposes CODER: contrastive learning on knowledge graphs for cross-lingual medical term representation. CODER is designed for medical term normalization: it provides close vector representations for different terms that denote the same or similar medical concepts, with cross-lingual support. We train CODER via contrastive learning on a medical knowledge graph (KG), the Unified Medical Language System, where similarities are calculated using both terms and relation triplets from the KG. Training with relations injects medical knowledge into the embeddings and aims to provide better machine learning features. We evaluate CODER on zero-shot term normalization, semantic similarity, and relation classification benchmarks, where it outperforms various state-of-the-art biomedical word embeddings, concept embeddings, and contextual embeddings. Our code and models are available at https://github.com/GanjinZero/CODER.
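The contrastive training described above can be sketched as an InfoNCE-style objective that pulls embeddings of synonymous terms together while pushing apart embeddings of unrelated terms. The following is a minimal NumPy illustration of that idea, not the authors' actual implementation; the function name and toy data are hypothetical.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.07):
    """InfoNCE-style contrastive loss: the positive for each anchor is the
    same-index row of `positives`; all other rows act as in-batch negatives."""
    # L2-normalize so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # matching (anchor, positive) pairs sit on the diagonal
    return -np.mean(np.diag(log_probs))

# Toy check: when positive embeddings already point the same way as their
# anchors (as synonyms should after training), the loss is lower than for
# randomly paired embeddings.
rng = np.random.default_rng(0)
anchors = rng.normal(size=(4, 8))
aligned_loss = info_nce_loss(anchors, anchors + 0.01 * rng.normal(size=(4, 8)))
random_loss = info_nce_loss(anchors, rng.normal(size=(4, 8)))
```

In CODER the positive pairs come from the UMLS knowledge graph (synonymous terms of one concept, and terms linked by relation triplets), which is how relational medical knowledge is injected into the embedding space.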

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2011.02947
Document Type:
Working Paper