
Continual Multimodal Knowledge Graph Construction

Authors:
Chen, Xiang
Zhang, Jintian
Wang, Xiaohan
Zhang, Ningyu
Wu, Tongtong
Wang, Yuxiang
Wang, Yongheng
Chen, Huajun
Publication Year:
2023

Abstract

Current Multimodal Knowledge Graph Construction (MKGC) models struggle with the real-world dynamism of continuously emerging entities and relations, often succumbing to catastrophic forgetting, the loss of previously acquired knowledge. This study introduces benchmarks aimed at fostering the development of the continual MKGC domain. We further introduce the MSPT framework, designed to surmount the shortcomings of existing MKGC approaches during multimedia data processing. MSPT harmonizes the retention of learned knowledge (stability) with the integration of new data (plasticity), outperforming current continual learning and multimodal methods. Our results confirm MSPT's superior performance in evolving knowledge environments, showcasing its capacity to navigate the balance between stability and plasticity.

Comment: IJCAI 2024
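The stability-plasticity trade-off named in the abstract is the central tension of continual learning: updates that absorb new entities and relations tend to overwrite parameters that encode earlier knowledge. The sketch below is a generic rehearsal-based continual learning loop intended only to illustrate that trade-off; it is not the authors' MSPT method, and the model, buffer size, and replay weight are illustrative assumptions.

```python
# Minimal sketch of rehearsal-based continual learning (illustrative only,
# NOT the paper's MSPT framework). Old-task samples are replayed alongside
# new-task batches so that plasticity (new data) does not erase stability
# (previously learned knowledge).
import random
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

replay_buffer = []      # (x, y) batches kept from earlier tasks (stability)
BUFFER_CAPACITY = 64    # assumed bound on how much old data is retained
REPLAY_WEIGHT = 0.5     # assumed weight balancing old vs. new knowledge

def train_task(data, epochs=5):
    """Fit the model on a new task while rehearsing samples from old tasks."""
    for _ in range(epochs):
        for x, y in data:
            loss = criterion(model(x), y)              # plasticity: learn new data
            if replay_buffer:                          # stability: rehearse old data
                rx, ry = random.choice(replay_buffer)
                loss = loss + REPLAY_WEIGHT * criterion(model(rx), ry)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    # keep a bounded sample of this task for future rehearsal
    for batch in data[: max(0, BUFFER_CAPACITY - len(replay_buffer))]:
        replay_buffer.append(batch)

# Two toy "tasks" with disjoint label sets standing in for newly emerging relations.
task_a = [(torch.randn(4, 8), torch.randint(0, 2, (4,))) for _ in range(20)]
task_b = [(torch.randn(4, 8), torch.randint(2, 4, (4,))) for _ in range(20)]
train_task(task_a)
train_task(task_b)   # rehearsal of task_a batches mitigates forgetting
```

Methods in this space differ mainly in how they enforce stability, for example by replaying stored samples as above, regularizing parameter drift, or isolating task-specific parameters; the abstract does not specify which mechanism MSPT uses.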

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2305.08698
Document Type:
Working Paper