
Multi-source Knowledge Enhanced Graph Attention Networks for Multimodal Fact Verification

Authors :
Cao, Han
Wei, Lingwei
Zhou, Wei
Hu, Songlin
Publication Year :
2024

Abstract

Multimodal fact verification is an emerging and under-explored field that has gained increasing attention in recent years. The goal is to assess the veracity of claims involving multiple modalities by analyzing retrieved evidence. The main challenge in this area is to effectively fuse features from different modalities into meaningful multimodal representations. To this end, we propose a novel model named Multi-Source Knowledge-enhanced Graph Attention Network (MultiKE-GAT). MultiKE-GAT introduces external multimodal knowledge from different sources and constructs a heterogeneous graph to capture complex cross-modal and cross-source interactions. A Knowledge-aware Graph Fusion (KGF) module learns knowledge-enhanced representations for each claim and piece of evidence and eliminates the inconsistencies and noise introduced by redundant entities. Experiments on two public benchmark datasets demonstrate that our model outperforms comparable methods, showing the effectiveness and superiority of the proposed model.

Comment: Accepted by ICME 2024
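The record itself includes no code. As a rough illustration of the graph-attention fusion the abstract describes (attending over a graph of claim, evidence, and knowledge-entity nodes), here is a minimal NumPy sketch of a standard single-head graph attention layer; the shapes, node roles, and masking scheme are assumptions for illustration, not details from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(H, A, W, a):
    """One generic graph-attention layer over node features.

    H: (N, F)  node features (e.g. claim, evidence, entity nodes -- assumed roles)
    A: (N, N)  adjacency matrix with self-loops (1 = edge, 0 = no edge)
    W: (F, Fp) shared linear projection
    a: (2*Fp,) attention vector split into source/target halves
    """
    Z = H @ W                            # project node features: (N, Fp)
    Fp = Z.shape[1]
    src = Z @ a[:Fp]                     # per-node source logit contribution
    dst = Z @ a[Fp:]                     # per-node target logit contribution
    e = src[:, None] + dst[None, :]      # pairwise logits e_ij
    e = np.where(e > 0, e, 0.2 * e)      # LeakyReLU (slope 0.2)
    e = np.where(A > 0, e, -1e9)         # mask node pairs with no edge
    alpha = softmax(e, axis=1)           # attention over each node's neighbours
    return alpha @ Z                     # attention-weighted aggregation: (N, Fp)

# Tiny hypothetical graph: 1 claim node + 2 evidence nodes + 1 entity node.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 8))
A = np.ones((4, 4))                      # fully connected for simplicity
W = rng.normal(size=(8, 6))
a = rng.normal(size=(12,))
fused = graph_attention(H, A, W, a)      # knowledge-enhanced node representations
```

The actual MultiKE-GAT model is heterogeneous (edge types differ by modality and knowledge source) and includes a denoising mechanism for redundant entities; this sketch only conveys the underlying attention-based neighbourhood aggregation.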

Subjects :
Computer Science - Multimedia

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2407.10474
Document Type :
Working Paper