
Recurrent Transformer for Dynamic Graph Representation Learning with Edge Temporal States

Authors:
Hu, Shengxiang
Zou, Guobing
Lin, Shiyi
Wu, Liangrui
Zhou, Chenyang
Zhang, Bofeng
Chen, Yixin
Publication Year:
2023
Publisher:
arXiv, 2023.

Abstract

Dynamic graph representation learning has become a trending yet challenging research task owing to the widespread demand for graph data analysis in real-world applications. Despite the encouraging performance of many recent works built upon recurrent neural networks (RNNs) and graph neural networks (GNNs), they fail to explicitly model the impact of edge temporal states on node features across time slices. In addition, they struggle to extract global structural features because of the inherent over-smoothing drawback of GNNs, which further restricts performance. In this paper, we propose a recurrent difference graph transformer (RDGT) framework, which first assigns each edge in every snapshot a type and weight to explicitly represent its temporal state, and then employs a structure-reinforced graph transformer to capture temporal node representations through a recurrent learning paradigm. Experimental results on four real-world datasets demonstrate the superiority of RDGT for discrete dynamic graph representation learning, as it consistently outperforms competing methods on dynamic link prediction tasks.
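The pipeline sketched in the abstract, per-snapshot edge temporal states biasing a transformer-style node encoder, with snapshot outputs chained by a recurrent unit, can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the module names (SnapshotEncoder, RecurrentDynamicEncoder), the per-head edge-type attention bias, and the GRU-based recurrence are assumptions introduced only to show the general shape of such an architecture.

    # Minimal sketch of the idea described in the abstract, NOT the RDGT code:
    # edge temporal states (type + weight) bias attention within each snapshot,
    # and a GRU carries node representations across snapshots.
    import torch
    import torch.nn as nn

    class SnapshotEncoder(nn.Module):
        """Encodes one snapshot with edge-state-biased multi-head attention (assumed design)."""
        def __init__(self, dim, num_edge_types, num_heads=4):
            super().__init__()
            # one additive attention bias per head for each edge temporal-state type
            self.edge_type_emb = nn.Embedding(num_edge_types, num_heads)
            self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
            self.ffn = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
            self.norm1 = nn.LayerNorm(dim)
            self.norm2 = nn.LayerNorm(dim)

        def forward(self, x, edge_type, edge_weight):
            # x: (N, dim) node features; edge_type: (N, N) ints; edge_weight: (N, N) floats
            bias = self.edge_type_emb(edge_type)          # (N, N, H)
            bias = bias * edge_weight.unsqueeze(-1)       # scale bias by temporal-state weight
            bias = bias.permute(2, 0, 1)                  # (H, N, N) additive attention mask
            h, _ = self.attn(x.unsqueeze(0), x.unsqueeze(0), x.unsqueeze(0),
                             attn_mask=bias)              # (1, N, dim)
            h = self.norm1(x + h.squeeze(0))
            return self.norm2(h + self.ffn(h))

    class RecurrentDynamicEncoder(nn.Module):
        """Chains snapshot encodings with a GRU cell (recurrent learning paradigm)."""
        def __init__(self, dim, num_edge_types):
            super().__init__()
            self.snapshot = SnapshotEncoder(dim, num_edge_types)
            self.gru = nn.GRUCell(dim, dim)

        def forward(self, snapshots):
            # snapshots: list of (x, edge_type, edge_weight) tuples, one per time slice
            h = None
            for x, et, ew in snapshots:
                z = self.snapshot(x, et, ew)
                h = self.gru(z, h if h is not None else torch.zeros_like(z))
            return h  # final temporal node representations, e.g. scored pairwise for link prediction

For dynamic link prediction, the returned node states would typically be scored pairwise (e.g. via a dot product or an MLP over node pairs) against edges in the next snapshot; that readout is likewise an assumption here, not taken from the paper.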

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....853cbead787d5765577156d535f3454a
Full Text:
https://doi.org/10.48550/arxiv.2304.10079