
HA-GNN: a novel graph neural network based on hyperbolic attention.

Authors :
Qu, Hongbo
Song, Yu-Rong
Zhang, Minglei
Jiang, Guo-Ping
Li, Ruqi
Song, Bo
Source :
Neural Computing & Applications. Apr2024, p1-16.
Publication Year :
2024

Abstract

Graph neural networks (GNNs) are powerful tools for data mining on graph-structured data in various domains, such as social science, finance, and biology. However, most existing GNNs operate in Euclidean space and may fail to preserve intrinsic network properties, such as self-similarity and hierarchy, that characterize many real-world graphs. Hyperbolic graph neural networks (HGNNs) address this limitation by embedding graphs into hyperbolic space, which better captures the hierarchical structure of networks. Yet HGNNs often involve complex computations in hyperbolic space or its tangent space during training, which can hinder their efficiency. In this paper, we propose hyperbolic attention graph neural networks (HA-GNN), which leverage both network structure and node features for efficient graph representation learning. Specifically, we design a structural-properties attention mechanism that measures the structural connection between nodes based on their hyperbolic embeddings. We also design a node-features attention mechanism that quantifies the feature similarity between nodes. We then combine these two attentions into a hyperbolic attention that weights the relevance of each pair of connected nodes. We conduct extensive experiments on five real-world networks and demonstrate that our model consistently and significantly outperforms other state-of-the-art methods. For example, on the Cora network, our model achieves an accuracy of 83.1 (± 0.4) on node classification, which is 1.6% higher than the best baseline method in Euclidean space. [ABSTRACT FROM AUTHOR]
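The abstract describes combining a structure-based attention (derived from hyperbolic embedding distances) with a feature-similarity attention. The paper's exact formulation is not given here, so the following is only a minimal sketch of that idea: it assumes Poincaré-ball embeddings, scores structural closeness by negative hyperbolic distance, scores feature similarity by cosine similarity, and mixes the two with a hypothetical weight `alpha` before a softmax over a node's neighbors. All function names and the mixing scheme are illustrative assumptions, not the authors' method.

```python
import numpy as np

def poincare_dist(u, v, eps=1e-9):
    # Geodesic distance in the Poincare ball model of hyperbolic space:
    # d(u, v) = arcosh(1 + 2||u-v||^2 / ((1-||u||^2)(1-||v||^2)))
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u * u)) * (1.0 - np.sum(v * v)) + eps
    return np.arccosh(1.0 + 2.0 * sq / denom)

def hyperbolic_attention(h_struct, x_feat, neighbors, i, alpha=0.5):
    """Toy combined attention weights for node i over its neighbors.

    h_struct: hyperbolic (Poincare-ball) node embeddings, norms < 1
    x_feat:   Euclidean node feature vectors
    alpha:    hypothetical trade-off between structure and features
    """
    scores = []
    for j in neighbors:
        # Structural score: closer in hyperbolic space => larger score.
        s_struct = -poincare_dist(h_struct[i], h_struct[j])
        # Feature score: cosine similarity of node features.
        s_feat = np.dot(x_feat[i], x_feat[j]) / (
            np.linalg.norm(x_feat[i]) * np.linalg.norm(x_feat[j]) + 1e-9)
        scores.append(alpha * s_struct + (1.0 - alpha) * s_feat)
    # Softmax over neighbors yields normalized attention weights.
    e = np.exp(np.array(scores) - np.max(scores))
    return e / e.sum()
```

A softmax over only the connected neighbors keeps the aggregation sparse, which is consistent with the efficiency claim in the abstract, though the paper may normalize differently.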

Details

Language :
English
ISSN :
0941-0643
Database :
Academic Search Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
176714461
Full Text :
https://doi.org/10.1007/s00521-024-09689-9