Tensorized Hypergraph Neural Networks

Authors :
Wang, Maolin
Zhen, Yaoming
Pan, Yu
Zhao, Yao
Zhuang, Chenyi
Xu, Zenglin
Guo, Ruocheng
Zhao, Xiangyu
Publication Year :
2023

Abstract

Hypergraph neural networks (HGNNs) have recently attracted significant attention due to their excellent performance in various domains. However, most existing HGNNs rely on first-order approximations of hypergraph connectivity patterns, which ignore important high-order information. To address this issue, we propose a novel adjacency-tensor-based Tensorized Hypergraph Neural Network (THNN). THNN is a faithful hypergraph modeling framework built on high-order outer-product feature message passing, and it is a natural tensor extension of adjacency-matrix-based graph neural networks. The proposed THNN is equivalent to a high-order polynomial regression scheme, which enables it to efficiently extract high-order information from uniform hypergraphs. Moreover, because directly processing high-order outer-product features has exponential complexity, we propose a partially symmetric CP decomposition approach that reduces the model complexity to linear. Additionally, we propose two simple yet effective extensions of our method for the non-uniform hypergraphs commonly found in real-world applications. Results from experiments on two widely used hypergraph datasets for 3-D visual object classification show the model's promising performance.
Comment: SIAM International Conference on Data Mining (SDM24)
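
The abstract only sketches how the partially symmetric CP decomposition avoids the exponential cost of the outer-product features. Below is a minimal Python/NumPy sketch, not the authors' code, illustrating the underlying identity: contracting a rank-R symmetric CP weight tensor with the outer product of the features of a hyperedge's nodes reduces to a product of scalar projections. All names (cp_hyperedge_message, W, a, hyperedges) and the final aggregation step are illustrative assumptions.

import numpy as np

def cp_hyperedge_message(X, hyperedges, W, a):
    """X: (n, d) node features; hyperedges: list of k-tuples of node indices
    (k-uniform hypergraph); W: (R, d) CP factor vectors; a: (R,) CP weights.
    Computes, per hyperedge e, sum_r a_r * prod_{v in e} <w_r, x_v>,
    i.e. the contraction of the CP tensor with x_{v1} (x) ... (x) x_{vk},
    without ever forming the d^k outer product."""
    n = X.shape[0]
    out = np.zeros(n)
    proj = X @ W.T                      # (n, R): <w_r, x_v> for every node and factor
    for edge in hyperedges:
        edge_msg = (a * np.prod(proj[list(edge), :], axis=0)).sum()
        for v in edge:                  # illustrative choice: broadcast back to members
            out[v] += edge_msg
    return out

# toy usage: 5 nodes with 8-dim features, a 3-uniform hypergraph, rank-4 CP weights
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
W = rng.normal(size=(4, 8))
a = rng.normal(size=4)
print(cp_hyperedge_message(X, [(0, 1, 2), (2, 3, 4)], W, a))

The cost per hyperedge is O(kR) rather than O(d^k), which is the linear-complexity effect the abstract refers to.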

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2306.02560
Document Type :
Working Paper