Improving Expressivity of GNNs with Subgraph-specific Factor Embedded Normalization

Authors :
Chen, Kaixuan
Liu, Shunyu
Zhu, Tongtian
Zheng, Tongya
Zhang, Haofei
Feng, Zunlei
Ye, Jingwen
Song, Mingli
Publication Year :
2023

Abstract

Graph Neural Networks (GNNs) have emerged as a powerful category of learning architecture for handling graph-structured data. However, existing GNNs typically ignore crucial structural characteristics in node-induced subgraphs, which limits their expressiveness for various downstream tasks. In this paper, we strive to strengthen the representational capabilities of GNNs by devising a dedicated plug-and-play normalization scheme, termed SUbgraph-sPEcific FactoR Embedded Normalization (SuperNorm), that explicitly considers the intra-connection information within each node-induced subgraph. To this end, we embed the subgraph-specific factor at the beginning and the end of the standard BatchNorm, and incorporate graph instance-specific statistics for improved distinguishing capability. We further provide theoretical analysis showing that, with the elaborated SuperNorm, an arbitrary GNN is at least as powerful as the 1-WL test in distinguishing non-isomorphic graphs. The proposed SuperNorm scheme is also demonstrated to alleviate the over-smoothing phenomenon. Experimental results on graph, node, and link property prediction across eight popular datasets demonstrate the effectiveness of the proposed method. The code is available at https://github.com/chenchkx/SuperNorm.

Comment: 13 pages, 7 figures
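To make the normalization idea concrete, below is a minimal NumPy sketch of the pattern the abstract describes: a node-wise factor applied before and after a BatchNorm-style standardization. The factor here is a simple function of node degree, chosen purely for illustration; the paper's actual subgraph-specific factor is derived from intra-connection information of each node-induced subgraph, so both `supernorm_sketch` and its degree-based factor are assumptions, not the authors' formulation.

```python
import numpy as np

def supernorm_sketch(x, degrees, eps=1e-5):
    """Illustrative SuperNorm-style step (not the paper's exact method):
    scale node features by a per-node structural factor, standardize with
    batch statistics as in BatchNorm, then re-embed the factor."""
    # Hypothetical subgraph-specific factor: degree normalized by the
    # maximum degree in the batch (assumption for illustration only).
    factor = (degrees / (degrees.max() + eps)).reshape(-1, 1)
    h = x * factor                       # embed factor before normalization
    mu = h.mean(axis=0, keepdims=True)   # per-feature batch mean
    sigma = h.std(axis=0, keepdims=True) # per-feature batch std
    h = (h - mu) / (sigma + eps)         # standard BatchNorm core
    return h * factor                    # embed factor after normalization

# Toy usage: 4 nodes with 3-dimensional features and their degrees
x = np.arange(12, dtype=float).reshape(4, 3)
deg = np.array([1.0, 2.0, 3.0, 2.0])
out = supernorm_sketch(x, deg)
print(out.shape)  # (4, 3)
```

Because the factor multiplies the features both before and after standardization, nodes with different structural roles receive distinct affine treatments, which is the intuition behind the claimed gain in distinguishing capability.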

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2305.19903
Document Type :
Working Paper