
Revisiting Heterophily For Graph Neural Networks

Authors:
Luan, Sitao
Hua, Chenqing
Lu, Qincheng
Zhu, Jiaqi
Zhao, Mingde
Zhang, Shuyuan
Chang, Xiao-Wen
Precup, Doina
Publication Year: 2022

Abstract

Graph Neural Networks (GNNs) extend basic Neural Networks (NNs) by using graph structures based on the relational inductive bias (homophily assumption). While GNNs are commonly believed to outperform NNs in real-world tasks, recent work has identified a non-trivial set of datasets on which their performance relative to NNs is unsatisfactory. Heterophily has been considered the main cause of this empirical observation, and numerous works have been put forward to address it. In this paper, we first revisit the widely used homophily metrics and point out that considering only graph-label consistency is a shortcoming. Then, we study heterophily from the perspective of post-aggregation node similarity and define new homophily metrics, which are potentially advantageous compared to existing ones. Based on this investigation, we prove that some harmful cases of heterophily can be effectively addressed by a local diversification operation. We then propose Adaptive Channel Mixing (ACM), a framework that adaptively exploits aggregation, diversification, and identity channels node-wise to extract richer localized information for diverse node heterophily situations. ACM is more powerful than the commonly used uni-channel framework for node classification tasks on heterophilic graphs and is easy to implement in baseline GNN layers. When evaluated on 10 benchmark node classification tasks, ACM-augmented baselines consistently achieve significant performance gains, exceeding state-of-the-art GNNs on most tasks without incurring a significant computational burden.

Comment: Published at the 36th Conference on Neural Information Processing Systems (NeurIPS 2022). arXiv admin note: substantial text overlap with arXiv:2109.05641
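As a rough illustration of the three-channel idea described in the abstract, the following is a minimal PyTorch sketch of an ACM-style layer, not the authors' reference implementation. The class name ACMLayer, the gating design (a per-channel linear score followed by a node-wise softmax), and the use of a dense normalized adjacency a_hat are assumptions made for brevity; see arXiv:2210.07606 for the actual method.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ACMLayer(nn.Module):
    """Sketch of an Adaptive Channel Mixing (ACM) style layer.

    Three channels are computed from the node features x:
      aggregation (low-pass):      a_hat @ x @ W_L
      diversification (high-pass): (I - a_hat) @ x @ W_H
      identity:                    x @ W_I
    and mixed per node with learned softmax weights.
    """

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.w_low = nn.Linear(in_dim, out_dim)   # aggregation channel
        self.w_high = nn.Linear(in_dim, out_dim)  # diversification channel
        self.w_id = nn.Linear(in_dim, out_dim)    # identity channel
        # One mixing score per channel per node, derived from each
        # channel's own output (this gating design is an assumption).
        self.gates = nn.ModuleList([nn.Linear(out_dim, 1) for _ in range(3)])

    def forward(self, x, a_hat):
        # a_hat: normalized adjacency, dense (n, n) here for simplicity;
        # a sparse matmul would be used on real graphs.
        h_low = F.relu(a_hat @ self.w_low(x))   # smooth over neighbors
        z = self.w_high(x)
        h_high = F.relu(z - a_hat @ z)          # (I - a_hat) @ z, high-pass
        h_id = F.relu(self.w_id(x))             # keep the node's own signal
        # Node-wise convex combination of the three channels.
        scores = torch.cat(
            [g(h) for g, h in zip(self.gates, (h_low, h_high, h_id))], dim=-1
        )
        alpha = torch.softmax(scores, dim=-1)   # shape (n, 3)
        return (alpha[:, 0:1] * h_low
                + alpha[:, 1:2] * h_high
                + alpha[:, 2:3] * h_id)

Under these assumptions, layer = ACMLayer(in_dim=1433, out_dim=64) followed by layer(x, a_hat) mixes the three filtered views of x per node; stacking such layers and ending with a linear classifier would give a node-classification model.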

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2210.07606
Document Type: Working Paper