
OrthoReg: Improving Graph-regularized MLPs via Orthogonality Regularization

Authors:
Zhang, Hengrui
Wang, Shen
Ioannidis, Vassilis N.
Adeshina, Soji
Zhang, Jiani
Qin, Xiao
Faloutsos, Christos
Zheng, Da
Karypis, George
Yu, Philip S.
Publication Year:
2023

Abstract

Graph Neural Networks (GNNs) currently dominate the modeling of graph-structured data, but their heavy reliance on the graph structure at inference time significantly impedes their widespread application. By contrast, Graph-regularized MLPs (GR-MLPs) implicitly inject graph structure information into the model weights, yet their performance can hardly match that of GNNs on most tasks. This motivates us to study the causes of the limited performance of GR-MLPs. In this paper, we first demonstrate, through empirical observations and theoretical analysis, that node embeddings learned by conventional GR-MLPs suffer from dimensional collapse, a phenomenon in which a few largest eigenvalues dominate the embedding space. As a result, the expressive power of the learned node representations is constrained. We further propose OrthoReg, a novel GR-MLP model that mitigates the dimensional collapse issue. Through a soft regularization loss on the correlation matrix of node embeddings, OrthoReg explicitly encourages orthogonal node representations and thus naturally avoids dimensionally collapsed representations. Experiments on traditional transductive semi-supervised node classification tasks and on inductive node classification in cold-start scenarios demonstrate its effectiveness and superiority.
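The abstract only names the regularizer, so the following is a minimal PyTorch sketch of one plausible reading: a soft penalty that pushes the empirical correlation matrix of the node embeddings toward the identity, which decorrelates embedding dimensions and so discourages dimensional collapse. The function name, the 1e-8 stabilizer, and the loss weighting below are illustrative assumptions, not the authors' implementation.

import torch

def orthogonality_regularizer(z: torch.Tensor) -> torch.Tensor:
    # z: [num_nodes, dim] node embeddings produced by the MLP.
    # Standardize each dimension so (z.T @ z) / n is a correlation matrix.
    z = (z - z.mean(dim=0)) / (z.std(dim=0) + 1e-8)
    n, d = z.shape
    corr = (z.T @ z) / n                      # [dim, dim] empirical correlation matrix
    identity = torch.eye(d, device=z.device)
    # Soft penalty: off-diagonal terms decorrelate dimensions, so no small
    # group of directions (eigenvalues) can dominate the embedding space.
    return ((corr - identity) ** 2).sum()

# Hypothetical GR-MLP training objective with the penalty attached
# (alpha and beta are assumed loss weights, not values from the paper):
# total_loss = task_loss + alpha * graph_smoothness_loss + beta * orthogonality_regularizer(z)

In a GR-MLP, such a term would be added alongside the usual supervised loss and the graph-regularization (smoothness) term, so training trades classification accuracy against decorrelated, full-rank embeddings.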

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2302.00109
Document Type:
Working Paper