
DeeperGCN: All You Need to Train Deeper GCNs

Authors:
Li, Guohao
Xiong, Chenxin
Thabet, Ali
Ghanem, Bernard
Publication Year:
2020

Abstract

Graph Convolutional Networks (GCNs) have drawn significant attention for their power of representation learning on graphs. Unlike Convolutional Neural Networks (CNNs), which can take advantage of stacking very deep layers, GCNs suffer from vanishing gradients, over-smoothing, and over-fitting when made deeper. These challenges limit the representation power of GCNs on large-scale graphs. This paper proposes DeeperGCN, which is capable of successfully and reliably training very deep GCNs. We define differentiable generalized aggregation functions that unify different message aggregation operations (e.g. mean, max). We also propose a novel normalization layer, MsgNorm, and a pre-activation version of residual connections for GCNs. Extensive experiments on the Open Graph Benchmark (OGB) show that DeeperGCN significantly boosts performance over the state of the art on the large-scale graph learning tasks of node property prediction and graph property prediction. Please visit https://www.deepgcns.org for more information.

Comment: This work is still in progress; more results will be added in future versions. Project website: https://www.deepgcns.org
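As context for the two components named in the abstract, the following is a minimal PyTorch sketch of a softmax-based generalized aggregation and a MsgNorm-style message normalization, assuming the formulations described in the paper: a channel-wise softmax over a node's neighbor messages controlled by an inverse-temperature beta, and a message norm that rescales the unit-normalized aggregated message by the node feature's L2 norm and a scalar s (learnable in the paper). The function names and per-node tensor layout here are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn.functional as F

def softmax_aggregate(messages: torch.Tensor, beta: float = 1.0) -> torch.Tensor:
    """Generalized aggregation: channel-wise softmax over neighbor messages.

    messages: (num_neighbors, dim) tensor of messages for a single node.
    beta = 0 gives uniform weights (mean aggregation); as beta grows,
    the weighted sum approaches max aggregation.
    """
    weights = torch.softmax(beta * messages, dim=0)  # softmax across neighbors
    return (weights * messages).sum(dim=0)

def msg_norm(h: torch.Tensor, m: torch.Tensor, s: float = 1.0) -> torch.Tensor:
    """MsgNorm-style combination of a node feature h and aggregated message m.

    Normalizes m to unit L2 norm, rescales it by ||h||_2 and the scalar s,
    then adds it to h (the paper feeds the result into an MLP).
    """
    m_hat = F.normalize(m, p=2, dim=-1)
    return h + s * h.norm(p=2, dim=-1, keepdim=True) * m_hat

# Tiny usage example with random tensors (hypothetical shapes).
msgs = torch.randn(5, 16)   # 5 neighbor messages, 16 channels
h = torch.randn(16)         # node feature
m = softmax_aggregate(msgs, beta=1.0)
out = msg_norm(h, m, s=1.0)
```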

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2006.07739
Document Type:
Working Paper