
Enhancing Graph Neural Networks by a High-quality Aggregation of Beneficial Information.

Authors :
Liu, Chuang
Wu, Jia
Liu, Weiwei
Hu, Wenbin
Source :
Neural Networks. Oct2021, Vol. 142, p20-33. 14p.
Publication Year :
2021

Abstract

Graph Neural Networks (GNNs), such as GCN, GraphSAGE, GAT, and SGC, have achieved state-of-the-art performance on a wide range of graph-based tasks. These models all use a technique called neighborhood aggregation, in which the embedding of each node is updated by aggregating the embeddings of its neighbors. However, not all information aggregated from neighbors is beneficial; in some cases, a portion of the neighbor information may be harmful to the downstream task. To achieve a high-quality aggregation of beneficial information, we propose a flexible method, EGAI (Enhancing Graph neural networks by a high-quality Aggregation of beneficial Information). The core idea is to filter out redundant and harmful information by removing specific edges during each training epoch. The practical and theoretical motivations, considerations, and strategies behind this method are discussed in detail. EGAI is a general method that can be combined with many backbone models (e.g., GCN, GraphSAGE, GAT, and SGC) to enhance their performance on the node classification task. In addition, EGAI slows the convergence to over-smoothing that occurs when models are deepened. Extensive experiments on three real-world networks demonstrate that EGAI improves performance for both shallow and deep GNN models and, to some extent, mitigates over-smoothing. The code is available at https://github.com/liucoo/egai. [ABSTRACT FROM AUTHOR]
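The combination of mean neighborhood aggregation with per-epoch edge removal described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's actual algorithm: EGAI's criterion for selecting which edges to remove is not reproduced here, so the sketch substitutes a simple random, DropEdge-style mask (`keep_prob` and the function name are illustrative assumptions).

```python
import numpy as np

def aggregate_with_edge_removal(adj, features, keep_prob=0.8, rng=None):
    """One round of mean neighborhood aggregation in which each edge is
    independently kept with probability keep_prob.

    NOTE: random masking is a stand-in for EGAI's edge-selection strategy,
    which targets redundant/harmful edges rather than dropping uniformly.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    n = adj.shape[0]
    # Sample a symmetric edge mask so the filtered graph stays undirected.
    mask = rng.random(adj.shape) < keep_prob
    mask = np.triu(mask, 1)
    mask = mask | mask.T
    a = adj * mask
    # Add self-loops so every node retains its own embedding.
    a = a + np.eye(n)
    # Row-normalize: each node averages over its surviving neighborhood.
    deg = a.sum(axis=1, keepdims=True)
    return (a / deg) @ features

# Toy undirected graph: node 0 is connected to nodes 1 and 2.
adj = np.array([[0., 1., 1.],
                [1., 0., 0.],
                [1., 0., 0.]])
feats = np.array([[1.], [2.], [3.]])

# With keep_prob=1.0 no edges are removed: plain mean aggregation.
out = aggregate_with_edge_removal(adj, feats, keep_prob=1.0)
```

With `keep_prob=1.0`, node 0 averages over {0, 1, 2} and gets 2.0, while nodes 1 and 2 average over themselves plus node 0. Re-sampling the mask each call mirrors the per-epoch edge removal the abstract describes.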

Details

Language :
English
ISSN :
0893-6080
Volume :
142
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
152163069
Full Text :
https://doi.org/10.1016/j.neunet.2021.04.025