
Flattened Graph Convolutional Networks For Recommendation

Authors :
Xu, Yue
Chen, Hao
Deng, Zengde
Bei, Yuanchen
Huang, Feiran
Source :
DLP-KDD 2022
Publication Year :
2022

Abstract

Graph Convolutional Networks (GCNs) and their variants have achieved significant performance on various recommendation tasks. However, many existing GCN models perform recursive aggregations over all related nodes, which incurs a severe computational burden and hinders their application to large-scale recommendation tasks. To this end, this paper proposes the flattened GCN (FlatGCN) model, which achieves superior performance with remarkably lower complexity than existing models. Our main contribution is three-fold. First, we propose a simplified yet powerful GCN architecture that aggregates neighborhood information using a single flattened GCN layer instead of recursive aggregation. The aggregation step in FlatGCN is parameter-free, so it can be pre-computed in parallel to save memory and computational cost. Second, we propose an informative neighbor-infomax sampling method that selects the most valuable neighbors by measuring the correlation among neighboring nodes with a principled metric. Third, we propose a layer ensemble technique that improves the expressiveness of the learned representations by assembling the layer-wise neighborhood representations at the final layer. Extensive experiments on three datasets verify that our proposed model outperforms existing GCN models considerably and yields up to a few orders of magnitude speedup in training efficiency.

Comment: arXiv admin note: text overlap with arXiv:2006.04164
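The abstract describes three components: a parameter-free flattened aggregation that can be pre-computed before training, a neighbor-selection step, and a layer ensemble over layer-wise representations. The sketch below is a minimal illustration of that pipeline, not the paper's method: it uses NumPy with plain mean aggregation, a crude top-k-by-degree neighbor cut-off as a stand-in for the neighbor-infomax metric, and uniform ensemble weights; all function names and parameters here are illustrative assumptions.

```python
import numpy as np

def precompute_layer_representations(adj, features, num_layers=2, top_k=None):
    """Parameter-free neighborhood aggregation, computed once before training.

    adj:      (N, N) binary adjacency matrix
    features: (N, d) initial node embeddings
    Returns [H_0, H_1, ..., H_L], where H_l averages features over l-hop
    neighborhoods.  Hypothetical stand-in for FlatGCN's pre-computation.
    """
    if top_k is not None:
        # Keep only the top_k highest-degree neighbors per node
        # (a rough placeholder for the paper's neighbor-infomax sampling).
        degrees = adj.sum(axis=0)
        pruned = np.zeros_like(adj)
        for i in range(adj.shape[0]):
            nbrs = np.nonzero(adj[i])[0]
            keep = nbrs[np.argsort(-degrees[nbrs])[:top_k]]
            pruned[i, keep] = 1.0
        adj = pruned

    # Row-normalise so each aggregation step is a mean over neighbors.
    row_sum = adj.sum(axis=1, keepdims=True)
    norm_adj = adj / np.maximum(row_sum, 1.0)

    layers = [features]
    h = features
    for _ in range(num_layers):
        h = norm_adj @ h          # one flattened, parameter-free aggregation
        layers.append(h)
    return layers

def layer_ensemble(layers, weights=None):
    """Assemble layer-wise representations into final node embeddings."""
    if weights is None:
        weights = np.ones(len(layers)) / len(layers)
    return sum(w * h for w, h in zip(weights, layers))

# Toy usage: 4 nodes with 3-dimensional features.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 0, 1],
                [1, 0, 0, 1],
                [0, 1, 1, 0]], dtype=float)
feats = np.random.rand(4, 3)
layer_reps = precompute_layer_representations(adj, feats, num_layers=2, top_k=2)
final_emb = layer_ensemble(layer_reps)   # would feed a downstream predictor
print(final_emb.shape)  # (4, 3)
```

Because the aggregation involves no learnable parameters, the `layer_reps` tensors can be computed once (and in parallel across nodes) before any training begins, which is the source of the efficiency gains the abstract claims.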

Details

Database :
arXiv
Journal :
DLP-KDD 2022
Publication Type :
Report
Accession number :
edsarx.2210.07769
Document Type :
Working Paper