
HpLapGCN: Hypergraph p-Laplacian graph convolutional networks.

Authors :
Fu, Sichao
Liu, Weifeng
Zhou, Yicong
Nie, Liqiang
Source :
Neurocomputing. Oct2019, Vol. 362, p166-174. 9p.
Publication Year :
2019

Abstract

Currently, graph representation learning has been proved to be a significant technique for extracting features from graph-structured data. In recent years, many graph representation learning (GRL) algorithms, such as Laplacian Eigenmaps (LE), Node2vec and graph convolutional networks (GCN), have been reported and have achieved great success on node classification tasks. The most representative, GCN, fuses the feature information and structure information of data, aiming to generalize convolutional neural networks (CNN) to learn features from data with arbitrary structure. However, how to exactly express the structure information of data is still an enormous challenge. In this paper, we utilize the hypergraph p-Laplacian to preserve the local geometry of samples and then propose an effective variant of GCN, i.e., hypergraph p-Laplacian graph convolutional networks (HpLapGCN). Since the hypergraph p-Laplacian is a generalization of the graph Laplacian, the HpLapGCN model shows great potential to learn more representative data features. In particular, we simplify and deduce a first-order approximation of spectral hypergraph p-Laplacian convolutions. Thus, we obtain a more efficient layer-wise aggregation rule. Extensive experimental results on the Citeseer and Cora datasets show that our proposed model achieves better performance compared with GCN and p-Laplacian GCN (pLapGCN). [ABSTRACT FROM AUTHOR]
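The abstract describes a GCN-style layer-wise aggregation rule whose propagation matrix is derived from a hypergraph Laplacian. The sketch below is not the authors' code or their p-Laplacian formulation (the exact first-order rule is not given in the abstract); it only illustrates, under the standard normalized hypergraph Laplacian of Zhou et al. (2006), how such a propagation matrix can be built from a hypergraph incidence matrix and plugged into a GCN-like layer. All function names and the toy data are illustrative assumptions.

```python
# Minimal sketch: GCN-style aggregation with a hypergraph-Laplacian
# propagation matrix. NOT the HpLapGCN rule from the paper; the paper
# generalizes this to a p-Laplacian, whose exact form the abstract omits.
import numpy as np

def hypergraph_propagation_matrix(H, edge_weights=None):
    """Theta = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} for an incidence
    matrix H (n_nodes x n_edges). The normalized hypergraph Laplacian
    is then I - Theta."""
    n_nodes, n_edges = H.shape
    w = np.ones(n_edges) if edge_weights is None else np.asarray(edge_weights)
    Dv = H @ w                       # weighted node degrees
    De = H.sum(axis=0)               # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(Dv, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(De, 1e-12))
    W = np.diag(w)
    return Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt

def gcn_layer(X, Theta, W_layer, activation=np.tanh):
    """One layer-wise aggregation step: X' = sigma(Theta X W)."""
    return activation(Theta @ X @ W_layer)

# Toy example: 4 nodes grouped by 2 hyperedges.
H = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 1]], dtype=float)
X = np.random.randn(4, 3)            # node features
W1 = np.random.randn(3, 2)           # layer weights (random, untrained)
Theta = hypergraph_propagation_matrix(H)
print(gcn_layer(X, Theta, W1).shape)  # -> (4, 2)
```

In an actual model the layer weights would be trained end-to-end for node classification, and the propagation matrix would follow the paper's simplified first-order hypergraph p-Laplacian approximation rather than the ordinary (p = 2) case shown here.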

Details

Language :
English
ISSN :
0925-2312
Volume :
362
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
138179335
Full Text :
https://doi.org/10.1016/j.neucom.2019.06.068