1. Stability and generalization of graph convolutional networks in eigen-domains.
- Author
- Ng, Michael K. and Yip, Andy
- Subjects
- *SUPERVISED learning, *GENERALIZATION, *MACHINE learning, *LEARNING problems, *CONVOLUTIONAL neural networks
- Abstract
Graph Convolutional Networks (GCNs) have been shown to be very effective in utilizing pair-wise relationships across samples, and they have been successfully applied to various machine learning problems in practice. In many applications, the construction of GCNs involves more than one layer; however, analyses of their generalization and stability are limited. The main aim of this paper is to analyze GCNs with two layers. The formulation is based on transductive semi-supervised learning, and the filtering is done in the eigen-domain. We show the uniform stability of the neural network and the convergence of the generalization gap to zero. The analysis of the two-layer GCN is more involved than the single-layer case and requires some new estimates of the neural network's quantities. The analysis confirms the usefulness of GCNs. It also sheds light on the design of the neural network, for instance, how the data should be scaled to achieve uniform stability of the learning process. Some experimental results on benchmark datasets are presented to illustrate the theory. [ABSTRACT FROM AUTHOR]
- Published
- 2023
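The eigen-domain filtering the abstract refers to can be illustrated with a minimal sketch of a two-layer spectral GCN forward pass: eigendecompose the normalized graph Laplacian, scale the eigen-components by a filter function, and interleave the resulting filter matrix with learned weight layers. This is an assumption-laden illustration, not the paper's exact formulation; the filter function `g`, the `tanh` nonlinearity, and all function names here are hypothetical choices.

```python
import numpy as np

def normalized_laplacian(A):
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}.
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    return np.eye(len(A)) - (A * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]

def two_layer_spectral_gcn(A, X, W1, W2, g):
    # Filtering in the eigen-domain: decompose L = U diag(lam) U^T and
    # replace each eigenvalue lam_i by g(lam_i), giving the filter matrix G.
    L = normalized_laplacian(A)
    lam, U = np.linalg.eigh(L)
    G = U @ np.diag(g(lam)) @ U.T
    H = np.tanh(G @ X @ W1)   # layer 1: spectral filter, feature mixing, nonlinearity
    return G @ H @ W2         # layer 2: spectral filter and linear readout

# Illustrative use on a tiny path graph with random (hypothetical) weights.
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = np.eye(3)  # one-hot node features
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 4))
W2 = rng.standard_normal((4, 2))
out = two_layer_spectral_gcn(A, X, W1, W2, g=lambda lam: np.exp(-lam))
```

The low-pass choice `g(lam) = exp(-lam)` damps high-frequency eigen-components; the paper's stability results concern how such filters and the data scaling control the network's sensitivity to perturbations of the training set.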