
Class-aware progressive self-training for learning convolutional networks on graphs.

Authors :
Chen, Ke
Wu, Weining
Source :
Expert Systems with Applications. Mar 2024, Vol. 238, Part B.
Publication Year :
2024

Abstract

Learning convolutional networks on graphs has been a popular topic for machine learning on graph-structured data and has achieved state-of-the-art results on various practical tasks. However, most existing works ignore the impact of the per-class distribution, so their performance may be limited by the diversity of the categories. In this paper, we propose a novel class-aware progressive self-training (CPS) algorithm for training graph convolutional networks (GCNs). Compared to other self-training algorithms for GCN learning, the proposed CPS algorithm leverages the class distribution to update the original graph structure in each self-training loop, in which it: (a) finds the high-confidence unlabeled nodes of each category and assigns them pseudo-labels, enlarging the current set of labeled nodes; and (b) deletes noisy edges between different classes to sparsify the graph. The optimized graph is then used in the next self-training loop with the aim of enhancing classification performance. We evaluate the proposed CPS on several datasets commonly used for GCN learning, and the experimental results show that the proposed CPS algorithm outperforms other baselines. [ABSTRACT FROM AUTHOR]
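The abstract describes one class-aware update per self-training loop: per-class pseudo-labeling of confident unlabeled nodes and pruning of inter-class edges. The following is a minimal sketch of how such an update step could look, written against the abstract only; the function name, the top-k selection rule, and the confidence threshold for edge pruning are illustrative assumptions, not details taken from the paper.

    # Hypothetical sketch of a CPS-style per-loop graph update (not the authors' code).
    import numpy as np

    def class_aware_update(probs, labeled_mask, edge_index, top_k=10, prune_tau=0.9):
        """One update step: (a) add per-class pseudo-labels for the most
        confident unlabeled nodes, (b) drop edges whose endpoints are
        confidently predicted to belong to different classes.

        probs        : (N, C) softmax outputs of the current GCN
        labeled_mask : (N,) bool, True for nodes that already carry labels
        edge_index   : (2, E) int array of edges
        """
        pred = probs.argmax(axis=1)   # predicted class per node
        conf = probs.max(axis=1)      # confidence of that prediction
        n_classes = probs.shape[1]

        # (a) per-class pseudo-labeling: take the top_k most confident
        # unlabeled nodes of each class, so every class is enlarged evenly.
        new_labels = {}
        for c in range(n_classes):
            cand = np.where((~labeled_mask) & (pred == c))[0]
            if cand.size:
                best = cand[np.argsort(conf[cand])[::-1][:top_k]]
                new_labels.update({int(i): c for i in best})

        # (b) graph sparsification: remove edges whose two endpoints are
        # confidently assigned to different classes (likely noisy edges).
        u, v = edge_index
        noisy = (pred[u] != pred[v]) & (conf[u] > prune_tau) & (conf[v] > prune_tau)
        kept_edges = edge_index[:, ~noisy]

        return new_labels, kept_edges

    # Toy usage with random numbers standing in for real GCN outputs.
    rng = np.random.default_rng(0)
    probs = rng.dirichlet(np.ones(3), size=20)
    labeled = np.zeros(20, dtype=bool); labeled[:5] = True
    edges = rng.integers(0, 20, size=(2, 40))
    pseudo, pruned = class_aware_update(probs, labeled, edges, top_k=2)
    print(len(pseudo), pruned.shape)

In a full pipeline this update would sit inside the self-training loop: train the GCN on the current labeled set and graph, call the update, merge the pseudo-labels into the labeled set, and retrain on the sparsified graph until performance stops improving.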

Details

Language :
English
ISSN :
09574174
Volume :
238
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
173707430
Full Text :
https://doi.org/10.1016/j.eswa.2023.121805