
Pseudo Contrastive Learning for Graph-based Semi-supervised Learning

Authors :
Lu, Weigang
Guan, Ziyu
Zhao, Wei
Yang, Yaming
Lv, Yuanhai
Xing, Lining
Yu, Baosheng
Tao, Dacheng
Publication Year :
2023

Abstract

Pseudo-labeling is a technique used to improve the performance of semi-supervised Graph Neural Networks (GNNs) by generating additional pseudo-labels from confident predictions. However, the quality of the generated pseudo-labels has been a longstanding concern because the classification objective is sensitive to the given labels. To avoid untrustworthy classification supervision stating that "a node belongs to a specific class," we favor fault-tolerant contrastive supervision stating that "two nodes do not belong to the same class." The problem of generating high-quality pseudo-labels is thus relaxed to identifying reliable negative pairs. To achieve this, we propose a general framework for GNNs, termed Pseudo Contrastive Learning (PCL), which separates two nodes whose positive and negative pseudo-labels target the same class. To incorporate topological knowledge into learning, we devise a topologically weighted contrastive loss that spends more effort separating negative pairs with smaller topological distances. Experimentally, we apply PCL to various GNNs, which consistently outperform their counterparts trained with other popular general techniques on five real-world graphs.

Comment: Under Review
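The abstract's two key ingredients, treating nodes with conflicting pseudo-labels as negative pairs and weighting their separation by topological proximity, can be illustrated with a minimal sketch. The code below assumes a PyTorch-style encoder output, integer pseudo-labels from confident predictions, and a precomputed hop-distance matrix; the function name, the inverse-distance weighting, and the softplus penalty are illustrative assumptions, not the paper's actual objective.

```python
# Hedged sketch of a topologically weighted negative-pair loss in the spirit
# of PCL. The names (z, pseudo_labels, hop_dist) and the exact weighting and
# loss form are assumptions for illustration only.
import torch
import torch.nn.functional as F

def topo_weighted_negative_loss(z, pseudo_labels, hop_dist, temperature=0.5):
    """Push apart node pairs whose pseudo-labels disagree ("reliable negatives"),
    weighting each pair by the inverse of its topological distance so that
    nearby but differently labeled nodes receive more separation effort.

    z:             (N, d) node embeddings from a GNN encoder
    pseudo_labels: (N,) integer pseudo-labels; -1 marks unconfident nodes to skip
    hop_dist:      (N, N) shortest-path (hop) distances on the graph
    """
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / temperature                      # pairwise cosine similarities

    valid = pseudo_labels >= 0                         # keep only confident nodes
    neg_mask = (pseudo_labels.unsqueeze(0) != pseudo_labels.unsqueeze(1)) \
               & valid.unsqueeze(0) & valid.unsqueeze(1)

    # Smaller topological distance -> larger weight (assumed 1 / (1 + hops)).
    weights = 1.0 / (1.0 + hop_dist.float())

    # Penalize similarity between negative pairs, weighted topologically.
    pair_loss = F.softplus(sim) * weights
    return (pair_loss * neg_mask.float()).sum() / neg_mask.float().sum().clamp(min=1.0)
```

In this sketch the term would be added to the usual supervised loss on labeled nodes; the paper's actual loss form and weighting scheme should be taken from the full text.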

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2302.09532
Document Type :
Working Paper