
DPAR: Decoupled Graph Neural Networks with Node-Level Differential Privacy

Authors:
Zhang, Qiuchen
Lee, Hongkyu
Ma, Jing
Lou, Jian
Yang, Carl
Xiong, Li
Publication Year:
2022

Abstract

Graph Neural Networks (GNNs) have achieved great success in learning with graph-structured data. Privacy concerns have been raised for the trained models, which could expose sensitive information about the graphs, including both node features and structure information. In this paper, we aim to achieve node-level differential privacy (DP) for training GNNs so that a node and its edges are protected. Node DP is inherently difficult for GNNs because all direct and multi-hop neighbors participate in the gradient calculation for each node via layer-wise message passing, and there is no bound on how many direct and multi-hop neighbors a node can have; as a result, existing DP methods incur high privacy cost or poor utility due to high node sensitivity. We propose a Decoupled GNN with Differentially Private Approximate Personalized PageRank (DPAR) for training GNNs with an enhanced privacy-utility tradeoff. The key idea is to decouple the feature projection and message passing via a DP PageRank algorithm, which learns the structure information and uses the top-$K$ neighbors determined by the PageRank for feature aggregation. By capturing the most important neighbors for each node and avoiding layer-wise message passing, DPAR bounds the node sensitivity and achieves an improved privacy-utility tradeoff compared to layer-wise perturbation based methods. We theoretically analyze the node DP guarantee for the two processes combined and empirically demonstrate better utility for DPAR at the same level of node DP compared with state-of-the-art methods.

Comment: Accepted to The 2024 Web Conference
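The decoupling described in the abstract can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation: it computes approximate personalized PageRank (PPR) vectors by power iteration, keeps only the top-$K$ entries per node, and aggregates pre-projected node features with the renormalized PPR weights. The optional Gaussian noise on the scores is merely a stand-in for the paper's DP mechanism, and all function names here are invented for the example.

```python
import numpy as np

def personalized_pagerank(A, alpha=0.15, iters=50):
    """Approximate personalized PageRank for every node via power iteration.
    A: (n, n) adjacency matrix. Returns an (n, n) matrix whose row i is
    the PPR vector seeded at node i."""
    n = A.shape[0]
    deg = A.sum(axis=1, keepdims=True)
    P = A / np.maximum(deg, 1)           # row-stochastic transition matrix
    pi = np.eye(n)                        # start from the seed distributions
    for _ in range(iters):
        pi = alpha * np.eye(n) + (1 - alpha) * pi @ P
    return pi

def topk_aggregate(ppr, H, K=2, noise_scale=0.0, rng=None):
    """Keep the top-K PPR neighbors per node (optionally ranked after adding
    Gaussian noise, a placeholder for the DP perturbation) and aggregate the
    pre-projected features H with renormalized weights."""
    rng = rng or np.random.default_rng(0)
    scores = ppr + noise_scale * rng.standard_normal(ppr.shape)
    out = np.zeros_like(H)
    for i in range(ppr.shape[0]):
        top = np.argsort(scores[i])[-K:]       # indices of the K largest scores
        w = ppr[i, top] / ppr[i, top].sum()    # renormalize the kept weights
        out[i] = w @ H[top]                    # weighted feature aggregation
    return out

# Toy graph: 4 nodes on a path 0-1-2-3, with 2-dimensional "projected" features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
H = np.arange(8, dtype=float).reshape(4, 2)
Z = topk_aggregate(personalized_pagerank(A), H, K=2)
```

Because the aggregation touches at most $K$ neighbors per node instead of all multi-hop neighbors reached by layer-wise message passing, each node's influence on the output is bounded, which is the sensitivity-bounding idea the abstract refers to.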

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2210.04442
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3589334.3645531