
Differentially Private Neural Network Training under Hidden State Assumption

Authors:
Chen, Ding
Liu, Chen
Publication Year:
2024

Abstract

We present differentially private stochastic block coordinate descent (DP-SBCD), a novel method for training neural networks with provable guarantees of differential privacy under the hidden state assumption. Our methodology incorporates Lipschitz neural networks and decomposes the training process into sub-problems, each corresponding to the training of a specific layer. In doing so, we extend the analysis of differential privacy under the hidden state assumption to encompass non-convex problems and algorithms employing proximal gradient descent. Furthermore, in contrast to existing methods, we use calibrated noise sampled from adaptive distributions, yielding improved empirical trade-offs between utility and privacy.
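The block-coordinate structure described above can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the paper's algorithm: it assumes per-layer gradient clipping, Gaussian noise with a fixed scale (the paper's adaptive noise distributions are not reproduced here), and an l2 proximal term; all names and hyperparameters (dp_sbcd_sweep, clip_norm, sigma, lam) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def clip(g, clip_norm):
    """Clip a gradient to bound its sensitivity (Frobenius norm assumed)."""
    norm = np.linalg.norm(g)
    return g * min(1.0, clip_norm / norm) if norm > 0 else g

def prox_l2(w, lam, lr):
    """Proximal operator of (lam/2)||w||^2: shrinks the weights toward zero."""
    return w / (1.0 + lr * lam)

def dp_sbcd_sweep(layers, grad_fn, lr=0.1, clip_norm=1.0, sigma=0.5, lam=1e-3):
    """One sweep of block coordinate descent: update each layer (block) in
    turn with a noisy proximal gradient step, holding the others fixed.
    Fixed-scale Gaussian noise stands in for the paper's adaptive noise."""
    for i in range(len(layers)):
        g = clip(grad_fn(layers, i), clip_norm)  # gradient w.r.t. layer i only
        noise = rng.normal(0.0, sigma * clip_norm, size=layers[i].shape)
        layers[i] = prox_l2(layers[i] - lr * (g + noise), lam, lr)
    return layers

# Toy two-layer linear model: loss(W1, W2) = ||W2 @ W1 @ x - y||^2,
# split into one sub-problem per layer.
def grad_fn(layers, i, x, y):
    W1, W2 = layers
    h = W1 @ x
    r = W2 @ h - y
    return 2.0 * (W2.T @ r) @ x.T if i == 0 else 2.0 * r @ h.T

x, y = rng.normal(size=(4, 32)), rng.normal(size=(2, 32))
layers = [rng.normal(size=(3, 4)), rng.normal(size=(2, 3))]
for _ in range(200):
    layers = dp_sbcd_sweep(layers, lambda L, i: grad_fn(L, i, x, y))
```

In this sketch the privacy-relevant ingredients are the per-block sensitivity bound from clipping and the noise added inside each proximal update; under the hidden state assumption only the final iterate is released, which is what the paper's analysis exploits.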

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2407.08233
Document Type:
Working Paper