
Differentially Private Kernel Inducing Points (DP-KIP) for Privacy-preserving Data Distillation

Authors: Vinaroz, Margarita; Park, Mi Jung
Publication Year: 2023

Abstract

While it is tempting to believe that data distillation preserves privacy, the empirical robustness of distilled data against known attacks does not imply a provable privacy guarantee. Here, we develop a provably privacy-preserving data distillation algorithm, called differentially private kernel inducing points (DP-KIP). DP-KIP is an instantiation of DP-SGD on kernel ridge regression (KRR). Following recent work, we use neural tangent kernels and minimize the KRR loss to estimate the distilled datapoints (i.e., kernel inducing points). We provide a computationally efficient JAX implementation of DP-KIP, which we test on several popular image and tabular datasets to demonstrate its efficacy in data distillation with differential privacy guarantees.
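The sketch below illustrates the core idea described in the abstract, not the authors' released code: distilled "inducing points" are learned by minimizing a kernel ridge regression loss on private data, with per-example gradient clipping and Gaussian noise as in DP-SGD. An RBF kernel stands in for the neural tangent kernel used in the paper, the distilled labels are held fixed for simplicity (the actual algorithm may also optimize them), and all names and hyperparameters (rbf_kernel, krr_loss, clip_norm, noise_mult, reg) are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def rbf_kernel(X, Y, lengthscale=1.0):
    # Gaussian (RBF) kernel matrix; a stand-in for the NTK used in the paper.
    d2 = jnp.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return jnp.exp(-d2 / (2.0 * lengthscale ** 2))

def krr_loss(X_s, y_s, x, y, reg=1e-3):
    # Fit KRR on the distilled set (X_s, y_s), evaluate squared error on one private example (x, y).
    K_ss = rbf_kernel(X_s, X_s)
    alpha = jnp.linalg.solve(K_ss + reg * jnp.eye(K_ss.shape[0]), y_s)
    pred = rbf_kernel(x[None, :], X_s) @ alpha
    return jnp.sum((pred - y) ** 2)

def dp_sgd_step(X_s, y_s, X_batch, y_batch, key, lr=0.1,
                clip_norm=1.0, noise_mult=1.0):
    # Per-example gradients with respect to the inducing points only.
    grads = jax.vmap(jax.grad(krr_loss), in_axes=(None, None, 0, 0))(
        X_s, y_s, X_batch, y_batch)
    # Clip each example's gradient to L2 norm <= clip_norm.
    norms = jnp.sqrt(jnp.sum(grads ** 2, axis=(1, 2), keepdims=True))
    grads = grads * jnp.minimum(1.0, clip_norm / (norms + 1e-12))
    # Sum, add Gaussian noise calibrated to the clipping norm, then average.
    noise = noise_mult * clip_norm * jax.random.normal(key, X_s.shape)
    noisy_mean = (jnp.sum(grads, axis=0) + noise) / X_batch.shape[0]
    return X_s - lr * noisy_mean

# Toy usage: distill 200 private points into 10 inducing points.
key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
X_priv = jax.random.normal(k1, (200, 5))
y_priv = jnp.sin(X_priv[:, :1])
X_s = jax.random.normal(k2, (10, 5))
y_s = jnp.linspace(-1.0, 1.0, 10)[:, None]  # fixed distilled labels in this sketch
for t in range(50):
    k3, sub = jax.random.split(k3)
    X_s = dp_sgd_step(X_s, y_s, X_priv, y_priv, sub)
```

Because only the inducing points receive noisy, clipped gradients, the privacy cost of each step can be tracked with standard DP-SGD accounting; the distilled set released at the end then inherits the resulting differential privacy guarantee by post-processing.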

Details

Language: English
Database: OpenAIRE
Accession number: edsair.doi.dedup.....c1234b89bf6315a9e98767b1b9136872