A federated learning scheme meets dynamic differential privacy

Authors :
Shengnan Guo
Xibin Wang
Shigong Long
Hai Liu
Liu Hai
Toong Hai Sam
Source :
CAAI Transactions on Intelligence Technology, Vol 8, Iss 3, Pp 1087-1100 (2023)
Publication Year :
2023
Publisher :
Wiley, 2023.

Abstract

Federated learning has become a widely used distributed learning approach in recent years. Although training shifts from collecting raw data to gathering model parameters, privacy violations may still occur when models are published and shared. A dynamic approach is proposed to add Gaussian noise more effectively and apply differential privacy to federated deep learning. Concretely, it abandons the traditional practice of distributing the privacy budget ϵ equally, instead adjusting the budget dynamically to suit gradient‐descent federated learning, with the relevant parameters derived by computation so that manually chosen hyperparameters do not affect the algorithm. It also incorporates adaptive threshold clipping to control sensitivity. Finally, the moments accountant is used to track the ϵ consumed on privacy preservation, and learning stops only when the ϵtotal set by the clients is reached, which allows the privacy budget to be fully exploited for model training. Experimental results on real datasets show that the method trains models whose performance is almost the same as non‐private learning and significantly better than the differential privacy method provided by TensorFlow.
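The scheme described above combines three ingredients: per-round gradient clipping to bound sensitivity, Gaussian noise calibrated to a per-round budget, and an unequal allocation of the total budget ϵtotal across rounds. The sketch below illustrates these pieces under stated assumptions: the geometric decay schedule in `decaying_budgets` is a hypothetical stand-in (the paper computes its dynamic allocation from the training process itself, and tracks cumulative loss with the moments accountant rather than simple composition), and the noise calibration uses the standard Gaussian-mechanism formula.

```python
import numpy as np

def clip_gradient(grad, threshold):
    # Scale the gradient so its L2 norm is at most `threshold`,
    # which bounds the sensitivity of the averaged update.
    norm = max(np.linalg.norm(grad), 1e-12)
    return grad * min(1.0, threshold / norm)

def decaying_budgets(eps_total, rounds, decay=0.9):
    # Hypothetical dynamic schedule: geometric weights normalized so the
    # per-round budgets sum to eps_total (vs. the equal split eps_total/rounds).
    w = np.array([decay ** t for t in range(rounds)])
    return eps_total * w / w.sum()

def dp_federated_round(client_grads, threshold, eps_round, delta=1e-5):
    # One aggregation round: clip each client gradient, average them,
    # then add Gaussian noise calibrated to this round's budget.
    clipped = [clip_gradient(g, threshold) for g in client_grads]
    avg = np.mean(clipped, axis=0)
    # Standard Gaussian-mechanism scale for (eps, delta)-DP; averaging over
    # n clients divides the sensitivity (and hence the noise) by n.
    sigma = threshold * np.sqrt(2.0 * np.log(1.25 / delta)) / eps_round
    noise = np.random.normal(0.0, sigma / len(clipped), size=avg.shape)
    return avg + noise
```

In a full training loop, the server would spend `decaying_budgets(eps_total, rounds)[t]` at round `t` and halt once the accountant reports that ϵtotal is exhausted.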

Details

Language :
English
ISSN :
24682322
Volume :
8
Issue :
3
Database :
Directory of Open Access Journals
Journal :
CAAI Transactions on Intelligence Technology
Publication Type :
Academic Journal
Accession number :
edsdoj.feeb1f2a28144cef86de0f23547d9aa5
Document Type :
article
Full Text :
https://doi.org/10.1049/cit2.12187