
Supplement data in federated learning with a generator transparent to clients.

Authors :
Wang, Xiaoya
Zhu, Tianqing
Zhou, Wanlei
Source :
Information Sciences. May 2024, Vol. 666.
Publication Year :
2024

Abstract

Federated learning is a decentralized learning approach that shows promise for preserving users' privacy by avoiding the sharing of local data. However, heterogeneous data limits its application in broader settings: data heterogeneity across diverse clients causes weight divergence between local models and degrades the global performance of federated learning. Supplementing the training data in federated learning has been explored as a mitigation and proven effective, but traditional methods of supplementing data raise privacy concerns and increase learning costs. In this paper, we propose to supplement the training data with a generative model that is transparent to local clients. Both the training of the generative model and the storage of its supplementary data are kept on the server side. This avoids collecting auxiliary data directly from local clients, reducing their privacy exposure and sparing them additional costs. To avoid loose learning on the real and synthetic samples, we constrain the optimization of the global model with a distance between the global model being trained and the distribution of the aggregated global model. Extensive experiments verify that the synthetic data from the generative model improve the performance of federated learning, especially in a heterogeneous environment. [ABSTRACT FROM AUTHOR]
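To make the server-side procedure described above concrete, the sketch below shows one possible round of fine-tuning the aggregated global model on generator-produced samples, with a distance penalty keeping the trained model close to the aggregated one. This is a minimal PyTorch sketch, not the authors' implementation; the conditional generator interface (`generator(z, labels)`, `latent_dim`) and the hyperparameters `steps`, `dist_weight`, and `lr` are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def server_finetune(global_model, aggregated_state, generator, labels,
                    steps=10, dist_weight=0.1, lr=0.01):
    """Fine-tune the aggregated global model on server-held synthetic data.

    aggregated_state: a state_dict() copy of the freshly aggregated model
    (assumed interface; illustrative, not the authors' code).
    """
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    for _ in range(steps):
        # Draw synthetic samples from the server-side conditional generator;
        # neither the generator nor its outputs are ever sent to clients.
        z = torch.randn(labels.size(0), generator.latent_dim)
        x_syn = generator(z, labels)
        loss = F.cross_entropy(global_model(x_syn), labels)
        # Distance penalty: keep the model being trained close to the
        # aggregated global model, so learning on real (client-side) and
        # synthetic (server-side) samples stays consistent.
        dist = sum((p - aggregated_state[name].detach()).pow(2).sum()
                   for name, p in global_model.named_parameters())
        opt.zero_grad()
        (loss + dist_weight * dist).backward()
        opt.step()
    return global_model
```

The L2 parameter-space penalty here is a simplification of the distribution-level distance the abstract mentions; the key design point carried over is that all synthetic data and generator training stay on the server, so clients incur no extra cost or privacy exposure.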

Details

Language :
English
ISSN :
0020-0255
Volume :
666
Database :
Academic Search Index
Journal :
Information Sciences
Publication Type :
Periodical
Accession number :
176196221
Full Text :
https://doi.org/10.1016/j.ins.2024.120437