Data Quality-Aware Client Selection in Heterogeneous Federated Learning.
- Source :
- Mathematics (2227-7390), Oct 2024, Vol. 12, Issue 20, p3229. 17p.
- Publication Year :
- 2024
Abstract
- Federated Learning (FL) enables decentralized data utilization while maintaining edge user privacy, but it faces challenges due to statistical heterogeneity. Existing approaches address client drift and data heterogeneity, but real-world settings often involve low-quality data with noisy features, such as covariate drift or adversarial samples, which are usually ignored. Noisy samples significantly impact the global model's accuracy and convergence rate. Assessing data quality and selectively aggregating updates from high-quality clients is therefore crucial, yet dynamically perceiving data quality without additional computation or data exchange is challenging. In this paper, we introduce the FedDQA (Federated learning via Data Quality-Aware) framework. We discover that increased data noise leads to slower loss reduction during local model training. We propose a loss sharpness-based Data-Quality-Awareness (DQA) metric to differentiate between high-quality and low-quality data. Based on the DQA metric, we design a client selection algorithm that strategically selects participating clients to reduce the negative impact of noisy clients. Experimental results indicate that FedDQA significantly outperforms the baselines. Notably, it achieves up to a 4% increase in global model accuracy and demonstrates faster convergence rates. [ABSTRACT FROM AUTHOR]
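To make the selection idea concrete, the following is a minimal Python sketch of a DQA-style client selection step, assuming the quality score is proxied by the rate of local loss reduction (the abstract links noisier data to slower loss decrease during local training). The names dqa_score and select_clients, the averaging of per-step loss drops, and the top-k selection rule are illustrative assumptions, not the authors' published algorithm.

```python
# Hypothetical sketch: rank clients by how fast their local training loss
# falls, then keep the top-k for aggregation. The exact DQA metric in the
# paper (loss sharpness-based) is not reproduced here; this is an assumption.
from typing import Dict, List


def dqa_score(loss_history: List[float]) -> float:
    """Proxy for data quality: average per-step loss reduction during local
    training. A flatter loss curve (smaller score) suggests noisier data."""
    if len(loss_history) < 2:
        return 0.0
    drops = [loss_history[i] - loss_history[i + 1]
             for i in range(len(loss_history) - 1)]
    return sum(drops) / len(drops)


def select_clients(client_losses: Dict[str, List[float]], k: int) -> List[str]:
    """Keep the k clients whose local loss decreased fastest; only their
    updates would then be aggregated (e.g., via FedAvg)."""
    ranked = sorted(client_losses,
                    key=lambda c: dqa_score(client_losses[c]),
                    reverse=True)
    return ranked[:k]


if __name__ == "__main__":
    # Toy local-training loss curves: client "c" has a nearly flat curve,
    # mimicking the slow loss reduction expected from noisy data.
    histories = {
        "a": [1.00, 0.70, 0.50, 0.38],
        "b": [1.10, 0.85, 0.66, 0.52],
        "c": [1.05, 1.02, 1.00, 0.99],
    }
    print(select_clients(histories, k=2))  # -> ['a', 'b']
```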
Details
- Language :
- English
- ISSN :
- 22277390
- Volume :
- 12
- Issue :
- 20
- Database :
- Academic Search Index
- Journal :
- Mathematics (2227-7390)
- Publication Type :
- Academic Journal
- Accession number :
- 180526375
- Full Text :
- https://doi.org/10.3390/math12203229