FedCPD: Addressing label distribution skew in federated learning with class proxy decoupling and proxy regularization.
- Authors: He, Zaobo; Li, Yusen; Seo, Daehee; Cai, Zhipeng
- Subjects: Federated learning; Skewness (probability theory); Transformer models
- Abstract
Federated learning (FL) enables multiple data sources to collaboratively train a global model for Multi-source Visual Fusion and Understanding (MSVFU) without centralizing raw data. However, it is difficult for such a global model to perform optimally due to the label distribution skew of images from various data sources. This paper presents FedCPD, a novel federated learning framework that effectively mitigates label distribution skew, enhancing global model accuracy and generalization capabilities through Class Proxy Decoupling aggregation and Proxy Regularization. First, Class Proxy Decoupling aggregation decouples observed class proxies from missing ones in client updates, ensuring that the global model's aggregation process is not adversely affected by inaccurate class proxy updates. Second, Proxy Regularization proactively increases the attention of class proxies to features of other classes during local training, thereby enhancing the model's generalization capability across diverse data sources. Additionally, we integrate a pre-trained Vision Transformer (ViT) feature extractor to enhance the global model's robustness against label distribution skew. Extensive evaluation on four public datasets with varying label distribution skew confirms the superior efficacy of our approach compared to existing methods.

Highlights:
• Class proxy decoupling aggregation tackles label skew in federated learning.
• Proxy regularization enhances cross-class learning in federated learning.
• Integration of an advanced feature extractor strengthens the model against data skew.

[ABSTRACT FROM AUTHOR]
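The decoupled aggregation step described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the per-client boolean "observed class" masks, and the plain averaging rule are all assumptions made here to show the core idea — proxies for classes a client never saw are excluded from the global average so they cannot corrupt it.

```python
import numpy as np

def aggregate_class_proxies(client_proxies, client_class_masks):
    """Average per-class proxy vectors across clients, skipping missing classes.

    client_proxies: list of (num_classes, dim) arrays, one per client.
    client_class_masks: list of boolean (num_classes,) arrays; True where the
        client actually has samples of that class (an "observed" proxy).
    Returns the aggregated (num_classes, dim) proxies and a boolean mask of
    classes observed by at least one client.
    """
    num_classes, dim = client_proxies[0].shape
    proxy_sum = np.zeros((num_classes, dim))
    counts = np.zeros(num_classes)
    for proxies, mask in zip(client_proxies, client_class_masks):
        # Only observed class proxies contribute to the sum and the count.
        proxy_sum[mask] += proxies[mask]
        counts[mask] += 1
    observed = counts > 0
    global_proxies = np.zeros((num_classes, dim))
    # Average each observed class over the clients that actually saw it.
    global_proxies[observed] = proxy_sum[observed] / counts[observed, None]
    return global_proxies, observed
```

For example, if client A holds classes 0 and 1 while client B holds classes 1 and 2, class 1's global proxy is the two-client mean, while classes 0 and 2 come from a single client each; contrast this with naive averaging, where B's stale class-0 proxy would drag the global proxy toward an arbitrary value.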
- Published: 2024