
Federated Contrastive Learning for Personalized Semantic Communication

Authors:
Wang, Yining
Ni, Wanli
Yi, Wenqiang
Xu, Xiaodong
Zhang, Ping
Nallanathan, Arumugam
Publication Year:
2024

Abstract

In this letter, we design a federated contrastive learning (FedCL) framework aimed at supporting personalized semantic communication. Our FedCL enables collaborative training of local semantic encoders across multiple clients and a global semantic decoder owned by the base station. This framework supports heterogeneous semantic encoders since it does not require client-side model aggregation. Furthermore, to tackle the semantic imbalance issue arising from heterogeneous datasets across distributed clients, we employ contrastive learning to train a semantic centroid generator (SCG). This generator obtains representative global semantic centroids that exhibit intra-semantic compactness and inter-semantic separability, thereby providing superior supervision for learning discriminative local semantic features. Additionally, we conduct theoretical analysis to quantify the convergence performance of FedCL. Simulation results verify the superiority of the proposed FedCL framework over other distributed learning benchmarks in terms of task performance and robustness across different numbers of clients and channel conditions, especially in low signal-to-noise ratio and highly heterogeneous data scenarios.

Comment: IEEE Communications Letters
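The abstract describes training the SCG with a contrastive objective so that centroids are compact within a semantic class and separated across classes. As a rough illustration only, below is a minimal PyTorch sketch of a prototypical InfoNCE-style loss between a client's local semantic features and the global centroids; the function name `centroid_contrastive_loss`, the temperature value, and the tensor layout are assumptions made for this sketch, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def centroid_contrastive_loss(features: torch.Tensor,
                              labels: torch.Tensor,
                              centroids: torch.Tensor,
                              temperature: float = 0.5) -> torch.Tensor:
    """Pull each local feature toward its own semantic centroid and
    push it away from all other centroids (illustrative sketch).

    features:  (B, D) semantic features from a client's local encoder
    labels:    (B,)   semantic class indices for each feature
    centroids: (C, D) global semantic centroids (e.g., from the SCG)
    """
    # Normalize so the dot product becomes a cosine similarity
    features = F.normalize(features, dim=1)
    centroids = F.normalize(centroids, dim=1)

    # (B, C) similarity of each feature to every global centroid
    logits = features @ centroids.t() / temperature

    # Cross-entropy over centroids is InfoNCE with the true-class
    # centroid as the positive and all other centroids as negatives
    return F.cross_entropy(logits, labels)
```

Normalizing both features and centroids turns the logits into scaled cosine similarities, so minimizing this cross-entropy simultaneously encourages intra-semantic compactness (high similarity to the matching centroid) and inter-semantic separability (low similarity to the rest).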

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2406.09182
Document Type:
Working Paper