
FedACK: Federated Adversarial Contrastive Knowledge Distillation for Cross-Lingual and Cross-Model Social Bot Detection

Authors:
Yang, Yingguang
Yang, Renyu
Peng, Hao
Li, Yangyang
Li, Tong
Liao, Yong
Zhou, Pengyuan
Publication Year:
2023

Abstract

Social bot detection is of paramount importance to the resilience and security of online social platforms. The state-of-the-art detection models are siloed and have largely overlooked a variety of data characteristics from multiple cross-lingual platforms. Meanwhile, the heterogeneity of data distribution and model architecture makes it intricate to devise an efficient cross-platform and cross-model detection framework. In this paper, we propose FedACK, a new federated adversarial contrastive knowledge distillation framework for social bot detection. We devise a GAN-based federated knowledge distillation mechanism for efficiently transferring knowledge of data distribution among clients. In particular, a global generator is used to extract the knowledge of global data distribution and distill it into each client's local model. We leverage a local discriminator to enable customized model design and use a local generator for data enhancement with hard-to-decide samples. Local training is conducted as multi-stage adversarial and contrastive learning to enable consistent feature spaces among clients and to constrain the optimization direction of local models, reducing the divergences between local and global models. Experiments demonstrate that FedACK outperforms the state-of-the-art approaches in terms of accuracy, communication efficiency, and feature space consistency.

Comment: Accepted by the ACM Web Conference 2023 (WWW'23)
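The core mechanism the abstract describes — a global generator that carries knowledge of the global data distribution, distilled into each client's local model — can be illustrated with a minimal sketch. The code below is not the paper's implementation: all names (`LinearModel`, `distill_step`, the fixed linear "generator") are illustrative stand-ins, the GAN training, local discriminators, and the contrastive alignment stage are omitted, and the teacher model simply stands in for aggregated global knowledge.

```python
# Hedged sketch of generator-driven knowledge distillation, loosely
# following the FedACK abstract. All names and shapes are assumptions
# for illustration, not the paper's actual architecture.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kl_div(p, q, eps=1e-9):
    # mean KL(p || q) over a batch of probability rows
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=1).mean()

class LinearModel:
    """Stand-in for a client's local bot detector: one linear layer + softmax."""
    def __init__(self, dim, n_classes):
        self.W = rng.normal(scale=0.1, size=(dim, n_classes))

    def predict(self, x):
        return softmax(x @ self.W)

    def distill_step(self, x, teacher_probs, lr=0.1):
        # gradient of KL(teacher || student) w.r.t. logits is (student - teacher)
        student = self.predict(x)
        grad = x.T @ (student - teacher_probs) / len(x)
        self.W -= lr * grad

dim, n_classes, noise_dim, n_samples = 8, 2, 4, 256

# "Global generator": here a fixed linear map from noise to feature space,
# standing in for the trained GAN generator that encodes the global
# data distribution (a deliberate simplification).
G = rng.normal(size=(noise_dim, dim))

# Teacher stands in for the server-side aggregate of client knowledge.
teacher = LinearModel(dim, n_classes)
teacher.W = rng.normal(size=(dim, n_classes))

# A client distills the teacher's behaviour on generator-produced samples,
# so no raw user data ever leaves the client.
client = LinearModel(dim, n_classes)
z = rng.normal(size=(n_samples, noise_dim))
x_syn = z @ G                       # synthetic samples from the generator
t_probs = teacher.predict(x_syn)

before = kl_div(t_probs, client.predict(x_syn))
for _ in range(300):
    client.distill_step(x_syn, t_probs)
after = kl_div(t_probs, client.predict(x_syn))
assert after < before  # distillation pulls the client toward the global model
```

The point of the sketch is the privacy-relevant data flow: only generator outputs and model responses cross the client boundary, which is why a shared generator can align heterogeneous local models without exchanging raw platform data.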

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2303.07113
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3543507.3583500