
Robust Asynchronous Federated Learning With Time-Weighted and Stale Model Aggregation

Authors :
Miao, Yinbin
Liu, Ziteng
Li, Xinghua
Li, Meng
Li, Hongwei
Choo, Kim-Kwang Raymond
Deng, Robert H.
Source :
IEEE Transactions on Dependable and Secure Computing; 2024, Vol. 21, Issue 4, pp. 2361-2375, 15p
Publication Year :
2024

Abstract

Federated Learning (FL) enables collaborative learning among multiple clients while keeping data local. However, traditional synchronous FL solutions suffer from lower accuracy and longer communication time in scenarios where most devices drop out during learning. Therefore, we propose an Asynchronous Federated Learning (AsyFL) scheme using time-weighted and stale model aggregation, which effectively solves the problem of poor model performance caused by device heterogeneity. We then integrate Symmetric Homomorphic Encryption (SHE) into AsyFL to propose Asynchronous Privacy-Preserving Federated Learning (Asy-PPFL), which protects client privacy and achieves lightweight computation. Privacy analysis shows that Asy-PPFL is indistinguishable under Known Plaintext Attack (KPA), and convergence analysis proves the effectiveness of our schemes. Extensive experiments show that AsyFL and Asy-PPFL achieve accuracies of up to 58.40% and 58.26%, respectively, on the Cifar-10 dataset when most clients (i.e., 80%) are offline or delayed.
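The time-weighted, staleness-aware aggregation described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the exponential discount, and the decay parameter are illustrative assumptions about how a stale client update might be down-weighted before being blended into the global model.

    import numpy as np

    def aggregate_async(global_model, update, client_weight, staleness, decay=0.5):
        # Hypothetical rule: discount the client's weight exponentially in its
        # staleness, so fresh updates dominate and stale ones are blended in
        # cautiously rather than discarded.
        alpha = client_weight * decay ** staleness
        return (1.0 - alpha) * global_model + alpha * update

    # Example: an update that is 3 rounds stale contributes with
    # alpha = 0.2 * 0.5**3 = 0.025 instead of its fresh weight 0.2.
    global_model = np.zeros(4)
    update = np.ones(4)
    new_global = aggregate_async(global_model, update, client_weight=0.2, staleness=3)

Under this kind of rule, the server can apply each client update as it arrives, so the round never blocks on offline or delayed devices, which matches the dropout scenario the abstract targets.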

Details

Language :
English
ISSN :
1545-5971
Volume :
21
Issue :
4
Database :
Supplemental Index
Journal :
IEEE Transactions on Dependable and Secure Computing
Publication Type :
Periodical
Accession number :
ejs66947015
Full Text :
https://doi.org/10.1109/TDSC.2023.3304788