
FedSkel

Authors :
Xin Guo
Weisheng Zhao
Junyu Luo
Xucheng Ye
Jianlei Yang
Source :
CIKM
Publication Year :
2021
Publisher :
ACM, 2021.

Abstract

Federated learning aims to protect users' privacy while performing data analysis across different participants. However, it is challenging to guarantee training efficiency on heterogeneous systems because of the participants' varying computational capabilities and communication bottlenecks. In this work, we propose FedSkel to enable computation-efficient and communication-efficient federated learning on edge devices by updating only the model's essential parts, named skeleton networks. FedSkel is evaluated on real edge devices with imbalanced datasets. Experimental results show that it achieves up to 5.52× speedups for CONV layers' back-propagation and 1.82× speedups for the whole training process, and reduces communication cost by 64.8%, with negligible accuracy loss.
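The core idea sketched in the abstract — updating and transmitting only the "essential" part of the model — can be illustrated with a minimal, hypothetical sketch. The selection criterion below (ranking CONV filters by gradient magnitude and keeping a fixed fraction) is an assumption for illustration, not the paper's exact skeleton-selection rule; the function and parameter names are invented here.

```python
def skeleton_update(grads, keep_ratio=0.5):
    """Hypothetical sketch: keep only 'skeleton' filters, i.e. those with
    the largest gradient magnitudes. The remaining filters are frozen this
    round, so their gradients need not be fully computed or communicated.

    grads: list of per-filter gradient vectors for one CONV layer.
    Returns (skeleton_indices, sparsified_grads)."""
    # Importance score per filter: L2 norm of its gradient vector.
    norms = [sum(g * g for g in filt) ** 0.5 for filt in grads]
    k = max(1, int(keep_ratio * len(grads)))
    # Indices of the k most important filters (the skeleton network).
    skeleton = sorted(range(len(grads)), key=lambda i: norms[i])[-k:]
    keep = set(skeleton)
    # Zero out non-skeleton gradients; only the skeleton part is uploaded.
    sparse = [filt if i in keep else [0.0] * len(filt)
              for i, filt in enumerate(grads)]
    return sorted(skeleton), sparse

# Toy example: 4 filters, keep the top half.
grads = [[1.0, 1.0], [3.0, 4.0], [0.1, 0.1], [6.0, 8.0]]
idx, sg = skeleton_update(grads, keep_ratio=0.5)
# Filters 1 and 3 have the largest gradient norms (5.0 and 10.0),
# so only their gradients survive; communication shrinks by ~50%.
```

With `keep_ratio=0.5`, half the filters are skipped in back-propagation and excluded from the upload, which is the intuition behind the reported speedups and the 64.8% communication reduction.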

Details

Database :
OpenAIRE
Journal :
Proceedings of the 30th ACM International Conference on Information & Knowledge Management
Accession number :
edsair.doi.dedup.....8b29e55dfd4da8b0aadb4555766c48be
Full Text :
https://doi.org/10.1145/3459637.3482107