FedSkel
- Source :
- CIKM
- Publication Year :
- 2021
- Publisher :
- ACM, 2021.
Abstract
- Federated learning aims to protect users' privacy while performing data analysis across different participants. However, it is challenging to guarantee training efficiency on heterogeneous systems because of varying computational capabilities and communication bottlenecks. In this work, we propose FedSkel to enable computation- and communication-efficient federated learning on edge devices by updating only the model's essential parts, named skeleton networks. FedSkel is evaluated on real edge devices with imbalanced datasets. Experimental results show that it achieves up to 5.52× speedups for CONV layers' back-propagation and 1.82× speedups for the whole training process, and reduces communication cost by 64.8%, with negligible accuracy loss.
- Comment: CIKM 2021
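- The core idea in the abstract — updating only the "skeleton" (essential) parts of the model and communicating only those gradients — can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation: the filter-selection criterion (gradient magnitude), the `keep_ratio` parameter, and the function names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "CONV layer": 8 filters, each 3x3 over 3 input channels.
weights = rng.normal(size=(8, 3, 3, 3))
grads = rng.normal(size=weights.shape)

def select_skeleton(grads, keep_ratio=0.5):
    """Pick the filters with the largest gradient magnitude as the skeleton.

    (Illustrative criterion only; the paper's actual selection rule may differ.)
    """
    scores = np.abs(grads).reshape(grads.shape[0], -1).sum(axis=1)
    k = max(1, int(keep_ratio * grads.shape[0]))
    return np.sort(np.argsort(scores)[-k:])

def skeleton_update(weights, grads, lr=0.1, keep_ratio=0.5):
    """Apply SGD only to skeleton filters; the rest stay frozen this round."""
    idx = select_skeleton(grads, keep_ratio)
    new_w = weights.copy()
    new_w[idx] -= lr * grads[idx]
    # Only the skeleton gradients are sent to the server, so the
    # upload shrinks roughly in proportion to (1 - keep_ratio).
    payload = {int(i): grads[i] for i in idx}
    return new_w, payload

new_w, payload = skeleton_update(weights, grads)
print(len(payload))  # 4 of 8 filters communicated
```

- Skipping back-propagation through the frozen filters is what yields the per-layer training speedups the abstract reports; the smaller gradient payload yields the communication savings.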
- Subjects :
- FOS: Computer and information sciences
  Machine Learning (cs.LG)
  Artificial Intelligence (cs.AI)
  Distributed, Parallel, and Cluster Computing (cs.DC)
  Federated learning
  Distributed learning
  Distributed computing
  Edge device
  Skeleton network
Details
- Database :
- OpenAIRE
- Journal :
- Proceedings of the 30th ACM International Conference on Information & Knowledge Management
- Accession number :
- edsair.doi.dedup.....8b29e55dfd4da8b0aadb4555766c48be
- Full Text :
- https://doi.org/10.1145/3459637.3482107