Communication-efficient federated learning via knowledge distillation

Authors:
Chuhan Wu
Fangzhao Wu
Lingjuan Lyu
Yongfeng Huang
Xing Xie
Source: Nature Communications, Vol 13, Iss 1, Pp 1-8 (2022)
Publication Year: 2022
Publisher: Nature Portfolio, 2022.

Abstract

This work presents a communication-efficient federated learning method that saves a large fraction of the communication cost. It reveals the advantage of reciprocal learning for knowledge transfer between models and the evolving low-rank structure of deep model updates.
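
The abstract names two mechanisms worth unpacking: reciprocal (mutual) knowledge distillation, in which models teach each other through softened predictions, and the low-rank structure of deep model updates, which lets a client transmit small SVD factors instead of a full update matrix. The Python sketch below illustrates both ideas under stated assumptions; it is not the paper's implementation, and the function names, temperature, and energy threshold are hypothetical choices for this example.

# Illustrative sketch (not the paper's code) of the two ideas named above:
# (1) reciprocal distillation, where two models learn from each other's
# softened predictions, and (2) low-rank compression of a model update via
# truncated SVD before it is communicated. Function names, the temperature,
# and the energy threshold are assumptions made for this example.
import numpy as np


def softmax(z, temperature=1.0):
    z = z / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def mutual_distillation_loss(logits_a, logits_b, temperature=2.0):
    """Symmetric KL divergence between softened predictions, so that each
    model distills knowledge from the other (reciprocal learning)."""
    p = softmax(logits_a, temperature)
    q = softmax(logits_b, temperature)
    kl_pq = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    kl_qp = np.sum(q * (np.log(q + 1e-12) - np.log(p + 1e-12)), axis=-1)
    return float(np.mean(kl_pq + kl_qp))


def compress_update_svd(update, energy=0.95):
    """Factorize a weight-update matrix with truncated SVD, keeping just
    enough singular values to retain the given fraction of spectral energy.
    The three small factors are transmitted instead of the full matrix."""
    u, s, vt = np.linalg.svd(update, full_matrices=False)
    k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), energy)) + 1
    return u[:, :k], s[:k], vt[:k, :]


# Toy usage: a 256x128 update that is exactly rank 8 compresses well.
rng = np.random.default_rng(0)
update = rng.normal(size=(256, 8)) @ rng.normal(size=(8, 128))
u, s, vt = compress_update_svd(update)
print(f"rank kept: {len(s)}, floats sent: {u.size + s.size + vt.size} "
      f"vs {update.size} uncompressed")
print("mutual KD loss:",
      mutual_distillation_loss(rng.normal(size=(4, 10)),
                               rng.normal(size=(4, 10))))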

Subjects

Science

Details

Language: English
ISSN: 2041-1723
Volume: 13
Issue: 1
Database: Directory of Open Access Journals
Journal: Nature Communications
Publication Type: Academic Journal
Accession number: edsdoj.89508449f9f4c23abb84000b24c63d0
Document Type: article
Full Text: https://doi.org/10.1038/s41467-022-29763-x