
Decentralized federated learning through proxy model sharing

Authors :
Shivam Kalra
Junfeng Wen
Jesse C. Cresswell
Maksims Volkovs
H. R. Tizhoosh
Source :
Nature Communications, Vol 14, Iss 1, Pp 1-10 (2023)
Publication Year :
2023
Publisher :
Nature Portfolio, 2023.

Abstract

Institutions in highly regulated domains such as finance and healthcare often have restrictive rules around data sharing. Federated learning is a distributed learning framework that enables multi-institutional collaborations on decentralized data with improved protection for each collaborator’s data privacy. In this paper, we propose a communication-efficient scheme for decentralized federated learning called ProxyFL, or proxy-based federated learning. Each participant in ProxyFL maintains two models: a private model and a publicly shared proxy model designed to protect the participant’s privacy. Proxy models allow efficient information exchange among participants without the need for a centralized server. The proposed method eliminates a significant limitation of canonical federated learning by allowing model heterogeneity; each participant can have a private model with any architecture. Furthermore, our protocol for communication by proxy leads to stronger privacy guarantees under differential privacy analysis. Experiments on popular image datasets, and on a cancer diagnostic problem using high-quality gigapixel histology whole slide images, show that ProxyFL can outperform existing alternatives with much less communication overhead and stronger privacy.
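The two-model structure the abstract describes can be illustrated with a toy sketch. This is not the authors' algorithm: real ProxyFL trains both models on local data with mutual knowledge distillation and differentially private gradients, whereas this simplification reduces the private-to-proxy knowledge transfer to weight averaging and models the serverless exchange as a ring topology. The `Participant` class and `ring_round` function are illustrative names, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class Participant:
    """One collaborating site: a private model plus a shared proxy.
    Models are plain weight vectors here; real ProxyFL uses neural nets."""
    def __init__(self, dim):
        self.private = rng.normal(size=dim)  # never leaves this site
        self.proxy = rng.normal(size=dim)    # the only weights ever shared

    def local_step(self, rate=0.25):
        # Stand-in for mutual knowledge transfer: nudge the private and
        # proxy models toward each other (the paper distills on local data).
        self.private += rate * (self.proxy - self.private)
        self.proxy += rate * (self.private - self.proxy)

def ring_round(sites):
    """One communication round with no central server: each site sends
    its proxy to the next site in a ring and averages in what it receives."""
    snapshot = [p.proxy.copy() for p in sites]  # send before receiving
    n = len(sites)
    for i, p in enumerate(sites):
        incoming = snapshot[(i - 1) % n]        # proxy from previous site
        p.proxy = (p.proxy + incoming) / 2      # simple consensus update

sites = [Participant(dim=4) for _ in range(3)]
for _ in range(50):
    for p in sites:
        p.local_step()
    ring_round(sites)
```

After enough rounds the proxies reach consensus while each private model stays local, which is the communication pattern (not the learning dynamics) of the protocol. Only proxy weights cross site boundaries, which is what makes the differential privacy analysis apply to the shared channel alone.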

Subjects

Subjects :
Science

Details

Language :
English
ISSN :
2041-1723
Volume :
14
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Nature Communications
Publication Type :
Academic Journal
Accession number :
edsdoj.8dd8582fd7db477994702bdb0284158c
Document Type :
article
Full Text :
https://doi.org/10.1038/s41467-023-38569-4