
Towards efficient compression and communication for prototype-based decentralized learning

Authors :
Fernández-Piñeiro, Pablo
Fernández-Veiga, Manuel
Díaz-Redondo, Rebeca P.
Fernández-Vilas, Ana
González-Soto, Martín
Publication Year :
2024

Abstract

In prototype-based federated learning, the exchange of model parameters between clients and the master server is replaced by the transmission of prototypes, or quantized versions of the data samples, to the aggregation server. A fully decentralized deployment of prototype-based learning, without a central aggregator of prototypes, is more robust to network failures and reacts faster to changes in the statistical distribution of the data, suggesting potential advantages and quick adaptation in dynamic learning tasks, e.g., when the data sources are IoT devices or when the data is non-iid. In this paper, we consider the problem of designing a communication-efficient decentralized learning system based on prototypes. We address the challenge of prototype redundancy with a twofold data compression technique: sending update messages only if the prototypes are information-theoretically useful (measured via the Jensen-Shannon distance), and clustering the prototypes to compress the update messages used in the gossip protocol. We also use parallel instead of sequential gossiping, and present an analysis of its age of information (AoI). Our experimental results show that, with these improvements, the communication load can be substantially reduced without decreasing the convergence rate of the learning algorithm.

Comment: 15 pages, 2 tables, 7 figures, 6 algorithms
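The abstract's first compression idea, gating gossip updates on the Jensen-Shannon distance between the current and last-sent prototype distributions, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the function names, the histogram representation of prototypes, and the `threshold` value are all assumptions for the example.

```python
import numpy as np

def js_distance(p, q, eps=1e-12):
    """Jensen-Shannon distance (square root of the JS divergence, log base 2,
    so the result lies in [0, 1]). Inputs are unnormalized histograms."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)  # mixture distribution

    def kl(a, b):
        # KL divergence in bits, restricted to the support of a
        mask = a > 0
        return np.sum(a[mask] * np.log2(a[mask] / (b[mask] + eps)))

    jsd = 0.5 * kl(p, m) + 0.5 * kl(q, m)
    return float(np.sqrt(max(jsd, 0.0)))  # clamp tiny negatives from rounding

def should_send_update(local_hist, last_sent_hist, threshold=0.1):
    """Gossip an update only if the local prototype distribution has drifted
    far enough from the last transmitted one (hypothetical gating rule)."""
    return js_distance(local_hist, last_sent_hist) >= threshold
```

Under this rule, a node whose prototypes barely changed stays silent, which is where the communication savings come from; the `threshold` trades message rate against staleness of the information propagated through the gossip protocol.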

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2411.09267
Document Type :
Working Paper