
Fundamental Limits of Decentralized Data Shuffling.

Authors :
Wan, Kai
Tuninetti, Daniela
Ji, Mingyue
Caire, Giuseppe
Piantanida, Pablo
Source :
IEEE Transactions on Information Theory. Jun2020, Vol. 66 Issue 6, p3616-3637. 22p.
Publication Year :
2020

Abstract

Data shuffling of training data among different computing nodes (workers) has been identified as a core element in improving the statistical performance of modern large-scale machine learning algorithms. Data shuffling is often considered one of the most significant bottlenecks in such systems due to its heavy communication load. Under a master-worker architecture (where a master has access to the entire dataset and only communication between the master and the workers is allowed), coding has recently been shown to considerably reduce the communication load. This work considers a different communication paradigm, referred to as decentralized data shuffling, in which workers are allowed to communicate with one another via a shared link. The decentralized data shuffling problem has two phases: workers communicate with each other during the data shuffling phase, and then update their stored content during the storage phase. The main challenge is to derive novel converse bounds and achievable schemes for decentralized data shuffling that account for the asymmetry of the workers' storage (i.e., workers are constrained to store different files based on the problem setting), in order to characterize the fundamental limits of this problem. For the case of uncoded storage (i.e., each worker directly stores a subset of the bits of the dataset), this paper proposes converse and achievable bounds (based on distributed interference alignment and distributed clique-covering strategies) that are within a factor of 3/2 of one another. The proposed schemes are also exactly optimal under the constraint of uncoded storage for either large storage size or at most four workers in the system. [ABSTRACT FROM AUTHOR]
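The clique-covering idea mentioned in the abstract can be illustrated with a minimal toy sketch (not the paper's actual scheme, and with hypothetical file contents): when one worker stores the files that two other workers each need, a single XOR broadcast over the shared link serves both receivers at once, halving the communication load relative to sending each file separately.

```python
# Toy illustration of coded delivery via a single XOR broadcast.
# Assumed setup (not from the paper): worker 1 stores file A and needs B,
# worker 2 stores file B and needs A, worker 3 stores both A and B.

def xor_bytes(x: bytes, y: bytes) -> bytes:
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(a ^ b for a, b in zip(x, y))

# Equal-length file "units" (hypothetical contents).
file_A = b"unit-A-data"
file_B = b"unit-B-data"

# Worker 3 broadcasts a single coded message on the shared link.
coded = xor_bytes(file_A, file_B)

# Each receiver cancels the file it already stores.
recovered_B = xor_bytes(coded, file_A)  # decoded at worker 1
recovered_A = xor_bytes(coded, file_B)  # decoded at worker 2

assert recovered_B == file_B
assert recovered_A == file_A
```

One transmission thus replaces two uncoded ones; the schemes in the paper generalize this by covering the workers' demands with such coded "cliques" under asymmetric storage constraints.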

Subjects

Machine learning
Data

Details

Language :
English
ISSN :
00189448
Volume :
66
Issue :
6
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession Number :
143457081
Full Text :
https://doi.org/10.1109/TIT.2020.2966197