
On Gradient Coding With Partial Recovery

Authors :
Sahasrajit Sarmasarkar
V. Lalitha
Nikhil Karamchandani
Source :
ISIT
Publication Year :
2023
Publisher :
Institute of Electrical and Electronics Engineers (IEEE), 2023.

Abstract

We consider a generalization of the recently proposed gradient coding framework, where a large dataset is divided across $n$ workers and each worker transmits to a master node one or more linear combinations of the gradients over the data subsets assigned to it. Unlike the conventional framework, which requires the master node to recover the sum of the gradients over all the data subsets in the presence of $s$ straggler workers, we relax the goal of the master node to computing the sum of at least some $\alpha$ fraction of the gradients. The broad goal of our work is to study the optimal computation and communication load per worker for this approximate gradient coding framework. We begin by deriving a lower bound on the computation load of any feasible scheme and propose a strategy which achieves this lower bound, albeit at the cost of a high communication load and a number of data partitions that can be polynomial in the number of workers $n$. We then restrict attention to schemes which utilize a number of data partitions equal to $n$ and propose schemes based on cyclic assignment which have a lower communication load. For the case where each worker transmits a single linear combination, we also prove lower bounds on the computation load of any scheme using $n$ data partitions. A full version of this paper is accessible at: https://arxiv.org/abs/2102.10163
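The setting described in the abstract can be illustrated with a minimal sketch. The code below is not one of the paper's coded schemes; it is a toy simulation, under assumed parameters, of (i) the cyclic data-partition assignment the abstract mentions, where worker $i$ holds partitions $i, i+1, \ldots, i+d-1 \pmod n$, and (ii) the simplest partial-recovery baseline, in which each worker holds a single partition (computation load 1) and the master sums whatever arrives from the $n-s$ non-stragglers, recovering an $(n-s)/n$ fraction of the gradient sum. All names and parameters (`cyclic_assignment`, `simulate_partial_recovery`, `n`, `s`, `d`, `seed`) are hypothetical choices for this sketch.

```python
import random


def cyclic_assignment(n, d):
    """Cyclic assignment of n data partitions to n workers (toy sketch).

    Worker i is assigned the d partitions i, i+1, ..., i+d-1 (mod n),
    so every partition is replicated at exactly d workers, i.e. the
    per-worker computation load is d.
    """
    return [[(i + j) % n for j in range(d)] for i in range(n)]


def simulate_partial_recovery(n=10, s=3, seed=0):
    """Toy partial-recovery baseline (not the paper's coded schemes).

    Each of n workers holds one partition (computation load 1) and
    transmits the scalar gradient over its own partition. With s
    stragglers, the master sums the n - s messages it receives,
    recovering the sum over an (n - s)/n fraction of the partitions;
    this meets the alpha-fraction goal whenever s <= (1 - alpha) * n.
    """
    rng = random.Random(seed)
    gradients = [rng.gauss(0.0, 1.0) for _ in range(n)]  # one gradient per partition
    stragglers = set(rng.sample(range(n), s))            # workers that never respond

    received = [gradients[w] for w in range(n) if w not in stragglers]
    partial_sum = sum(received)
    recovered_fraction = len(received) / n
    return partial_sum, recovered_fraction
```

For example, `cyclic_assignment(5, 2)` assigns worker 0 the partitions `[0, 1]`, worker 4 the partitions `[4, 0]`, and so on, which is the replication pattern underlying the cyclic schemes the abstract refers to; the paper's contribution lies in designing the linear combinations transmitted on top of such assignments.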

Details

ISSN :
15580857 and 00906778
Volume :
71
Database :
OpenAIRE
Journal :
IEEE Transactions on Communications
Accession number :
edsair.doi.dedup.....d7a45a57ab780f8f43778040344848c3