Heterogeneity-Aware Gradient Coding for Tolerating and Leveraging Stragglers.

Authors :
Wang, Haozhao
Guo, Song
Tang, Bin
Li, Ruixuan
Yang, Yutong
Qu, Zhihao
Wang, Yi
Source :
IEEE Transactions on Computers; Apr 2022, Vol. 71 Issue 4, p779-794, 16p
Publication Year :
2022

Abstract

Distributed gradient descent has been widely adopted in machine learning because it brings considerable computing resources to bear on huge volumes of data: the gradient over the whole dataset is computed cooperatively by multiple workers. However, its performance can be severely degraded by slow workers, known as stragglers. Recently, coding-based approaches have been introduced to mitigate the straggler problem, but they can hardly cope with heterogeneity among workers. Moreover, they always discard the results of stragglers, wasting substantial resources. In this article, we first investigate how to tolerate stragglers by discarding their results, and then seek to leverage them instead. For straggler tolerance, we propose a heterogeneity-aware coding scheme that encodes gradients adaptively to the computing capability of each worker, and we show that this scheme is theoretically optimal for straggler tolerance. Building on this scheme, we further propose an algorithm called DHeter-aware that exploits the gradients of stragglers, which we call delayed gradients. Moreover, our theoretical analysis shows that DHeter-aware achieves the same convergence rate as gradient descent without delayed gradients. Experiments on various tasks and clusters demonstrate that our coding scheme outperforms state-of-the-art methods, and that DHeter-aware further accelerates it, achieving 25 percent time savings. [ABSTRACT FROM AUTHOR]
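For readers unfamiliar with gradient coding, the following minimal Python sketch illustrates the basic idea the paper builds on. It is not the authors' heterogeneity-aware scheme; it uses the well-known 3-worker, 1-straggler cyclic code from Tandon et al.'s gradient coding work, where each worker sends a fixed linear combination of partition gradients and the master recovers the full gradient from any two responses. All names and values are illustrative.

import numpy as np

# Toy setup: 3 data partitions, each with its own (2-dimensional) gradient.
g = np.array([[1.0, 2.0],    # gradient of partition 1
              [3.0, 4.0],    # gradient of partition 2
              [5.0, 6.0]])   # gradient of partition 3
full_gradient = g.sum(axis=0)  # what the master ultimately needs

# Encoding matrix B (3 workers x 3 partitions): worker i sends B[i] @ g.
# This code tolerates any s = 1 straggler.
B = np.array([[0.5, 1.0,  0.0],
              [0.0, 1.0, -1.0],
              [0.5, 0.0,  1.0]])

def decode(finished, coded):
    """Recover the sum of all gradients from any 2 of the 3 coded results.

    finished: indices of the workers that responded
    coded:    their coded gradients, row i equal to B[finished[i]] @ g
    """
    # Find coefficients a with a @ B[finished] = [1, 1, 1],
    # so that a @ coded = g1 + g2 + g3.
    a, *_ = np.linalg.lstsq(B[finished].T, np.ones(3), rcond=None)
    return a @ coded

# Suppose worker 1 straggles; workers 0 and 2 respond first.
finished = [0, 2]
coded = B[finished] @ g
recovered = decode(finished, coded)
assert np.allclose(recovered, full_gradient)

In this fixed code, every worker gets the same redundancy regardless of its speed, and the straggler's partial work is simply thrown away. Per the abstract, the paper's contribution is to address both points: the heterogeneity-aware scheme sizes each worker's encoded load to its computing capability, and DHeter-aware folds the late (delayed) gradients into subsequent updates rather than discarding them.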

Details

Language :
English
ISSN :
0018-9340
Volume :
71
Issue :
4
Database :
Complementary Index
Journal :
IEEE Transactions on Computers
Publication Type :
Academic Journal
Accession number :
155774183
Full Text :
https://doi.org/10.1109/TC.2021.3063180