Accelerating Distributed Learning in Non-Dedicated Environments
- Source :
- IEEE Transactions on Cloud Computing. 11:515-531
- Publication Year :
- 2023
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2023.
Abstract
- Machine learning (ML) models are increasingly trained with distributed workers possessing heterogeneous resources. In such scenarios, model training efficiency can be degraded by stragglers, workers that run much slower than others. Efficient model training requires eliminating such stragglers, yet for modern ML workloads, existing load balancing strategies are inefficient and even infeasible. In this paper, we propose a novel strategy, called semi-dynamic load balancing, to eliminate stragglers in distributed ML workloads. The key insight is that ML workers should be load-balanced at iteration boundaries, remaining non-intrusive to intra-iteration execution. Based on this insight, we further develop LB-BSP, an integrated worker coordination mechanism that adapts each worker's load to its instantaneous processing capability by right-sizing the sample batches at the synchronization barriers. We have designed distinct load tuning algorithms for ML in CPU clusters, GPU clusters, and federated learning setups, based on their respective characteristics. LB-BSP has been implemented as a Python module for ML frameworks like TensorFlow and PyTorch. Our EC2 deployment confirms that LB-BSP is practical, effective, and lightweight, and can accelerate distributed training by up to 54%.
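- The batch right-sizing described in the abstract can be illustrated with a minimal sketch. This is not the LB-BSP implementation; the function name, the proportional-to-throughput rule, and the parameters below are illustrative assumptions about how load might be adapted at an iteration boundary.

```python
# Minimal sketch of semi-dynamic load balancing at an iteration boundary.
# All names and the exact sizing rule are illustrative, not taken from LB-BSP.

def rebalance_batches(per_worker_throughput, total_batch_size):
    """Assign each worker a batch size proportional to its measured throughput
    (samples processed per second in the previous iteration), so that all
    workers are expected to reach the next synchronization barrier together."""
    total_throughput = sum(per_worker_throughput)
    batches = [
        max(1, round(total_batch_size * t / total_throughput))
        for t in per_worker_throughput
    ]
    # Correct rounding drift so the global batch size is preserved.
    drift = total_batch_size - sum(batches)
    batches[batches.index(max(batches))] += drift
    return batches

# Example: a straggler (10 samples/s) is assigned a much smaller batch
# than the two faster workers, shrinking the per-iteration wait time.
print(rebalance_batches([100.0, 90.0, 10.0], total_batch_size=256))
```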
- Subjects :
- Computer Networks and Communications
Computer science
Distributed computing
Cloud computing
Sample (statistics)
Python (programming language)
Load balancing (computing)
Computer Science Applications
Load management
Hardware and Architecture
Software deployment
Synchronization (computer science)
Key (cryptography)
Software
Information Systems
Details
- ISSN :
- 2372-0018
- Volume :
- 11
- Database :
- OpenAIRE
- Journal :
- IEEE Transactions on Cloud Computing
- Accession number :
- edsair.doi...........9cdb9a572cef23fdda3d490e6d6ada56