
DQ-SGD: Dynamic Quantization in SGD for Communication-Efficient Distributed Learning

Authors:
Guangfeng Yan
Shao-Lun Huang
Tian Lan
Linqi Song
Source:
2021 IEEE 18th International Conference on Mobile Ad Hoc and Smart Systems (MASS)
Publication Year:
2021
Publisher:
IEEE, 2021

Abstract

Gradient quantization is an emerging technique for reducing communication costs in distributed learning. Existing gradient quantization algorithms often rely on engineering heuristics or empirical observations and lack a systematic approach to dynamically quantizing gradients. This paper addresses this issue by proposing a novel dynamically quantized SGD (DQ-SGD) framework that adjusts the quantization scheme at each gradient descent step by exploring the trade-off between communication cost and convergence error. We derive an upper bound, tight in some cases, on the convergence error for a restricted family of quantization schemes and loss functions. We design our DQ-SGD algorithm by minimizing the communication cost under convergence error constraints. Finally, through extensive experiments on large-scale natural language processing and computer vision tasks on the AG-News, CIFAR-10, and CIFAR-100 datasets, we demonstrate that our quantization scheme achieves a better trade-off between communication cost and learning performance than other state-of-the-art gradient quantization methods.

Comment: 10 pages, 7 figures
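The abstract does not reproduce the algorithm itself. As a rough illustration of the idea of per-step dynamic gradient quantization, the sketch below combines standard unbiased stochastic uniform quantization of a gradient vector with a hypothetical schedule for the number of quantization levels; the schedule function `dynamic_num_levels` is an assumption for illustration only and is not the rule derived from the convergence error bound in the paper.

```python
import numpy as np

def stochastic_quantize(grad, num_levels):
    """Unbiased stochastic uniform quantization of a gradient vector.

    Each coordinate is scaled by the gradient norm and randomly rounded
    to one of `num_levels` uniform levels, with probabilities chosen so
    the quantized value is an unbiased estimate of the original coordinate.
    """
    norm = np.linalg.norm(grad)
    if norm == 0:
        return np.zeros_like(grad)
    scaled = np.abs(grad) / norm * num_levels          # values in [0, num_levels]
    lower = np.floor(scaled)
    prob_up = scaled - lower                           # round up with this probability
    levels = lower + (np.random.rand(*grad.shape) < prob_up)
    return np.sign(grad) * norm * levels / num_levels

def dynamic_num_levels(step, total_steps, min_levels=2, max_levels=16):
    """Hypothetical schedule: coarser quantization early, finer later.

    Placeholder for a dynamic adjustment rule; DQ-SGD instead chooses the
    scheme by minimizing communication cost under a convergence error
    constraint, which is not reproduced here.
    """
    frac = step / max(1, total_steps - 1)
    return int(round(min_levels + frac * (max_levels - min_levels)))

# Example: a worker quantizes its gradient before communicating it.
grad = np.random.randn(1000)
for t in range(5):
    s = dynamic_num_levels(t, 5)
    q = stochastic_quantize(grad, s)
    print(f"step {t}: levels={s}, quantization error={np.linalg.norm(q - grad):.3f}")
```

Using fewer levels reduces the number of bits sent per coordinate but increases the quantization error, which is the communication/convergence trade-off the paper formalizes.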

Details

Database:
OpenAIRE
Journal:
2021 IEEE 18th International Conference on Mobile Ad Hoc and Smart Systems (MASS)
Accession number:
edsair.doi.dedup.....98e7c68365946a48fbc86517d34be491