
Lion Cub: Minimizing Communication Overhead in Distributed Lion

Authors:
Ishikawa, Satoki
Ben-Nun, Tal
Van Essen, Brian
Yokota, Rio
Dryden, Nikoli
Publication Year:
2024

Abstract

Communication overhead is a key challenge in distributed deep learning, especially on slower Ethernet interconnects, and given current hardware trends, communication is likely to become a major bottleneck. While gradient compression techniques have been explored for SGD and Adam, the Lion optimizer has the distinct advantage that its update vectors are the output of a sign operation, enabling straightforward quantization. However, simply compressing updates for communication and using techniques like majority voting fails to lead to end-to-end speedups due to inefficient communication algorithms and reduced convergence. We analyze three factors critical to distributed learning with Lion: optimizing communication methods, identifying effective quantization methods, and assessing the necessity of momentum synchronization. Our findings show that quantization techniques adapted to Lion and selective momentum synchronization can significantly reduce communication costs while maintaining convergence. We combine these into Lion Cub, which enables up to 5x speedups in end-to-end training compared to Lion. This highlights Lion's potential as a communication-efficient solution for distributed training.
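The mechanism the abstract relies on is that Lion's per-worker update is the output of a sign operation, so it can be communicated at one bit per parameter and aggregated by majority vote across workers. The following is a minimal Python sketch of that idea only, not the paper's Lion Cub implementation; the function names, hyperparameters, and single-process "worker" setup are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of sign-based Lion updates
# combined by majority voting across workers.
import numpy as np

def lion_local_update(grad, momentum, beta1=0.9, beta2=0.99):
    """One Lion step on a single worker; the update direction is a sign vector."""
    update = np.sign(beta1 * momentum + (1.0 - beta1) * grad)  # entries in {-1, 0, +1}
    momentum[:] = beta2 * momentum + (1.0 - beta2) * grad      # local momentum update
    return update

def majority_vote(sign_updates):
    """Aggregate per-worker sign vectors: sign of the elementwise sum,
    i.e. the majority vote across workers."""
    return np.sign(np.sum(sign_updates, axis=0))

# Toy usage: 4 simulated workers, 5 parameters.
rng = np.random.default_rng(0)
grads = [rng.standard_normal(5) for _ in range(4)]
momenta = [np.zeros(5) for _ in range(4)]
local_signs = [lion_local_update(g, m) for g, m in zip(grads, momenta)]
print(majority_vote(local_signs))  # only 1 bit per entry needs to be communicated
```

In a real distributed setting the elementwise sum would be computed with a collective operation over compressed sign vectors; the abstract's point is that doing this naively does not yield end-to-end speedups without the communication, quantization, and momentum-synchronization choices the paper analyzes.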

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2411.16462
Document Type:
Working Paper