
Momentum Tracking: Momentum Acceleration for Decentralized Deep Learning on Heterogeneous Data

Authors:
Takezawa, Yuki
Bao, Han
Niwa, Kenta
Sato, Ryoma
Yamada, Makoto
Publication Year:
2022

Abstract

SGD with momentum is one of the key components for improving the performance of neural networks. For decentralized learning, a straightforward approach using momentum is Distributed SGD (DSGD) with momentum (DSGDm). However, DSGDm performs worse than DSGD when the data distributions are statistically heterogeneous. Recently, several studies have addressed this issue and proposed momentum-based methods that are more robust to data heterogeneity than DSGDm, although their convergence rates still depend on data heterogeneity and deteriorate when the data distributions are heterogeneous. In this study, we propose Momentum Tracking, a momentum-based method whose convergence rate is provably independent of data heterogeneity. More specifically, we analyze the convergence rate of Momentum Tracking in the setting where the objective function is non-convex and stochastic gradients are used, and show that it is independent of data heterogeneity for any momentum coefficient $\beta \in [0, 1)$. Through experiments, we demonstrate that Momentum Tracking is more robust to data heterogeneity than existing decentralized learning methods with momentum and consistently outperforms them when the data distributions are heterogeneous.

Comment: Transactions on Machine Learning Research 2023
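To make the idea concrete, the following is a minimal toy sketch of decentralized optimization that combines gradient tracking with heavy-ball momentum, in the spirit of the method described above. The toy objective, the mixing matrix, the update order, and all parameter values are illustrative assumptions for demonstration, not the authors' reference implementation. Each node minimizes a local quadratic, a gossip (mixing) step averages neighbor parameters, and a tracker variable estimates the global gradient so that the momentum buffer is built from a heterogeneity-corrected direction.

```python
import numpy as np

# Illustrative sketch only: gradient tracking plus heavy-ball momentum.
# The toy problem and all hyperparameters are assumptions, not the
# paper's reference implementation.

n = 3                               # number of nodes
a = np.array([1.0, 2.0, 3.0])       # local curvatures (heterogeneous data)
b = np.array([1.0, -1.0, 2.0])      # local minimizers
x_star = (a * b).sum() / a.sum()    # minimizer of the global sum of f_i

# Doubly stochastic mixing matrix (fully connected, self-weight 0.5).
W = np.full((n, n), 0.25) + 0.25 * np.eye(n)

def grad(x):
    """Local gradients of f_i(x_i) = 0.5 * a_i * (x_i - b_i)^2."""
    return a * (x - b)

eta, beta, T = 0.05, 0.8, 3000      # step size, momentum, iterations
x = np.zeros(n)                     # local parameters, one per node
g = grad(x)                         # last evaluated local gradients
y = g.copy()                        # tracker: mean(y) equals mean gradient
v = np.zeros(n)                     # momentum buffer on the tracker

for _ in range(T):
    v = beta * v + y                # momentum on the tracked gradient
    x = W @ x - eta * v             # gossip step plus momentum update
    g_new = grad(x)
    y = W @ y + g_new - g           # tracker: mix, then gradient correction
    g = g_new
```

Because the tracker `y` estimates the average gradient across nodes rather than each node's local gradient, the momentum buffer does not drift toward the local minimizers, which is the intuition for robustness to data heterogeneity; after the loop, every `x[i]` is close to `x_star` despite each node holding a different local objective.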

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2209.15505
Document Type:
Working Paper