
Fast Asynchronous Parallel Stochastic Gradient Descent

Authors :
Zhao, Shen-Yi
Li, Wu-Jun
Publication Year :
2015

Abstract

Stochastic gradient descent (SGD) and its variants have become more and more popular in machine learning due to their efficiency and effectiveness. To handle large-scale problems, researchers have recently proposed several parallel SGD methods for multicore systems. However, existing parallel SGD methods cannot achieve satisfactory performance in real applications. In this paper, we propose a fast asynchronous parallel SGD method, called AsySVRG, by designing an asynchronous strategy to parallelize the recently proposed SGD variant called stochastic variance reduced gradient (SVRG). Both theoretical and empirical results show that AsySVRG can outperform existing state-of-the-art parallel SGD methods like Hogwild! in terms of convergence rate and computation cost.
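The abstract describes AsySVRG as an asynchronous, multicore parallelization of SVRG; the paper's actual lock-free scheme is not reproduced here. As a point of reference, the following is a minimal sequential sketch of the underlying SVRG estimator that such a method parallelizes, using a hypothetical least-squares objective and NumPy (the function name `svrg`, the step size, and the epoch/inner-loop lengths are illustrative assumptions, not the paper's settings):

```python
import numpy as np

def svrg(X, y, epochs=20, inner_steps=None, eta=0.01, seed=0):
    """Sequential SVRG sketch for least-squares regression.

    Minimizes (1/2n) * ||X w - y||^2 using the variance-reduced
    estimator grad_i(w) - grad_i(w_snap) + mu, where mu is the full
    gradient at the snapshot w_snap (recomputed once per epoch).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    inner_steps = inner_steps or 2 * n
    w_snap = np.zeros(d)

    def grad_i(w, i):
        # Gradient of the i-th squared-loss term (1/2)(x_i·w - y_i)^2.
        return (X[i] @ w - y[i]) * X[i]

    for _ in range(epochs):
        # Full gradient at the snapshot.
        mu = X.T @ (X @ w_snap - y) / n
        w = w_snap.copy()
        for _ in range(inner_steps):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient step.
            v = grad_i(w, i) - grad_i(w_snap, i) + mu
            w -= eta * v
        w_snap = w  # new snapshot for the next epoch
    return w_snap

if __name__ == "__main__":
    # Synthetic data for a quick sanity check.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 10))
    w_true = rng.standard_normal(10)
    y = X @ w_true + 0.01 * rng.standard_normal(500)
    w_hat = svrg(X, y)
    print("parameter error:", np.linalg.norm(w_hat - w_true))
```

AsySVRG, per the abstract, runs updates of this kind asynchronously across threads (in the spirit of Hogwild!-style lock-free SGD) rather than in the single sequential loop shown above.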

Details

Database :
arXiv
Publication Type :
Report
Accession Number :
edsarx.1508.05711
Document Type :
Working Paper