
Stop Wasting My Gradients: Practical SVRG

Authors:
Babanezhad, Reza
Ahmed, Mohamed Osama
Virani, Alim
Schmidt, Mark
Konečný, Jakub
Sallinen, Scott
Publication Year:
2015

Abstract

We present and analyze several strategies for improving the performance of stochastic variance-reduced gradient (SVRG) methods. We first show that the convergence rate of these methods can be preserved under a decreasing sequence of errors in the control variate, and use this to derive variants of SVRG that use growing-batch strategies to reduce the number of gradient calculations required in the early iterations. We further (i) show how to exploit support vectors to reduce the number of gradient computations in the later iterations, (ii) prove that the commonly-used regularized SVRG iteration is justified and improves the convergence rate, (iii) consider alternate mini-batch selection strategies, and (iv) consider the generalization error of the method.
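For context, the abstract's starting point is the basic SVRG iteration, in which a full gradient computed at a periodic "snapshot" point serves as the control variate for each stochastic step. The following is a minimal sketch of that baseline update (not the paper's growing-batch or support-vector variants); the problem setup, step size, and epoch lengths are illustrative assumptions.

```python
import numpy as np

def svrg(grad_i, w0, n, step=0.1, epochs=20, m=None, rng=None):
    """Minimize (1/n) * sum_i f_i(w), given per-example gradients grad_i(w, i).

    Plain SVRG sketch: each epoch recomputes the full gradient at a
    snapshot and uses it as a control variate for the inner updates.
    """
    rng = rng or np.random.default_rng(0)
    m = m or n                      # inner-loop length (assumed m = n here)
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot: the anchor of the control variate.
        mu = sum(grad_i(w_snap, i) for i in range(n)) / n
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient: unbiased, with
            # variance shrinking as w and w_snap approach the optimum.
            g = grad_i(w, i) - grad_i(w_snap, i) + mu
            w -= step * g
    return w

# Tiny synthetic least-squares demo (data assumed for illustration).
rng = np.random.default_rng(1)
n, d = 50, 3
A = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
b = A @ w_true
grad_i = lambda w, i: (A[i] @ w - b[i]) * A[i]
w_hat = svrg(grad_i, np.zeros(d), n, step=0.05, epochs=30)
```

Note that each epoch spends `n` gradient evaluations on the snapshot's full gradient; the paper's growing-batch strategy reduces exactly this cost in early iterations by allowing the control variate to be computed with a decreasing sequence of errors.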

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1511.01942
Document Type:
Working Paper