
SOFIM: Stochastic Optimization Using Regularized Fisher Information Matrix

Authors:
Sen, Mrinmay
Qin, A. K.
C, Gayathri
N, Raghu Kishore
Chen, Yen-Wei
Raman, Balasubramanian
Publication Year: 2024

Abstract

This paper introduces SOFIM, a new stochastic optimization method based on the regularized Fisher information matrix (FIM), which efficiently uses the FIM to approximate the Hessian matrix for finding Newton's gradient update in large-scale stochastic optimization of machine learning models. It can be viewed as a variant of natural gradient descent in which the challenge of storing and calculating the full FIM is addressed by using the regularized FIM and computing the gradient update direction directly via Sherman-Morrison matrix inversion. In addition, like the popular Adam method, SOFIM uses the first moment of the gradient to handle non-stationary objectives across mini-batches caused by heterogeneous data. The use of the regularized FIM and Sherman-Morrison matrix inversion yields an improved convergence rate with the same space and time complexity as stochastic gradient descent (SGD) with momentum. Extensive experiments on training deep learning models on several benchmark image classification datasets demonstrate that the proposed SOFIM outperforms SGD with momentum and several state-of-the-art Newton-type optimization methods in terms of the convergence speed for achieving pre-specified objectives of training and test losses as well as test accuracy.
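
The abstract describes three key mechanics: a regularized approximation of the FIM standing in for the Hessian, a Sherman-Morrison closed-form inversion so no full matrix is ever stored or inverted, and an Adam-style first moment of the gradient. The sketch below illustrates what one such update step could look like; it is not the paper's exact algorithm. The function name sofim_style_step, the rank-one approximation F ≈ ρI + m mᵀ built from the momentum-smoothed gradient m, and the hyperparameters lr, rho, and beta are illustrative assumptions.

```python
import numpy as np

def sofim_style_step(w, grad, m, lr=0.1, rho=1.0, beta=0.9):
    """One illustrative regularized-FIM update step (sketch, not the paper's code).

    Assumed approximation: F ~ rho*I + m m^T, where m is the first moment of the
    gradient. Sherman-Morrison then gives the closed form
        (rho*I + m m^T)^{-1} m = m / (rho + ||m||^2),
    so the Newton-like direction costs O(d), matching SGD with momentum.
    """
    # Adam-style first moment to smooth non-stationary mini-batch gradients.
    m = beta * m + (1.0 - beta) * grad
    # Newton-like direction via the Sherman-Morrison closed form; no d x d
    # matrix is ever formed or inverted.
    direction = m / (rho + m @ m)
    w = w - lr * direction
    return w, m

# Illustrative usage on a toy quadratic objective f(w) = 0.5 * ||w||^2.
w = np.array([1.0, -2.0, 3.0])
m = np.zeros_like(w)
for _ in range(100):
    grad = w                      # gradient of the toy objective
    w, m = sofim_style_step(w, grad, m)
```

Under these assumptions, the per-step space and time cost is linear in the number of parameters, which is consistent with the abstract's claim of SGD-with-momentum complexity.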

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2403.02833
Document Type: Working Paper