
The regularized stochastic Nesterov's accelerated Quasi-Newton method with applications.

Authors :
Makmuang, Dawrawee
Suppalap, Siwakon
Wangkeeree, Rabian
Source :
Journal of Computational & Applied Mathematics. Aug 2023, Vol. 428.
Publication Year :
2023

Abstract

The stochastic Broyden–Fletcher–Goldfarb–Shanno (BFGS) method has been used effectively to solve strongly convex optimization problems. However, it frequently encounters near-singularity of the Hessian approximation, and reaching the optimal solution can require a long convergence time. In this paper, we present a regularized stochastic Nesterov's accelerated quasi-Newton method that combines Nesterov acceleration with a novel momentum coefficient to speed up convergence and avoid near-singularity of the Hessian update in the stochastic BFGS method. Moreover, we show almost sure convergence of a subsequence of the generated iterates to an optimal solution of the strongly convex optimization problem. We evaluated our approach on real-world datasets. The experimental results confirmed the effectiveness and superiority of the proposed method over competing methods in solving classification problems.

• A novel quasi-Newton algorithm for solving strongly convex optimization problems.
• A novel momentum coefficient is introduced to accelerate the convergence speed.
• Classification problems can be solved using the proposed method.
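
To make the abstract's ideas concrete, below is a minimal Python sketch of a stochastic BFGS step with a Nesterov lookahead and a regularized curvature pair. It is an illustrative reconstruction under stated assumptions, not the authors' algorithm: the momentum coefficient beta is fixed here (the paper derives a novel adaptive coefficient), the regularization y + delta*s is one standard way to keep the curvature pair positive and the inverse-Hessian estimate away from singularity, and the logistic-loss data are synthetic.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary classification data (hypothetical stand-in for the
# real-world datasets used in the paper).
n, d = 500, 10
X = rng.standard_normal((n, d))
y = np.sign(X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n))

def stoch_grad(w, batch):
    # Mini-batch gradient of an L2-regularized logistic loss; the L2
    # term makes the objective strongly convex, matching the paper's
    # problem class.
    Xb, yb = X[batch], y[batch]
    z = yb * (Xb @ w)
    return -(Xb.T @ (yb / (1.0 + np.exp(z)))) / len(batch) + 1e-3 * w

def rsn_bfgs(w0, iters=200, alpha=0.5, beta=0.8, delta=1e-2, bs=32):
    # Sketch of a regularized stochastic BFGS iteration with a Nesterov
    # lookahead; beta is a fixed momentum coefficient, standing in for
    # the paper's novel adaptive coefficient.
    w, w_prev = w0.copy(), w0.copy()
    H = np.eye(len(w0))                       # inverse-Hessian estimate
    I = np.eye(len(w0))
    for _ in range(iters):
        batch = rng.choice(n, bs, replace=False)
        v = w + beta * (w - w_prev)           # Nesterov lookahead point
        g_v = stoch_grad(v, batch)
        w_new = v - alpha * (H @ g_v)
        # Curvature pair on the same mini-batch; adding delta*s keeps
        # y_r.s >= delta*||s||^2 > 0, steering H away from singularity.
        s = w_new - v
        y_r = stoch_grad(w_new, batch) - g_v + delta * s
        rho = 1.0 / (y_r @ s)
        if np.isfinite(rho) and rho > 0:      # standard BFGS inverse update
            H = (I - rho * np.outer(s, y_r)) @ H @ (I - rho * np.outer(y_r, s)) \
                + rho * np.outer(s, s)
        w_prev, w = w, w_new
    return w

w_hat = rsn_bfgs(np.zeros(d))
print("training accuracy:", np.mean(np.sign(X @ w_hat) == y))

Under these assumptions the sketch recovers a high training accuracy on the synthetic problem; the paper's reported results on real datasets are not reproduced here.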

Details

Language :
English
ISSN :
0377-0427
Volume :
428
Database :
Academic Search Index
Journal :
Journal of Computational & Applied Mathematics
Publication Type :
Academic Journal
Accession number :
162937110
Full Text :
https://doi.org/10.1016/j.cam.2023.115190