Stochastic Variance Reduced Gradient Method Embedded with Positive Defined Stabilized Barzilai-Borwein.
- Author
- Shi, Weijuan; Shuib, Adibah; Alwadood, Zuraida
- Subjects
- *COGNITIVE computing; *COGNITIVE learning; *PROBLEM solving; *FRACTIONAL programming; *MACHINE learning; *ALGORITHMS
- Abstract
- Machine learning (ML) is evolving rapidly, has made many theoretical breakthroughs, and is widely applied in various fields. ML gives systems the ability to access data and use it, enabling computers to execute cognitive processes such as learning and improving from previous experience and solving complicated problems. Many first-order stochastic optimization methods have been used to solve the optimization models of ML. These algorithms adopt the Barzilai-Borwein (BB) step size instead of a fixed or diminishing step size to improve performance. However, the BB step size involves a fractional calculation whose denominator can approach zero, especially when the objective function is non-convex. The BB technique breaks down if the denominator is near zero or even negative. To improve the computation of the step size, a Positive Defined Stabilized Barzilai-Borwein (PDSBB) approach is introduced in this paper. Integrating PDSBB with the stochastic variance reduced gradient (SVRG) approach, a new method, SVRG-PDSBB, is proposed. Numerical experiments show that the new step size stabilizes the algorithm's performance, successfully avoiding zero denominators and effectively solving common problems in machine learning. The convergence of SVRG-PDSBB is proven theoretically and verified numerically, and the effectiveness of the new algorithm is shown by comparison with other algorithms. [ABSTRACT FROM AUTHOR]
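The abstract does not spell out the BB formula or the PDSBB correction, but the failure mode it describes is easy to illustrate. Below is a minimal sketch, assuming the classical BB1 step size eta_k = ||s_{k-1}||^2 / (s_{k-1}^T y_{k-1}) recomputed at each SVRG snapshot; the stabilization shown (taking |s^T y| and clipping it below by a small eps) is an illustrative stand-in for the paper's positive-definiteness correction, not the actual PDSBB rule.

```python
# Sketch only: SVRG with a Barzilai-Borwein step size whose denominator
# is stabilized by an absolute value and a floor, so a near-zero or
# negative s^T y (possible for non-convex objectives) cannot blow up
# the step. The stabilization is an assumption, not the paper's PDSBB.
import numpy as np

def svrg_bb_stabilized(grad_full, grad_i, w0, n, m=None, eta0=0.1,
                       epochs=20, eps=1e-10, rng=None):
    """grad_full(w): full gradient; grad_i(w, i): gradient of sample i;
    n: number of samples; m: inner-loop length (2n by SVRG convention)."""
    rng = rng or np.random.default_rng(0)
    m = m or 2 * n
    w_tilde, w_prev, g_prev = w0.copy(), None, None
    eta = eta0                          # fixed step until BB data exists
    for _ in range(epochs):
        g_tilde = grad_full(w_tilde)    # full gradient at the snapshot
        if w_prev is not None:
            s = w_tilde - w_prev        # iterate difference s_{k-1}
            y = g_tilde - g_prev        # gradient difference y_{k-1}
            # Classical BB1 step ||s||^2 / (s^T y); for non-convex f the
            # denominator may be ~0 or negative, so clip |s^T y| below.
            eta = s.dot(s) / max(abs(s.dot(y)), eps) / m
        w_prev, g_prev = w_tilde.copy(), g_tilde.copy()
        w = w_tilde.copy()
        for _ in range(m):              # SVRG variance-reduced inner loop
            i = rng.integers(n)
            v = grad_i(w, i) - grad_i(w_tilde, i) + g_tilde
            w -= eta * v
        w_tilde = w                     # use last inner iterate as snapshot
    return w_tilde
```

For strongly convex objectives the raw denominator s^T y stays positive and the clipping never fires; in the non-convex case it keeps the step size finite and positive, which is the practical problem the PDSBB construction is designed to solve.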
- Published
- 2023