Online Learning Under a Separable Stochastic Approximation Framework
- Source :
- IEEE Transactions on Pattern Analysis and Machine Intelligence; February 2025, Vol. 47, Issue 2, pp. 1317-1330 (14 pages)
- Publication Year :
- 2025
Abstract
- We propose an online learning algorithm tailored for a class of machine learning models within a separable stochastic approximation framework. The central idea of our approach is to exploit the inherent separability in many models, recognizing that certain parameters are easier to optimize than others. This paper focuses on models where some parameters exhibit linear characteristics, which are common in machine learning applications. In our proposed algorithm, the linear parameters are updated using the recursive least squares (RLS) algorithm, akin to a stochastic Newton method. Subsequently, based on these updated linear parameters, the nonlinear parameters are adjusted using stochastic gradient descent (SGD). This dual-update mechanism can be viewed as a stochastic approximation variant of block coordinate gradient descent, where one subset of parameters is optimized using a second-order method while the other is handled with a first-order approach. We establish the global convergence of our online algorithm for non-convex cases in terms of the expected violation of first-order optimality conditions. Numerical experiments demonstrate that our method achieves significantly faster initial convergence and more robust performance than other popular learning algorithms. Additionally, our algorithm exhibits reduced sensitivity to learning rates and outperforms the recently proposed slimTrain algorithm (Newman et al. 2022). For validation, the code has been made available on GitHub.
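
The following is a minimal sketch of the dual-update idea described in the abstract: the linear (output) weights are refreshed by recursive least squares, and the nonlinear (hidden) parameters are then adjusted by SGD given those weights. The single-hidden-layer tanh model, the forgetting factor, and all hyperparameter values are illustrative assumptions, not details taken from the paper or its GitHub code.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid, d_out = 4, 16, 1                   # input / hidden / output sizes (assumed)
A = 0.1 * rng.standard_normal((d_hid, d_in))    # nonlinear parameters: hidden weights
b = np.zeros(d_hid)                             # nonlinear parameters: hidden bias
W = np.zeros((d_out, d_hid))                    # linear output weights (RLS-updated)
P = 1e3 * np.eye(d_hid)                         # RLS inverse-covariance estimate
lam, lr = 0.999, 1e-2                           # forgetting factor, SGD step size (assumed)

def features(x):
    """Nonlinear feature map phi(x; A, b)."""
    return np.tanh(A @ x + b)

def step(x, y):
    """One online update from a single sample (x, y)."""
    global W, P, A, b
    phi = features(x)

    # RLS update of the linear weights W (stochastic Newton-like step).
    Pphi = P @ phi
    k = Pphi / (lam + phi @ Pphi)        # gain vector
    err = y - W @ phi                    # prediction error with the old W
    W = W + np.outer(err, k)
    P = (P - np.outer(k, Pphi)) / lam

    # SGD update of the nonlinear parameters (A, b) given the new W,
    # using the gradient of 0.5 * ||W phi(x) - y||^2.
    phi = features(x)
    r = W @ phi - y                      # residual
    grad_z = (W.T @ r) * (1.0 - phi**2)  # backprop through the tanh pre-activation
    A = A - lr * np.outer(grad_z, x)
    b = b - lr * grad_z

# Toy usage: stream samples of a nonlinear target.
for _ in range(2000):
    x = rng.uniform(-1, 1, d_in)
    y = np.array([x.sum() ** 2])
    step(x, y)
```

In this sketch the second-order information is confined to the linear block (via the RLS matrix P), while the nonlinear block only sees first-order gradients, mirroring the block coordinate structure the abstract describes.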
Details
- Language :
- English
- ISSN :
- 0162-8828
- Volume :
- 47
- Issue :
- 2
- Database :
- Supplemental Index
- Journal :
- IEEE Transactions on Pattern Analysis and Machine Intelligence
- Publication Type :
- Periodical
- Accession number :
- ejs68607185
- Full Text :
- https://doi.org/10.1109/TPAMI.2024.3495783