Online Gradient Descent Learning Algorithms
- Source :
- Foundations of Computational Mathematics, 8:561-596
- Publication Year :
- 2007
- Publisher :
- Springer Science and Business Media LLC, 2007.
-
Abstract
- This paper considers the least-square online gradient descent algorithm in a reproducing kernel Hilbert space (RKHS) without an explicit regularization term. We present a novel capacity-independent approach to deriving error bounds and convergence results for this algorithm. The essential element in our analysis is the interplay between the generalization error and a weighted cumulative error which we define in the paper. We show that, although the algorithm does not involve an explicit RKHS regularization term, choosing the step sizes appropriately yields error rates competitive with those in the literature.
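- The update analyzed in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the iterate is kept as a kernel expansion f_t = Σ_i c_i K(x_i, ·), each sample triggers the unregularized update f_{t+1} = f_t − η_t (f_t(x_t) − y_t) K(x_t, ·), and the Gaussian kernel, its bandwidth, the step-size schedule η_t = η₁ t^(−θ), and the toy regression target are all assumptions chosen for demonstration.

```python
import numpy as np

def gaussian_kernel(x, z, sigma=0.1):
    # RBF kernel K(x, z) = exp(-(x - z)^2 / (2 sigma^2)); bandwidth is illustrative
    return np.exp(-((x - z) ** 2) / (2 * sigma ** 2))

def online_gradient_descent(samples, eta1=0.5, theta=0.5, sigma=0.1):
    """Least-square online gradient descent in an RKHS, no regularization term.

    Update: f_{t+1} = f_t - eta_t * (f_t(x_t) - y_t) * K(x_t, .),
    with polynomially decaying step sizes eta_t = eta1 * t**(-theta).
    The iterate f_t is stored as a kernel expansion sum_i c_i K(x_i, .).
    """
    xs, coeffs = [], []

    def f(x):
        # Evaluate the current iterate at x via its kernel expansion.
        return sum(c * gaussian_kernel(xi, x, sigma) for c, xi in zip(coeffs, xs))

    for t, (x, y) in enumerate(samples, start=1):
        eta_t = eta1 * t ** (-theta)
        residual = f(x) - y          # f_t(x_t) - y_t
        xs.append(x)
        coeffs.append(-eta_t * residual)  # coefficient of the new kernel term
    return xs, coeffs, f

# Toy regression (assumed for illustration): noisy samples of sin(2*pi*x).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=200)
Y = np.sin(2 * np.pi * X) + 0.1 * rng.normal(size=200)
xs, coeffs, f = online_gradient_descent(zip(X, Y))
```

The step-size exponent θ governs the trade-off the paper studies: taking η_t too large prevents the stochastic updates from settling, while taking it too small stops learning early, and the paper shows suitable decay schedules recover competitive rates even with no regularization term.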
- Subjects :
- Early stopping
Applied Mathematics
Numerical analysis
Online machine learning
Regularization perspectives on support vector machines
Regularization (mathematics)
Support vector machine
Computational Mathematics
Computational Theory and Mathematics
Gradient descent
Algorithm
Analysis
Reproducing kernel Hilbert space
Mathematics
Details
- ISSN :
- 1615-3383 and 1615-3375
- Volume :
- 8
- Database :
- OpenAIRE
- Journal :
- Foundations of Computational Mathematics
- Accession number :
- edsair.doi...........7bd929074abc4741ec6664cdf75a2580
- Full Text :
- https://doi.org/10.1007/s10208-006-0237-y