Convergence guarantees for forward gradient descent in the linear regression model
- Source :
- Journal of Statistical Planning and Inference, Volume 233, 106174, 2024
- Publication Year :
- 2023
Abstract
- Renewed interest in the relationship between artificial and biological neural networks motivates the study of gradient-free methods. Considering the linear regression model with random design, we theoretically analyze in this work the biologically motivated (weight-perturbed) forward gradient scheme, which is based on a random linear combination of the gradient. If d denotes the number of parameters and k the number of samples, we prove that the mean squared error of this method converges for $k\gtrsim d^2\log(d)$ with rate $d^2\log(d)/k.$ Compared to the dimension dependence d for stochastic gradient descent, an additional factor $d\log(d)$ occurs.
- Comment: 17 pages
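The weight-perturbed forward gradient scheme described in the abstract can be sketched as follows. The key idea is that for a perturbation direction $\xi \sim N(0, I_d)$, the estimator $(\nabla L \cdot \xi)\,\xi$ is an unbiased surrogate for the gradient $\nabla L$. This is a minimal illustration only: the step size, noise level, and sample sizes below are assumptions for demonstration, not the schedule or constants analyzed in the paper.

```python
import numpy as np

# Minimal sketch of weight-perturbed forward gradient descent for
# linear regression with random design. Step size and noise level are
# illustrative assumptions, not the paper's analyzed schedule.
rng = np.random.default_rng(0)
d, k = 5, 20000                               # d parameters, k samples
theta_true = rng.normal(size=d)               # unknown regression vector
theta = np.zeros(d)                           # initial estimate
alpha = 0.01                                  # illustrative constant step size

for _ in range(k):
    X = rng.normal(size=d)                    # random design vector
    Y = X @ theta_true + 0.1 * rng.normal()   # noisy response
    grad = (X @ theta - Y) * X                # per-sample squared-loss gradient
    xi = rng.normal(size=d)                   # random perturbation direction
    fwd = (grad @ xi) * xi                    # forward gradient: E[fwd | grad] = grad
    theta -= alpha * fwd                      # gradient-free update

mse = float(np.sum((theta - theta_true) ** 2))
```

Although the forward gradient is unbiased, its variance grows with the dimension d, which is consistent with the extra $d\log(d)$ factor in the convergence rate stated above relative to stochastic gradient descent.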
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2309.15001
- Document Type :
- Working Paper
- Full Text :
- https://doi.org/10.1016/j.jspi.2024.106174