High Probability Bounds for Stochastic Subgradient Schemes with Heavy Tailed Noise
- Publication Year :
- 2022
-
Abstract
- In this work we study high probability bounds for stochastic subgradient methods under heavy-tailed noise. In this setting the noise is only assumed to have finite variance, as opposed to a sub-Gaussian distribution, for which it is known that standard subgradient methods enjoy high probability bounds. We analyze a clipped version of the projected stochastic subgradient method, where subgradient estimates are truncated whenever they have large norms. We show that this clipping strategy leads to near-optimal any-time and finite-horizon bounds for many classical averaging schemes. Preliminary experiments support the validity of the method.
- Comment: 39 pages
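A minimal sketch of the clipping idea the abstract describes: stochastic subgradients are truncated when their norm exceeds a threshold, and iterates are projected back onto the constraint set. The threshold `tau`, the step size, the ball constraint, and the averaging scheme below are illustrative assumptions, not the paper's exact parameter choices.

```python
import numpy as np

def clip(g, tau):
    """Truncate the stochastic subgradient g whenever its norm exceeds tau."""
    norm = np.linalg.norm(g)
    return g if norm <= tau else (tau / norm) * g

def project_ball(x, radius=1.0):
    """Euclidean projection onto a ball (an illustrative constraint set)."""
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def clipped_projected_subgradient(x0, subgrad_oracle, steps, tau, lr):
    """Run the clipped projected stochastic subgradient iteration and
    return the average iterate (one classical averaging scheme)."""
    x = x0.copy()
    avg = np.zeros_like(x0)
    for _ in range(steps):
        g = clip(subgrad_oracle(x), tau)   # truncate large subgradient estimates
        x = project_ball(x - lr * g)       # projected subgradient step
        avg += x
    return avg / steps

# Example: minimize f(x) = |x| with heavy-tailed (Student-t, finite-variance) noise.
rng = np.random.default_rng(0)
oracle = lambda x: np.sign(x) + rng.standard_t(df=2.5, size=x.shape)
x_bar = clipped_projected_subgradient(np.array([0.9]), oracle, steps=2000, tau=5.0, lr=0.01)
```

Student-t noise with 2.5 degrees of freedom has finite variance but heavy tails, matching the setting of the abstract; the clipping step keeps any single noisy estimate from moving the iterate arbitrarily far.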
- Subjects :
- Mathematics - Optimization and Control
Statistics - Machine Learning
90C25, 62L20
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2208.08567
- Document Type :
- Working Paper