
Smoothed Online Classification can be Harder than Batch Classification

Authors :
Raman, Vinod
Subedi, Unique
Tewari, Ambuj
Publication Year :
2024

Abstract

We study online classification under smoothed adversaries. In this setting, at each time point the adversary draws an example from a distribution that has bounded density with respect to a fixed base measure, which is known a priori to the learner. For binary classification and scalar-valued regression, previous works (Haghtalab et al., 2020; Block et al., 2022) have shown that smoothed online learning is as easy as learning in the iid batch setting under the PAC model. However, we show that smoothed online classification can be harder than iid batch classification when the label space is unbounded. In particular, we construct a hypothesis class that is learnable in the iid batch setting under the PAC model but is not learnable under the smoothed online model. Finally, we identify a condition under which PAC learnability of a hypothesis class is sufficient for its smoothed online learnability.

Comment: 18 pages
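
Although the abstract does not spell it out, the bounded-density condition it refers to is usually formalized as sigma-smoothness in this line of work (following Haghtalab et al., 2020). The LaTeX sketch below records that conventional definition; the symbols $\sigma$ and $\mu$ are the usual choices and are assumptions here, not notation taken from this record.

% Minimal sketch of the standard smoothness definition assumed in this line of
% work (following Haghtalab et al., 2020); sigma and mu are conventional
% symbols, not notation taken from this record.
\documentclass{article}
\usepackage{amsmath,amssymb,amsthm}
\newtheorem{definition}{Definition}
\begin{document}
\begin{definition}[$\sigma$-smooth distribution]
Let $\mu$ be a fixed base measure on the example space $\mathcal{X}$, known to
the learner, and let $\sigma \in (0,1]$. A distribution $\mathcal{D}$ on
$\mathcal{X}$ is \emph{$\sigma$-smooth} with respect to $\mu$ if
\[
  \mathcal{D}(A) \le \frac{\mu(A)}{\sigma}
  \quad \text{for every measurable } A \subseteq \mathcal{X},
\]
equivalently, the density $\tfrac{d\mathcal{D}}{d\mu}$ is bounded by
$1/\sigma$. A smoothed adversary must draw its example at each round from some
$\sigma$-smooth distribution, which may otherwise be chosen adaptively.
\end{definition}
\end{document}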

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2405.15424
Document Type :
Working Paper