
Efficient Discrepancy Testing for Learning with Distribution Shift

Authors :
Chandrasekaran, Gautam
Klivans, Adam R.
Kontonis, Vasilis
Stavropoulos, Konstantinos
Vasilyan, Arsen
Publication Year :
2024

Abstract

A fundamental notion of distance between train and test distributions from the field of domain adaptation is discrepancy distance. While in general hard to compute, here we provide the first set of provably efficient algorithms for testing localized discrepancy distance, where discrepancy is computed with respect to a fixed output classifier. These results imply a broad set of new, efficient learning algorithms in the recently introduced model of Testable Learning with Distribution Shift (TDS learning) due to Klivans et al. (2023). Our approach generalizes and improves all prior work on TDS learning: (1) we obtain universal learners that succeed simultaneously for large classes of test distributions, (2) achieve near-optimal error rates, and (3) give exponential improvements for constant depth circuits. Our methods further extend to semi-parametric settings and imply the first positive results for low-dimensional convex sets. Additionally, we separate learning and testing phases and obtain algorithms that run in fully polynomial time at test time.

Comment: 45 pages, 3 figures
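To make the abstract's central notion concrete: localized discrepancy fixes one output classifier f and measures, over a hypothesis class, the largest gap between the train-set and test-set disagreement rates with f. The sketch below is an illustrative empirical estimator, not the paper's algorithm; the one-dimensional threshold class, the fixed classifier, and the sample distributions are all hypothetical choices made for the example.

```python
import random

def disagreement(h, f, xs):
    """Fraction of sample points where h and f disagree."""
    return sum(h(x) != f(x) for x in xs) / len(xs)

def localized_discrepancy(hypotheses, f, train_xs, test_xs):
    """Empirical localized discrepancy with respect to the fixed classifier f:
    the maximum, over h in the class, of the gap between the train and test
    disagreement rates of h with f."""
    return max(
        abs(disagreement(h, f, train_xs) - disagreement(h, f, test_xs))
        for h in hypotheses
    )

# Hypothetical setup: threshold classifiers on [0, 1], f fixed at 0.5.
random.seed(0)
f = lambda x: x >= 0.5
hypotheses = [(lambda t: (lambda x: x >= t))(i / 10) for i in range(11)]
train = [random.random() for _ in range(1000)]       # uniform on [0, 1]
test = [random.random() ** 2 for _ in range(1000)]   # shifted toward 0

print(localized_discrepancy(hypotheses, f, train, test))
```

Because the supremum is taken only over one hypothesis against a fixed f (rather than over all pairs, as in full discrepancy distance), the quantity is easier to estimate; the paper's contribution is making such tests provably efficient for rich hypothesis classes.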

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2406.09373
Document Type :
Working Paper