
Martingale Methods for Sequential Estimation of Convex Functionals and Divergences

Authors:
Manole, Tudor
Ramdas, Aaditya
Publication Year:
2021

Abstract

We present a unified technique for sequential estimation of convex divergences between distributions, including integral probability metrics like the kernel maximum mean discrepancy, $\varphi$-divergences like the Kullback-Leibler divergence, and optimal transport costs, such as powers of Wasserstein distances. This is achieved by observing that empirical convex divergences are (partially ordered) reverse submartingales with respect to the exchangeable filtration, coupled with maximal inequalities for such processes. These techniques appear to be complementary and powerful additions to the existing literature on both confidence sequences and convex divergences. We construct an offline-to-sequential device that converts a wide array of existing offline concentration inequalities into time-uniform confidence sequences that can be continuously monitored, providing valid tests or confidence intervals at arbitrary stopping times. The resulting sequential bounds pay only an iterated logarithmic price over the corresponding fixed-time bounds, retaining the same dependence on problem parameters (like dimension or alphabet size, if applicable). These results are also applicable to more general convex functionals -- like the negative differential entropy, suprema of empirical processes, and V-Statistics -- and to more general processes satisfying a key leave-one-out property.

Comment: To appear in the IEEE Transactions on Information Theory
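The offline-to-sequential idea in the abstract can be illustrated with a standard epoch-stitching construction. The sketch below is a minimal illustration under stated assumptions, not the paper's exact device: it posits a hypothetical sub-Gaussian-style fixed-time bound $\varepsilon(n, \delta) = C\sqrt{\log(1/\delta)/n}$, splits time into geometric epochs $[2^k, 2^{k+1})$, and spends an error budget $\delta_k = \delta/((k+1)(k+2))$ on epoch $k$; extending validity to every $n$ inside an epoch is where a maximal inequality for the underlying reverse submartingale (the paper's key observation) would enter. The names `fixed_time_bound`, `stitched_radius`, and the constant `C` are illustrative assumptions.

```python
import math

def fixed_time_bound(n: int, delta: float, C: float = 1.0) -> float:
    """Hypothetical offline bound: for a *fixed* sample size n, the
    estimation error is at most C * sqrt(log(1/delta) / n) with
    probability at least 1 - delta."""
    return C * math.sqrt(math.log(1.0 / delta) / n)

def stitched_radius(n: int, delta: float, C: float = 1.0) -> float:
    """Time-uniform radius intended to hold simultaneously for all n >= 1.

    Epoch k covers n in [2^k, 2^(k+1)); its error budget is
    delta_k = delta / ((k+1)(k+2)), which telescopes to sum to exactly delta.
    Since log(1/delta_k) ~ log(1/delta) + 2*log(k) and k ~ log2(n), the
    radius inflates the fixed-time bound by only an iterated-log factor.
    """
    k = int(math.floor(math.log2(n)))            # current epoch index
    delta_k = delta / ((k + 1) * (k + 2))        # per-epoch error budget
    return fixed_time_bound(2 ** k, delta_k, C)  # bound anchored at epoch start

# Example: the sequential radius shrinks like sqrt(log log n / n), so the
# price over the corresponding fixed-time bound is doubly logarithmic.
for n in [10, 100, 1_000, 10_000]:
    print(n, round(stitched_radius(n, delta=0.05), 4))
```

A note on the construction: anchoring the bound at the epoch start $2^k$ costs at most a $\sqrt{2}$ factor relative to evaluating at $n$ itself, and the union bound over epochs is what produces the iterated-logarithmic price the abstract refers to.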

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2103.09267
Document Type:
Working Paper
Full Text:
https://doi.org/10.1109/TIT.2023.3250099