
Generalization Error Bounds via Rényi-, f-Divergences and Maximal Leakage.

Authors :
Esposito, Amedeo Roberto
Gastpar, Michael
Issa, Ibrahim
Source :
IEEE Transactions on Information Theory. Aug 2021, Vol. 67 Issue 8, p4986-5004. 19p.
Publication Year :
2021

Abstract

In this work, the probability of an event under some joint distribution is bounded by measuring it under the product of the marginals instead (which is typically easier to analyze), together with a measure of the dependence between the two random variables. These results find applications in adaptive data analysis, where multiple dependencies are introduced, and in learning theory, where they can be employed to bound the generalization error of a learning algorithm. Bounds are given in terms of Sibson's Mutual Information, α-Divergences, Hellinger Divergences, and f-Divergences. A case of particular interest is Maximal Leakage (or Sibson's Mutual Information of order infinity), since this measure is robust to post-processing and composes adaptively. The corresponding bound can be seen as a generalization of classical bounds, such as Hoeffding's and McDiarmid's inequalities, to the case of dependent random variables.
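As a concrete illustration of the result described above, the following is a hedged LaTeX sketch of the maximal-leakage flavor of the bound. The notation here ($E_y$ for the $y$-section of an event $E$, a sample $S$ of size $n$, an output hypothesis $W$, and a loss bounded in $[0,1]$) is assumed for illustration only; the exact statements and constants are those given in the paper:

% Sketch: probability of an event under the joint, bounded via the
% marginal and the maximal leakage L(X -> Y) (discrete case assumed).
\[
P_{XY}(E) \;\le\; 2^{\mathcal{L}(X \to Y)} \, \max_{y} P_X(E_y),
\qquad
\mathcal{L}(X \to Y) \;=\; \log_2 \sum_{y} \max_{x:\,P_X(x)>0} P_{Y|X}(y \mid x).
\]
% Hedged consequence for learning: combining the section bound above
% with Hoeffding's inequality for a [0,1]-bounded loss would give
\[
P\big( \lvert L_\mu(W) - L_S(W) \rvert > \eta \big)
\;\le\; 2^{\mathcal{L}(S \to W)} \cdot 2\,e^{-2 n \eta^2}.
\]

When $S$ and $W$ are independent, $\mathcal{L}(S \to W) = 0$ and the classical Hoeffding bound is recovered, which is the sense in which the result generalizes such inequalities to dependent random variables.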

Details

Language :
English
ISSN :
0018-9448
Volume :
67
Issue :
8
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
153068444
Full Text :
https://doi.org/10.1109/TIT.2021.3085190