Upper Bounds on the Generalization Error of Private Algorithms for Discrete Data.

Authors :
Rodriguez-Galvez, Borja
Bassi, German
Skoglund, Mikael
Source :
IEEE Transactions on Information Theory. Nov 2021, Vol. 67, Issue 11, p7362-7379. 18p.
Publication Year :
2021

Abstract

In this work, we study the generalization capability of algorithms from an information-theoretic perspective. It has been shown that the expected generalization error of an algorithm is bounded from above by a function of the relative entropy between the conditional probability distribution of the algorithm’s output hypothesis, given the dataset with which it was trained, and its marginal probability distribution. We build upon this fact and introduce a mathematical formulation to obtain upper bounds on this relative entropy. Assuming that the data is discrete, we then develop a strategy using this formulation, based on the method of types and typicality, to find explicit upper bounds on the generalization error of stable algorithms, i.e., algorithms that produce similar output hypotheses given similar input datasets. In particular, we show the bounds obtained with this strategy for the case of $\epsilon$-DP and $\mu$-GDP algorithms. [ABSTRACT FROM AUTHOR]
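The relative-entropy bound the abstract refers to is commonly stated in the Xu–Raginsky form. As a sketch (assuming this is the intended bound, and assuming a $\sigma$-subgaussian loss, which the abstract does not specify):

```latex
% Mutual-information bound on the expected generalization error
% (Xu--Raginsky form; assumes the loss \ell(w, Z) is \sigma-subgaussian
%  under the data distribution for every hypothesis w -- an assumption
%  not stated in the abstract itself).
\[
  \bigl|\mathbb{E}[\operatorname{gen}(W, S)]\bigr|
  \;\le\; \sqrt{\frac{2\sigma^{2}}{n}\, I(W; S)},
  \qquad
  I(W; S) \;=\; \mathbb{E}_{S}\!\left[ D\bigl(P_{W \mid S} \,\big\|\, P_{W}\bigr) \right],
\]
```

Here $W$ is the algorithm’s output hypothesis, $S$ is the training dataset of $n$ samples, and $I(W;S)$ is their mutual information, i.e., the expected relative entropy between the conditional distribution $P_{W\mid S}$ and the marginal $P_W$ mentioned in the abstract. The paper’s strategy amounts to upper-bounding this relative entropy for stable (e.g., $\epsilon$-DP or $\mu$-GDP) algorithms on discrete data.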

Details

Language :
English
ISSN :
00189448
Volume :
67
Issue :
11
Database :
Academic Search Index
Journal :
IEEE Transactions on Information Theory
Publication Type :
Academic Journal
Accession number :
153710522
Full Text :
https://doi.org/10.1109/TIT.2021.3111480