When is Memorization of Irrelevant Training Data Necessary for High-Accuracy Learning?
- Source :
- STOC
- Publication Year :
- 2020
- Publisher :
- arXiv, 2020.
Abstract
- Modern machine learning models are complex and frequently encode surprising amounts of information about individual inputs. In extreme cases, complex models appear to memorize entire input examples, including seemingly irrelevant information (social security numbers from text, for example). In this paper, we aim to understand whether this sort of memorization is necessary for accurate learning. We describe natural prediction problems in which every sufficiently accurate training algorithm must encode, in the prediction model, essentially all the information about a large subset of its training examples. This remains true even when the examples are high-dimensional and have entropy much higher than the sample size, and even when most of that information is ultimately irrelevant to the task at hand. Further, our results do not depend on the training algorithm or the class of models used for learning. Our problems are simple and fairly natural variants of the next-symbol prediction and the cluster labeling tasks. These tasks can be seen as abstractions of text- and image-related prediction problems. To establish our results, we reduce from a family of one-way communication problems for which we prove new information complexity lower bounds. Additionally, we present synthetic-data experiments demonstrating successful attacks on logistic regression and neural network classifiers.
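The next-symbol prediction setting described in the abstract can be illustrated with a toy experiment. The sketch below is not the paper's construction; it is a minimal, hypothetical example showing how even a trivial character-level bigram model memorizes a unique, task-irrelevant "secret" string in its training text, so that prompting with a short prefix leaks the rest.

```python
# A hedged sketch (not the paper's formal construction): a character-level
# bigram model trained on text containing a unique canary string. Because
# the canary's digit transitions occur nowhere else in the corpus, greedy
# next-symbol prediction reproduces the secret verbatim.
from collections import Counter, defaultdict

corpus = (
    "the cat sat on the mat. the dog sat on the log. "
    "my number is 9071534862. "  # canary: irrelevant to the prediction task
    "the cat and the dog sat together."
)

# "Train": count character bigram frequencies.
follow = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    follow[a][b] += 1

def complete(prefix, n):
    """Greedily extend `prefix` by n characters using the bigram counts."""
    out = prefix
    for _ in range(n):
        nxt = follow[out[-1]].most_common(1)
        if not nxt:
            break
        out += nxt[0][0]
    return out

# "Attack": prompt with the start of the secret; the model leaks the rest.
print(complete("my number is 9", 9))  # prints "my number is 9071534862"
```

Each digit in the canary is distinct, so every digit-to-digit transition in the bigram table is unambiguous and greedy decoding recovers the full secret. This mirrors, in miniature, the phenomenon the paper studies: information irrelevant to the learning task ends up encoded in the model.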
- Subjects :
- FOS: Computer and information sciences
Computer Science - Machine Learning
Machine Learning (cs.LG)
Computer science
Machine learning
Memorization
Cluster labeling
Entropy (information theory)
Artificial intelligence
Details
- Database :
- OpenAIRE
- Journal :
- STOC
- Accession number :
- edsair.doi.dedup.....3e37f2a89a8dddcb3c8b40755096da4b
- Full Text :
- https://doi.org/10.48550/arxiv.2012.06421