Optimal forgetting: Semantic compression of episodic memories
- Source :
- PLoS Computational Biology, Vol 16, Iss 10, p e1008367 (2020)
- Publication Year :
- 2020
- Publisher :
- Public Library of Science (PLoS), 2020.
Abstract
- It has been extensively documented that human memory exhibits a wide range of systematic distortions, which have been associated with resource constraints. Resource constraints on memory can be formalised in the normative framework of lossy compression; however, traditional lossy compression algorithms produce distortions that are qualitatively different from those found in experiments with humans. We argue that the form of these distortions is characteristic of compression that relies on a generative model adapted to the environment. We show that this semantic compression framework can provide a unifying explanation of a wide variety of memory phenomena. We harness recent advances in learning deep generative models, which yield powerful tools for approximating generative models of complex data. We use three datasets, chess games, natural text, and hand-drawn sketches, to demonstrate the effects of semantic compression on memory performance. Our model accounts for memory distortions related to domain expertise, gist-based distortions, contextual effects, and delayed recall.
- Author summary: Human memory performs surprisingly poorly in many everyday tasks, failures that have been richly documented in laboratory experiments. While constraints on memory resources necessarily imply a loss of information, it is possible to do well or badly relative to the available resources. In this paper we recruit information theory, which establishes how to lose information optimally given prior and complete knowledge of environmental statistics. This raises two challenges. First, the environmental statistics are not known to the brain; they have to be learned over time from limited observations. Second, information theory does not specify how different distortions of original experiences should be penalised. We tackle these challenges by assuming that a latent-variable generative model of the environment is maintained in semantic memory. We show that compression of experiences through such a generative model gives rise to systematic distortions that qualitatively correspond to a diverse range of observations in the experimental literature.
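The central idea of the abstract, storing an experience as a code under a generative model of the environment so that reconstructions are pulled toward semantically typical content, can be sketched in toy form. This is an illustrative stand-in, not the paper's deep generative model: here "semantic memory" is just a set of prototypes learned by naive k-means, and "recall" reconstructs from the nearest prototype.

```python
# Toy sketch of semantic compression (illustrative assumption, not the
# paper's model): episodic traces are encoded as the index of the nearest
# prototype of a learned model of the environment; recall reconstructs
# from that prototype, so memories regress toward typical ("gist") values.

def learn_prototypes(observations, k=2, iters=20):
    """Naive 1-D k-means: a stand-in for learning semantic memory."""
    protos = observations[:k]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in observations:
            j = min(range(k), key=lambda i: abs(x - protos[i]))
            groups[j].append(x)
        protos = [sum(g) / len(g) if g else protos[i]
                  for i, g in enumerate(groups)]
    return protos

def compress(x, protos):
    """Episodic trace -> lossy code (index of the best prototype)."""
    return min(range(len(protos)), key=lambda i: abs(x - protos[i]))

def recall(code, protos):
    """Reconstruction from the model: the prototype itself."""
    return protos[code]

env = [1.0, 1.2, 0.9, 5.0, 5.1, 4.8]   # two "semantic" clusters
protos = learn_prototypes(env, k=2)
distorted = recall(compress(1.3, protos), protos)  # pulled toward cluster mean
```

The distortion pattern is the point: a novel experience (1.3) is recalled as the cluster mean (about 1.03), a systematic bias toward the typical rather than random noise, which is the qualitative signature the paper attributes to compression through a generative model.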
- Subjects :
- Episodic memory
Semantic compression
Lossy compression
Data compression
Coding and information theory
Generative model
Forgetting
Memory recall
Memory
Learning and memory
Learning
Cognition
Perception
Semantics
Machine learning
Deep learning
Artificial intelligence
Algorithms
Data management
Cognitive psychology
Cognitive science
Neuroscience
Cellular and Molecular Neuroscience
Computational Theory and Mathematics
Modeling and Simulation
Computer and Information Sciences
Biology and Life Sciences
Biology (General)
QH301-705.5
Ecology, Evolution, Behavior and Systematics
Genetics
Molecular Biology
Linguistics
Psychology
Sociology
Social Sciences
Memory, Episodic
Models, Neurological
Humans
Research Article
Details
- Database :
- OpenAIRE
- Journal :
- PLoS Computational Biology, Vol 16, Iss 10, p e1008367 (2020)
- Accession number :
- edsair.doi.dedup.....dba072e5597fc60974117b78909a40a2