
Optimal forgetting: Semantic compression of episodic memories

Authors :
Balázs Török
David G. Nagy
Gergő Orbán
Source :
PLoS Computational Biology, Vol 16, Iss 10, p e1008367 (2020)
Publication Year :
2020
Publisher :
Public Library of Science (PLoS), 2020.

Abstract

It has been extensively documented that human memory exhibits a wide range of systematic distortions, which have been associated with resource constraints. Resource constraints on memory can be formalised in the normative framework of lossy compression; however, traditional lossy compression algorithms result in qualitatively different distortions from those found in experiments with humans. We argue that the form of these distortions is characteristic of compression that relies on a generative model adapted to the environment. We show that this semantic compression framework can provide a unifying explanation of a wide variety of memory phenomena. We harness recent advances in learning deep generative models, which yield powerful tools for approximating generative models of complex data. We use three datasets, chess games, natural text, and hand-drawn sketches, to demonstrate the effects of semantic compression on memory performance. Our model accounts for memory distortions related to domain expertise, gist-based distortions, contextual effects, and delayed recall.

Author summary

Human memory performs surprisingly poorly in many everyday tasks, and these failures have been richly documented in laboratory experiments. While constraints on memory resources necessarily imply a loss of information, it is possible to do well or badly relative to the available resources. In this paper we recruit information theory, which establishes how to optimally lose information given prior and complete knowledge of environmental statistics. For this, we address two challenges. First, the environmental statistics are not known to the brain; rather, they have to be learned over time from limited observations. Second, information theory does not specify how different distortions of original experiences should be penalised. We tackle these challenges by assuming that a latent variable generative model of the environment is maintained in semantic memory. We show that compressing experiences through such a generative model gives rise to systematic distortions that qualitatively correspond to a diverse range of observations in the experimental literature.
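The core idea, compressing an episode through a latent variable generative model and reconstructing it at recall, can be illustrated with a minimal sketch. The snippet below is an illustrative toy, not the authors' implementation: it assumes a small β-VAE-style model (the module name, architecture, dimensions, and beta weight are all assumptions), in which the encoder's latent code plays the role of the stored memory trace and the decoder's reconstruction plays the role of recall.

```python
# Minimal sketch (PyTorch) of semantic compression through a latent-variable
# generative model: an experience x is encoded into a low-dimensional latent
# code z (the stored "memory trace") and reconstructed at recall by the decoder.
# Illustrative toy only; architecture and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SemanticCompressor(nn.Module):
    def __init__(self, x_dim=64, z_dim=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, 32), nn.ReLU())
        self.mu = nn.Linear(32, z_dim)        # posterior mean
        self.logvar = nn.Linear(32, z_dim)    # posterior log-variance
        self.dec = nn.Sequential(nn.Linear(z_dim, 32), nn.ReLU(),
                                 nn.Linear(32, x_dim))

    def encode(self, x):
        h = self.enc(x)
        return self.mu(h), self.logvar(h)

    def decode(self, z):
        return self.dec(z)

    def loss(self, x, beta=1.0):
        # Rate-distortion style objective: reconstruction error (distortion)
        # plus a beta-weighted KL term to the prior (rate / resource cost).
        mu, logvar = self.encode(x)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        x_hat = self.decode(z)
        distortion = F.mse_loss(x_hat, x, reduction="sum")
        rate = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
        return distortion + beta * rate

# Usage: "store" an experience as its latent code, "recall" by decoding it.
model = SemanticCompressor()
experience = torch.randn(1, 64)          # a toy episodic observation
stored_code, _ = model.encode(experience)
recalled = model.decode(stored_code)     # reconstruction inherits model priors
```

In this sketch, a larger beta tightens the resource constraint, so reconstructions lean more heavily on the regularities the model has learned from its environment, which is the intuition behind gist-based distortions in the framework described above.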

Details

Database :
OpenAIRE
Journal :
PLoS Computational Biology, Vol 16, Iss 10, p e1008367 (2020)
Accession number :
edsair.doi.dedup.....dba072e5597fc60974117b78909a40a2