
Representation of semantic information in ventral areas during encoding is associated with improved visual short-term memory

Authors:
Rhodri Cusack
Bobby Stojanoski
Stephen M. Emrich
Publication Year:
2019
Publisher:
Cold Spring Harbor Laboratory, 2019.

Abstract

We rely on visual short-term memory (VSTM) for continued access to perceptual information that is no longer available. Despite the complexity of our visual environments, most research on VSTM has focused on memory for lower-level perceptual features. Studies using more naturalistic stimuli have found that recognizable objects are remembered better than unrecognizable ones. What remains unclear, however, is how semantic information changes brain representations to facilitate this improvement in VSTM for real-world objects. To address this question, we used a continuous report paradigm to assess VSTM (precision and guessing rate) while participants underwent functional magnetic resonance imaging (fMRI) to measure the underlying neural representation of 96 objects drawn from 4 animate and 4 inanimate categories. To isolate semantic content, we used a novel image-generation method that parametrically warps images until they are no longer recognizable while preserving basic visual properties. Intact objects were remembered with greater precision and a lower guessing rate than unrecognizable objects, an advantage that also emerged when objects were grouped by category and animacy. Representational similarity analysis of the ventral visual stream found evidence of category and animacy information in anterior visual areas during encoding, but not during maintenance. These results suggest that semantic information represented in ventral visual areas during encoding boosts visual short-term memory for real-world objects.
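
The abstract does not describe the analysis pipeline in detail, but the representational similarity analysis it reports follows a standard recipe: compute a neural representational dissimilarity matrix (RDM) from region-of-interest activity patterns, build model RDMs encoding category and animacy structure, and relate the two with a rank correlation. The sketch below illustrates that recipe with simulated data; the dimensions, the random ROI patterns, and the assignment of the first four categories as animate are assumptions for illustration, not the authors' actual data or code.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Hypothetical dimensions: 96 objects from 8 categories (4 animate, 4 inanimate)
# and an arbitrary number of voxels in a ventral-stream ROI.
n_objects, n_categories, n_voxels = 96, 8, 200
category = np.repeat(np.arange(n_categories), n_objects // n_categories)
animacy = (category < 4).astype(int)  # assumption: first 4 categories are animate

# Placeholder ROI activity patterns (objects x voxels); in practice these would
# come from condition-wise GLM estimates for the encoding or maintenance period.
patterns = rng.standard_normal((n_objects, n_voxels))

# Neural RDM: 1 - Pearson correlation between object patterns,
# stored as the condensed upper triangle.
neural_rdm = pdist(patterns, metric="correlation")

# Model RDMs: dissimilarity 0 if two objects share the property, 1 otherwise.
def model_rdm(labels):
    return pdist(labels[:, None], metric=lambda a, b: float(a[0] != b[0]))

category_rdm = model_rdm(category)
animacy_rdm = model_rdm(animacy)

# Relate neural and model RDMs with Spearman rank correlation, a common choice
# because it does not assume a linear dissimilarity scale.
for name, model in [("category", category_rdm), ("animacy", animacy_rdm)]:
    rho, p = spearmanr(neural_rdm, model)
    print(f"{name}: rho = {rho:.3f}, p = {p:.3g}")
```

In a real analysis this comparison would be run separately on encoding-period and maintenance-period patterns for each ROI along the ventral stream, which is how an encoding-only effect of category and animacy, as reported in the abstract, would be detected.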

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....7ebd78ac57ce86334ad9abd0a279cae6
Full Text:
https://doi.org/10.1101/2019.12.13.875542