
Semantically congruent audiovisual integration with modal-based attention accelerates auditory short-term memory retrieval

Authors :
Hongtao Yu
Aijun Wang
Ming Zhang
Jiajia Yang
Satoshi Takahashi
Yoshimichi Ejima
Jinglong Wu
Source :
Attention, Perception, & Psychophysics. 84:1625-1634
Publication Year :
2022
Publisher :
Springer Science and Business Media LLC, 2022.

Abstract

Evidence has shown that the benefits of multisensory integration for unisensory perception are asymmetric and that auditory perception receives greater multisensory benefits, especially when the attention focus is directed toward a task-irrelevant visual stimulus. At present, it remains unclear whether the benefits of semantically (in)congruent multisensory integration with modal-based attention for subsequent unisensory short-term memory (STM) retrieval are also asymmetric. Using a delayed matching-to-sample paradigm, the present study investigated this issue by manipulating the attention focus during multisensory memory encoding. The results revealed that both visual and auditory STM retrieval reaction times were faster under semantically congruent multisensory conditions than under unisensory memory encoding conditions. We suggest that coherent multisensory representation formation might be optimized by restricted multisensory encoding and can be rapidly triggered by subsequent unisensory memory retrieval demands. Crucially, auditory STM retrieval was exclusively accelerated by semantically congruent multisensory memory encoding, indicating that the less effective sensory modality of memory retrieval relies more on the coherent prior formation of a multisensory representation optimized by modal-based attention.

Details

ISSN :
1943-393X and 1943-3921
Volume :
84
Database :
OpenAIRE
Journal :
Attention, Perception, & Psychophysics
Accession number :
edsair.doi.dedup.....d1e45875c2d0319def61641bb118d1c1