
Vision of tongue movements bias auditory speech perception

Authors:
Jeffrey Berry
Luciano Fadiga
Laura Maffongelli
Eleonora Bartoli
Alessandro D'Ausilio
Source:
Neuropsychologia. 63:85-91
Publication Year:
2014
Publisher:
Elsevier BV, 2014.

Abstract

Audiovisual speech perception is likely based on the association of auditory and visual information into stable audiovisual maps. Conflicting audiovisual inputs generate perceptual illusions such as the McGurk effect. Audiovisual mismatch effects could be driven either by the detection of violations of standard audiovisual statistics or by the sensorimotor reconstruction of the distal articulatory event that generated the audiovisual ambiguity. To disambiguate between the two hypotheses, we exploited the fact that the tongue is hidden from vision. For this reason, tongue movement encoding can be learned only via speech production, not via perception of others' speech alone. Here we asked participants to identify speech sounds while being shown matching or mismatching visual representations of tongue movements. Vision of congruent tongue movements facilitated auditory speech identification relative to incongruent trials. This result suggests that direct visual experience of an articulator's movement is not necessary for the generation of audiovisual mismatch effects. Furthermore, we suggest that audiovisual integration in speech may benefit from speech production learning.

Details

ISSN:
0028-3932
Volume:
63
Database:
OpenAIRE
Journal:
Neuropsychologia
Accession number:
edsair.doi.dedup.....e20a67dfd8a53634a5bcfb933fc643ba