
Perceptual Learning of Noise-Vocoded Speech Under Divided Attention.

Authors :
Wang, Han
Chen, Rongru
Yan, Yu
McGettigan, Carolyn
Rosen, Stuart
Adank, Patti
Source :
Trends in Hearing; Jan-Dec 2023, Vol. 27, p1-17, 17p
Publication Year :
2023

Abstract

Speech perception performance for degraded speech can improve with practice or exposure. Such perceptual learning is thought to rely on attention: theoretical accounts such as the predictive coding framework suggest a key role for attention in supporting learning. However, it is unclear whether speech perceptual learning requires undivided attention. We evaluated the role of divided attention in speech perceptual learning in two online experiments (N = 336). Experiment 1 tested the reliance of perceptual learning on undivided attention. Participants completed a speech recognition task in which they repeated forty noise-vocoded sentences in a between-group design. Participants performed the speech task alone or concurrently with a domain-general visual task (dual task) at one of three difficulty levels. We observed perceptual learning under divided attention for all four groups, moderated by dual-task difficulty. Listeners in the easy and intermediate visual conditions improved as much as the single-task group. Those who completed the most challenging visual task showed faster learning and achieved similar final performance compared to the single-task group. Experiment 2 tested whether learning relies on domain-specific or domain-general processes. Participants completed a single speech task or performed this task together with a dual task designed to recruit domain-specific (lexical or phonological) or domain-general (visual) processes. All secondary-task conditions produced patterns and amounts of learning comparable to the single speech task. Our results demonstrate that the impact of divided attention on perceptual learning is not strictly dependent on domain-general or domain-specific processes, and that speech perceptual learning persists under divided attention. [ABSTRACT FROM AUTHOR]

Details

Language :
English
ISSN :
2331-2165
Volume :
27
Database :
Complementary Index
Journal :
Trends in Hearing
Publication Type :
Academic Journal
Accession number :
175217480
Full Text :
https://doi.org/10.1177/23312165231192297