
Perceptual learning evidence for supramodal representation of stimulus orientation at a conceptual level

Authors :
Cong Yu
Kai Wen
Ding-Zhi Hu
Lihan Chen
Source :
Vision Research. 187:120-128
Publication Year :
2021
Publisher :
Elsevier BV

Abstract

When stimulus inputs from different senses are integrated into a coherent percept, inputs from a more precise sense typically dominate those from a less precise one. We hypothesized, however, that some basic stimulus features, such as orientation, can be represented supramodally at a conceptual level, independent of the precision of the originating modality. We tested this hypothesis with perceptual learning experiments. Participants first practiced coarser tactile orientation discrimination, which initially had little impact on finer visual orientation discrimination (tactile vs. visual orientation thresholds = 3:1). However, when participants also practiced a functionally orthogonal visual contrast discrimination task in a double-training design, their visual orientation performance improved at both tactile-trained and untrained orientations, as much as through direct visual orientation training. This complete tactile-to-visual learning transfer is consistent with a conceptual supramodal representation of orientation that is unconstrained by the precision of the original modality, likely achieved through some form of input standardization. Moreover, when this conceptual supramodal representation is improved through perceptual learning in one sense, it can in turn facilitate orientation discrimination in an untrained sense.

Details

ISSN :
0042-6989
Volume :
187
Database :
OpenAIRE
Journal :
Vision Research
Accession number :
edsair.doi.dedup.....0b27156b40d7ea7c0f565bd0347bce43