Stereosonic vision: Exploring visual-to-auditory sensory substitution mappings in an immersive virtual reality navigation paradigm
- Source :
- PLoS ONE, Vol 13, Iss 7, p e0199389 (2018)
- Publication Year :
- 2018
-
Abstract
- Sighted people predominantly use vision to navigate spaces, and sight loss has negative consequences for independent navigation and mobility. The recent proliferation of devices that can extract 3D spatial information from visual scenes opens up the possibility of using such mobility-relevant information to assist blind and visually impaired people by presenting it through modalities other than vision. In this work, we present two new methods for encoding visual scenes using spatial audio: simulated echolocation and distance-dependent hum volume modulation. We implemented both methods in a virtual reality (VR) environment and tested them using a 3D motion-tracking device. This allowed participants to physically walk through virtual mobility scenarios, generating data on real locomotion behaviour. Blindfolded sighted participants completed two tasks: maze navigation and obstacle avoidance. Results were measured against a visual baseline in which participants performed the same two tasks without blindfolds. Task completion time, speed and number of collisions were used as indicators of successful navigation, with additional metrics exploring the detailed dynamics of performance. In both tasks, participants were able to navigate using only audio information after minimal instruction. While participants were 65% slower using audio compared to the visual baseline, they reduced their audio navigation time by an average of 21% over just six trials. Hum volume modulation proved over 20% faster than simulated echolocation in both mobility scenarios, and participants also showed the greatest improvement with this sonification method. Nevertheless, we speculate that simulated echolocation remains worth exploring, as it provides more spatial detail and could therefore prove more useful in complex environments.
The fact that participants were intuitively able to navigate space successfully with two new visual-to-audio mappings for conveying spatial information motivates the further exploration of these and other mappings, with the goal of assisting blind and visually impaired individuals with independent mobility.
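The abstract's "distance-dependent hum volume modulation" maps obstacle distance to loudness: the nearer the obstacle, the louder the hum. The paper's exact mapping function is not given in this record, so the sketch below assumes a clamped inverse-distance attenuation (a common audio-engine convention); the function name, reference distance and rolloff exponent are illustrative, not taken from the paper.

```python
def hum_gain(distance_m: float,
             max_gain: float = 1.0,
             ref_distance_m: float = 0.5,
             rolloff: float = 1.0) -> float:
    """Map an obstacle's distance to a hum amplitude (closer => louder).

    Inverse-distance attenuation, clamped so gain never exceeds
    max_gain inside the reference distance.
    """
    d = max(distance_m, ref_distance_m)
    return max_gain * (ref_distance_m / d) ** rolloff


# Nearby obstacles produce a loud hum; distant ones fade out.
for d in (0.25, 0.5, 1.0, 2.0, 4.0):
    print(f"{d:>4} m -> gain {hum_gain(d):.2f}")
```

A higher `rolloff` makes the volume drop off more sharply with distance, which trades far-field awareness for finer discrimination of nearby obstacles.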
- Subjects :
- Male
Man-Computer Interface
Visual perception
Computer science
Physiology
Vision
Sensory Physiology
lcsh:Medicine
Social Sciences
Blindness
Mechanical Treatment of Specimens
Computer Architecture
User-Computer Interface
0302 clinical medicine
Hearing
Human–computer interaction
Medicine and Health Sciences
Psychology
Audio Equipment
lcsh:Science
Animal Signaling and Communication
Virtual mobility
Visual Impairments
Brain Mapping
Multidisciplinary
Animal Behavior
05 social sciences
Virtual Reality
Sensory Systems
Sensory substitution
Auditory System
Auditory Perception
Engineering and Technology
Female
Sensory Perception
Research Article
Animal Navigation
Adult
Computer and Information Sciences
Equipment
Human echolocation
Research and Analysis Methods
050105 experimental psychology
03 medical and health sciences
Young Adult
Obstacle avoidance
Immersion (virtual reality)
Humans
0501 psychology and cognitive sciences
Maze Learning
Vision, Ocular
Behavior
Sonification
lcsh:R
Biology and Life Sciences
Ophthalmology
Specimen Preparation and Treatment
Space Perception
Echolocation
Human Factors Engineering
lcsh:Q
Animal Migration
Zoology
030217 neurology & neurosurgery
Software
Headphones
Neuroscience
User Interfaces
Details
- ISSN :
- 19326203
- Volume :
- 13
- Issue :
- 7
- Database :
- OpenAIRE
- Journal :
- PLoS ONE
- Accession number :
- edsair.doi.dedup.....2d6c2f469fde22043401372031762026