65 results for "Ernst, Marc O."
Search Results
2. Visual experience shapes the Bouba-Kiki effect and the size-weight illusion upon sight restoration from congenital blindness.
- Author
-
Piller, Sophia, Senna, Irene, and Ernst, Marc O.
- Subjects
BLINDNESS, CATARACT, SPEECH, SOUNDS, OPTICAL illusions, SURGERY, VISUAL perception
- Abstract
The Bouba-Kiki effect is the systematic mapping between round/spiky shapes and speech sounds ("Bouba"/"Kiki"). In the size-weight illusion, participants judge the smaller of two equally weighted objects as being heavier. Here we investigated the contribution of visual experience to the development of these phenomena. We compared three groups: early blind individuals (no visual experience), individuals treated for congenital cataracts years after birth (late visual experience), and typically sighted controls (visual experience from birth). We found that, in cataract-treated participants (tested visually/visuo-haptically), both phenomena are absent shortly after sight onset, just as in blind individuals (tested haptically). However, they emerge within months following surgery, becoming statistically indistinguishable from the sighted controls. This suggests a pivotal role of visual experience and refutes the existence of an early sensitive period: a short period of experience, even when gained only years after birth, is sufficient for participants to visually pick up regularities in the environment, contributing to the development of these phenomena. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
3. Recalibrating vision-for-action requires years after sight restoration from congenital cataracts.
- Author
-
Senna, Irene, Piller, Sophia, Ben-Zion, Itay, and Ernst, Marc O.
- Published
- 2022
- Full Text
- View/download PDF
4. Complementary interfaces for visual computing.
- Author
-
Zagermann, Johannes, Hubenschmid, Sebastian, Balestrucci, Priscilla, Feuchtner, Tiare, Mayer, Sven, Ernst, Marc O., Schmidt, Albrecht, and Reiterer, Harald
- Subjects
UBIQUITOUS computing, USER interfaces
- Abstract
With increasing complexity in visual computing tasks, a single device may not be sufficient to adequately support the user's workflow. Here, we can employ multi-device ecologies such as cross-device interaction, where a workflow can be split across multiple devices, each dedicated to a specific role. But what makes these multi-device ecologies compelling? Based on insights from our research, each device or interface component must contribute a complementary characteristic to increase the quality of interaction and further support users in their current activity. We establish the term complementary interfaces for such meaningful combinations of devices and modalities and provide an initial set of challenges. In addition, we demonstrate the value of complementarity with examples from within our own research. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
5. Multisensory correlation computations in the human brain identified by a time-resolved encoding model.
- Author
-
Pesnot Lerousseau, Jacques, Parise, Cesare V., Ernst, Marc O., and van Wassenhove, Virginie
- Subjects
CROWDSOURCING, CORRELATORS, CAUSAL inference, JUDGMENT (Psychology), HUMAN behavior
- Abstract
Neural mechanisms that arbitrate between integrating and segregating multisensory information are essential for complex scene analysis and for the resolution of the multisensory correspondence problem. However, these mechanisms and their dynamics remain largely unknown, partly because classical models of multisensory integration are static. Here, we used the Multisensory Correlation Detector, a model that provides good explanatory power for human behavior while incorporating dynamic computations. Participants judged whether sequences of auditory and visual signals originated from the same source (causal inference) or whether one modality was leading the other (temporal order), while being recorded with magnetoencephalography. First, we confirmed that the Multisensory Correlation Detector explains causal inference and temporal order behavioral judgments well. Second, we found strong fits of brain activity to the two outputs of the Multisensory Correlation Detector in temporo-parietal cortices. Finally, we report an asymmetry in the goodness of the fits, which were more reliable during the causal inference task than during the temporal order judgment task. Overall, our results suggest the existence of multisensory correlation detectors in the human brain, which explain why and how causal inference is strongly driven by the temporal correlation of multisensory signals. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
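The core intuition behind the Multisensory Correlation Detector in result 5 is that the cross-correlation between the auditory and visual streams carries both kinds of evidence the participants reported: the strength of the correlation peak signals a common source (causal inference), and the lag of the peak signals which modality leads (temporal order). A minimal sketch of that intuition, not the published model (the function and signals here are invented for illustration):

```python
import numpy as np

def correlation_readout(audio, visual):
    """Toy correlation readout (illustrative, not the published
    Multisensory Correlation Detector). Returns the peak normalized
    cross-correlation (evidence for a common source) and its lag
    (negative lag = the audio stream leads)."""
    a = (audio - audio.mean()) / (audio.std() + 1e-12)
    v = (visual - visual.mean()) / (visual.std() + 1e-12)
    xcorr = np.correlate(a, v, mode="full") / len(a)
    lags = np.arange(-len(a) + 1, len(a))
    peak = np.argmax(np.abs(xcorr))
    return xcorr[peak], lags[peak]

# Invented example: the visual stream is a noisy copy of the audio
# stream delayed by 3 samples, so audio leads (lag = -3).
rng = np.random.default_rng(0)
audio = rng.normal(size=200)
visual = np.roll(audio, 3) + 0.1 * rng.normal(size=200)
strength, lag = correlation_readout(audio, visual)
```

With correlated streams the peak is high and its lag recovers the audio lead; with independent streams the peak stays near zero, which corresponds to the "segregate" reading.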
6. Visual pursuit biases tactile velocity perception.
- Author
-
Scotto, Cécile R., Moscatelli, Alessandro, Pfeiffer, Thies, and Ernst, Marc O.
- Abstract
During a smooth pursuit eye movement of a target stimulus, a briefly flashed stationary background appears to move in the direction opposite to the eye's motion—an effect known as the Filehne illusion. Similar illusions occur in audition, in the vestibular system, and in touch. Recently, we found that the movement of a surface perceived from tactile slip was biased if this surface was sensed with the moving hand. The analogy between these two illusions suggests similar mechanisms of motion processing in vision and touch. In the present study, we further assessed the interplay between these two sensory channels by investigating a novel paradigm that associated eye pursuit of a visual target with tactile motion over the skin of the fingertip. We showed that smooth pursuit eye movements can bias the perceived direction of motion in touch. As in the classical Filehne illusion in vision, a static tactile surface was perceived as moving rightward with a leftward eye pursuit movement, and vice versa. However, this time the direction of surface motion was perceived from touch. The biasing effects of eye pursuit on tactile motion were modulated by the reliability of the tactile and visual stimuli, consistent with a Bayesian model of motion perception. Overall, these results support a modality- and effector-independent process with common representations for motion perception. NEW & NOTEWORTHY The study showed that smooth pursuit eye movements produce a bias in tactile motion perception. This phenomenon is modulated by the reliability of the tactile estimate and by the presence of a visual background, in line with the predictions of the Bayesian framework of motion perception. Overall, these results support the hypothesis of shared representations for motion perception. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
7. Editorial: Spatial and Temporal Perception in Sensory Deprivation.
- Author
-
Senna, Irene, Cuturi, Luigi F., Gori, Monica, Ernst, Marc O., and Cappagli, Giulia
- Subjects
SPACE perception, SENSORY deprivation, VISION disorders, SCIENTIFIC communication, VISUAL perception, PEOPLE with visual disabilities, DEPTH perception, LOW vision
- Abstract
Glick and Sharma demonstrated that early-stage mild-to-moderate age-related hearing loss is associated with cross-modal recruitment of auditory, frontal, and prefrontal cortices during visual tasks, suggesting functional changes induced by hearing loss. By highlighting the limitations of tools dedicated to visual impairment, the authors focused mainly on the lack of formal and informal assessment methods and promoted the validation and large-scale application of newly developed tools in the context of pediatric visual impairment. Keywords: sensory deprivation; cortical plasticity; spatial perception; temporal perception; visual impairment; hearing impairment; motor impairment; rehabilitation. The Research Topic aimed at providing new insights into the impact of sensory deprivation on spatio-temporal abilities and their underlying cortical circuits. [Extracted from the article]
- Published
- 2021
- Full Text
- View/download PDF
8. Role of Tactile Noise in the Control of Digit Normal Force.
- Author
-
Naceri, Abdeldjallil, Gultekin, Yasemin B., Moscatelli, Alessandro, and Ernst, Marc O.
- Subjects
NOISE control, MECHANICAL properties of condensed matter, FINGERS, THUMB, OPEN-ended questions
- Abstract
Whenever we grasp and lift an object, our tactile system provides important information on the contact location and the force exerted on our skin. The human brain integrates signals from multiple sites for a coherent representation of object shape, inertia, weight, and other material properties. It is still an open question whether the control of grasp force occurs at the level of individual fingers or whether it is also influenced by the control and the signals from the other fingers of the same hand. In this work, we approached this question by asking participants to lift, transport, and replace a sensorized object, using three- and four-digit grasps. Tactile input was altered by covering participants' fingertips with a rubber thimble, which reduced the reliability of the tactile sensory input. In different experimental conditions, we covered between one and three fingers opposing the thumb. Normal forces at each finger and the thumb were recorded while grasping and holding the object, with and without the thimble. Consistent with previous studies, reducing tactile sensitivity increased the overall grasping force. The grasping force increased in the covered finger, whereas it did not change from baseline in the remaining bare fingers (except the thumb, for equilibrium constraints). Digit placement and object tilt were not systematically affected by the rubber thimble conditions. Our results suggest that, in each finger opposing the thumb, digit normal force is controlled locally in response to the applied tactile perturbation. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
9. No need to touch this: Bimanual haptic slant adaptation does not require touch.
- Author
-
Glowania, Catharina, Plaisier, Myrthe A., Ernst, Marc O., and Van Dam, Loes C. J.
- Subjects
PREHENSION (Physiology), PHYSICAL contact, CURVATURE, POSTURE, EVERYDAY life
- Abstract
In our daily life, we often interact with objects using both hands, raising the question to what extent information is shared between the hands. It has, for instance, been shown that curvature adaptation aftereffects can transfer from the adapted hand to the non-adapted hand. However, this transfer only occurred for dynamic exploration, e.g. by moving a single finger over a surface, but not for static exploration, i.e. keeping static contact with the surface and combining the information from different parts of the hand. This raises the question to what extent adaptation to object shape is shared between the hands when both hands are used in a static fashion simultaneously and the object shape estimates require information from both hands. Here we addressed this question in three experiments using a slant adaptation paradigm. In Experiment 1 we investigated whether an aftereffect of static bimanual adaptation occurs at all and whether it transfers to conditions in which one hand was moving. In Experiment 2 participants adapted either to a felt slanted surface or simply by holding their hands in mid-air at similar positions, to investigate to what extent the effects of static bimanual adaptation are posture-based rather than object-based. Experiment 3 further explored the idea that bimanual adaptation is largely posture-based. We found that bimanual adaptation using static touch did lead to aftereffects when using the same static exploration mode for testing. However, the aftereffect did not transfer to any exploration mode that included a dynamic component. Moreover, we found similar aftereffects both with and without a haptic surface. Thus, we conclude that static bimanual adaptation is of a proprioceptive nature and does not occur at the level at which the object is represented. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
10. Computational principles of neural adaptation for binaural signal integration.
- Author
-
Oess, Timo, Ernst, Marc O., and Neumann, Heiko
- Subjects
NEUROPLASTICITY, ACOUSTIC localization, INFERIOR colliculus, COMPUTATIONAL neuroscience, SENSE organs, ARTIFICIAL neural networks, AUDITORY perception
- Abstract
Adaptation to the statistics of sensory inputs is an essential ability of neural systems and extends their effective operational range. A broad operational range makes it possible to react to sensory inputs of different granularities and is thus a crucial factor for survival. The computation of auditory cues for spatial localization of sound sources, particularly the interaural level difference (ILD), has long been considered a static process. Novel findings suggest that this process of ipsi- and contra-lateral signal integration is highly adaptive and depends strongly on recent stimulus statistics. Here, adaptation aids the encoding of auditory perceptual space of various granularities. To investigate the mechanism of auditory adaptation in binaural signal integration in detail, we developed a neural model architecture for simulating functions of the lateral superior olive (LSO) and the medial nucleus of the trapezoid body (MNTB), composed of single-compartment conductance-based neurons. Neurons in the MNTB serve as an intermediate relay population. Their signal is integrated by the LSO population on a circuit level to represent excitatory and inhibitory interactions of input signals. The circuit incorporates an adaptation mechanism operating at the synaptic level based on local inhibitory feedback signals. The model's predictive power is demonstrated in various simulations replicating physiological data. Incorporating the adaptation mechanism facilitates a shift in neural responses towards the most effective stimulus range based on recent stimulus history. The model demonstrates that a single LSO neuron quickly adapts to these stimulus statistics and, thus, can encode an extended range of ILDs in the ipsilateral hemisphere. Most significantly, we provide a unique measurement of the adaptation efficacy of LSO neurons.
A prerequisite of normal function is an accurate interaction of inhibitory and excitatory signals, a precise encoding of time, and a well-tuned local feedback circuit. We suggest that the mechanisms of temporal competitive-cooperative interaction and the local feedback mechanism jointly sensitize the circuit to enable a response shift towards contra-lateral and ipsi-lateral stimuli, respectively. Author summary: Why are we more precise in localizing a sound after hearing it several times? Adaptation to the statistics of a stimulus plays a crucial role in this. The present article investigates the abilities of a neural adaptation mechanism for improved localization skills based on a neural network model. Adaptation to stimulus statistics is very prominent in the sensory systems of animals and allows them to respond to a wide range of stimuli, and is thus a crucial factor for survival. For example, humans are able to navigate under suddenly changing illumination conditions (driving a car into and out of a tunnel). This is possible courtesy of the adaptation abilities of our sensory organs and pathways. Certainly, adaptation is not confined to a single sense like vision but also affects other senses like audition, especially the perception of sound source location. Compared to vision, the localization of a sound source in the horizontal plane is a rather complicated task, since the location cannot be read out from the receptor surface but needs to be computed. This requires the underlying neural system to calculate differences in intensity between the two ears, which provide a distinct cue for the location of a sound source. Here, adaptation to this cue allows the system to focus on a specific part of auditory space and thereby facilitates improved localization abilities.
Based on recent findings suggesting that the intensity difference computation is a flexible process with distinct adaptation mechanisms, we developed a neural model that computes the intensity difference between two incoming sound signals. The model comprises a novel mechanism for adaptation to sound source locations and provides a means to investigate the underlying neural principles of adaptation and compare their effectiveness. We demonstrate that, due to this mechanism, the perceptual range is extended and a finer resolution of auditory space is obtained. The results explain the neural basis for adaptation and indicate that the interplay between different adaptation mechanisms facilitates highly precise sound source localization over a wide range of locations. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
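The interaural level difference (ILD) cue that the model in result 10 adapts to is conventionally expressed as the decibel level difference between the two ear signals. A minimal sketch of the cue itself (not the paper's conductance-based LSO/MNTB circuit; the signals here are invented):

```python
import numpy as np

def interaural_level_difference(left, right):
    """ILD in dB from the RMS amplitudes of the two ear signals.
    Positive values indicate a more intense signal at the left ear.
    This computes only the acoustic cue, not the paper's LSO/MNTB
    circuit model."""
    rms_left = np.sqrt(np.mean(np.square(left)))
    rms_right = np.sqrt(np.mean(np.square(right)))
    return 20.0 * np.log10(rms_left / rms_right)

# Invented example: a source off to the left, so the left-ear signal
# has twice the amplitude of the right one.
t = np.linspace(0.0, 1.0, 8000)
tone = np.sin(2 * np.pi * 440.0 * t)
ild = interaural_level_difference(2.0 * tone, tone)
```

Doubling the amplitude yields 20·log10(2) ≈ 6.02 dB; an adaptive readout like the paper's would shift its operating range towards whatever ILDs dominate the recent stimulus history.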
11. From Near-Optimal Bayesian Integration to Neuromorphic Hardware: A Neural Network Model of Multisensory Integration.
- Author
-
Oess, Timo, Löhr, Maximilian P. R., Schmid, Daniel, Ernst, Marc O., and Neumann, Heiko
- Subjects
ARTIFICIAL neural networks, AUDITORY neurons, NEURAL circuitry, PERCEPTUAL motor learning, NERVOUS system, VISUAL perception, SENSES, BRAIN stem
- Abstract
While interacting with the world, our senses and nervous system are constantly challenged to identify the origin and coherence of sensory input signals of various intensities. This problem becomes apparent when stimuli from different modalities need to be combined, e.g., to find out whether an auditory stimulus and a visual stimulus belong to the same object. To cope with this problem, humans and most other animal species are equipped with complex neural circuits that enable fast and reliable combination of signals from various sensory organs. This multisensory integration starts in the brain stem to facilitate unconscious reflexes and continues on ascending pathways to cortical areas for further processing. To investigate the underlying mechanisms in detail, we developed a canonical neural network model for multisensory integration that resembles neurophysiological findings. For example, the model comprises multisensory integration neurons that receive excitatory and inhibitory inputs from unimodal auditory and visual neurons, respectively, as well as feedback from cortex. Such feedback projections facilitate multisensory response enhancement and lead to the commonly observed inverse effectiveness of neural activity in multisensory neurons. Two versions of the model are implemented: a rate-based neural network model for qualitative analysis and a variant that employs spiking neurons for deployment on neuromorphic hardware. This dual approach allows us to create an evaluation environment with the ability to test model performance with real-world inputs. As a platform for deployment we chose IBM's neurosynaptic chip TrueNorth. Behavioral studies in humans indicate that temporal and spatial offsets as well as the reliability of stimuli are critical parameters for integrating signals from different modalities. The model reproduces such behavior in experiments with different sets of stimuli. In particular, model performance for stimuli with varying spatial offset is tested.
In addition, we demonstrate that, owing to the emergent properties of the network dynamics, model performance is close to optimal Bayesian inference for the integration of multimodal sensory signals. Furthermore, the implementation of the model on a neuromorphic processing chip enables a complete neuromorphic processing cascade from sensory perception to multisensory integration, and the evaluation of model performance for real-world inputs. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
12. Illusory changes in the perceived speed of motion derived from proprioception and touch.
- Author
-
Moscatelli, Alessandro, Scotto, Cecile R., and Ernst, Marc O.
- Abstract
In vision, the perceived velocity of a moving stimulus differs depending on whether we pursue it with the eyes or not: a stimulus moving across the retina with the eyes stationary is perceived as being faster than a stimulus of the same physical speed that the observer pursues with the eyes, while its retinal motion is zero. This effect is known as the Aubert–Fleischl phenomenon. Here, we describe an analogous phenomenon in touch. We asked participants to estimate the speed of a moving stimulus either from tactile motion only (i.e., motion across the skin), while keeping the hand stationary in the world, or from kinesthesia only, by tracking the stimulus with a guided arm movement such that the tactile motion on the finger was zero (i.e., only finger motion but no movement across the skin). Participants overestimated the velocity of the stimulus determined from tactile motion compared with kinesthesia, in analogy with the visual Aubert–Fleischl phenomenon. In two follow-up experiments, we manipulated the stimulus noise by changing the texture of the touched surface. Similarly to the visual phenomenon, this significantly affected the strength of the illusion. This study supports the hypothesis of shared computations for motion processing between vision and touch. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
13. Goal-related feedback guides motor exploration and redundancy resolution in human motor skill acquisition.
- Author
-
Rohde, Marieke, Narioka, Kenichi, Steil, Jochen J., Klein, Lina K., and Ernst, Marc O.
- Subjects
NERVOUS system, COGNITIVE psychology, MACHINE learning, COGNITIVE science, NEUROSCIENCES
- Abstract
The plasticity of the human nervous system allows us to acquire an open-ended repository of sensorimotor skills in adulthood, such as the mastery of tools, musical instruments or sports. How novel sensorimotor skills are learned from scratch is still largely unknown. In particular, the so-called inverse mapping from goal states to motor states is underdetermined because a goal can often be achieved by many different movements (motor redundancy). How humans learn to resolve motor redundancy and by which principles they explore high-dimensional motor spaces has hardly been investigated. To study this question, we trained human participants in an unfamiliar and redundant visually-guided manual control task. We qualitatively compare the experimental results with simulation results from a population of artificial agents that learned the same task by Goal Babbling, an inverse-model learning approach for robotics. In Goal Babbling, goal-related feedback guides motor exploration and thereby enables robots to learn an inverse model directly from scratch, without having to learn a forward model first. In the human experiment, we tested whether different initial conditions (starting positions of the hand) influence the acquisition of motor synergies, which we identified by Principal Component Analysis in the motor space. The results show that the human participants' solutions are spatially biased towards the different starting positions in motor space and are marked by a gradual co-learning of synergies and task success, similar to the dynamics of motor learning by Goal Babbling. However, there are also differences between human learning and the Goal Babbling simulations, as humans tend to predominantly use Degrees of Freedom that do not have a large effect on the hand position, whereas in Goal Babbling, Degrees of Freedom with a large effect on hand position are used predominantly.
We conclude that humans use goal-related feedback to constrain motor exploration and resolve motor redundancy when learning a new sensorimotor mapping, but in a manner that differs from the current implementation of Goal Babbling due to different constraints on motor exploration. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
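Result 13 identifies motor synergies via Principal Component Analysis of the motor space. A minimal sketch of that analysis step (illustrative only; the data, dimensions, and function name are invented, not the study's recordings):

```python
import numpy as np

def motor_synergies(motor_data, n_components=2):
    """PCA via SVD on centered motor data (trials x degrees of freedom).
    Returns the leading components ('synergies') and the fraction of
    variance each explains."""
    centered = motor_data - motor_data.mean(axis=0)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    return vt[:n_components], explained[:n_components]

# Invented data: 100 trials of an 8-DoF movement dominated by one
# coupled joint pattern plus small independent noise.
rng = np.random.default_rng(1)
pattern = rng.normal(size=8)
scores = rng.normal(size=(100, 1))
data = scores @ pattern[None, :] + 0.1 * rng.normal(size=(100, 8))
synergies, explained = motor_synergies(data)
```

The first component recovers the coupled pattern and absorbs most of the variance, which is how a dominant synergy shows up in such an analysis.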
14. Kinematic cross-correlation induces sensory integration across separate objects.
- Author
-
Debats, Nienke B., Ernst, Marc O., and Heuer, Herbert
- Subjects
SENSORIMOTOR integration, KINEMATICS, VISUAL perception, BAYESIAN analysis, PERCEPTUAL-motor processes
- Abstract
In a basic cursor-control task, the perceived positions of the hand and the cursor are biased towards each other. We recently found that this phenomenon conforms to the reliability-based weighting mechanism of optimal multisensory integration. This indicates that optimal integration is not restricted to sensory signals originating from a single source, as is the prevailing view, but that it also applies to separate objects that are connected by a kinematic relation (i.e. hand and cursor). In the current study, we examined which aspects of the kinematic relation are crucial for eliciting the sensory integration: (i) the cross-correlation between kinematic variables of the hand and cursor trajectories, and/or (ii) an internal model of the hand-cursor kinematic transformation. Participants made out-and-back movements from the centre of a semicircular workspace to its boundary, after which they judged the position where either their hand or the cursor hit the boundary. We analysed the position biases and found that the integration was strong in a condition with high kinematic correlations (a straight hand trajectory was mapped to a straight cursor trajectory), that it was significantly reduced for reduced kinematic correlations (a straight hand trajectory was transformed into a curved cursor trajectory) and that it was not affected by the inability to acquire an internal model of the kinematic transformation (i.e. by the trial-to-trial variability of the cursor curvature). These findings support the idea that correlations play a crucial role in multisensory integration irrespective of the number of sensory sources involved. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
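The reliability-based weighting mechanism referenced in result 14 is the standard minimum-variance cue combination rule: each estimate is weighted by the inverse of its variance, so the fused estimate lies between the cues, closer to the more reliable one. A sketch with invented hand/cursor numbers (not the study's data):

```python
def integrate(estimates, variances):
    """Minimum-variance (reliability-weighted) cue combination.
    Weights are inverse variances; the fused variance is always
    smaller than the smallest input variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total
    return fused, fused_variance

# Invented numbers: a proprioceptive hand estimate (10.0 cm, variance 4.0)
# and a visual cursor estimate (12.0 cm, variance 1.0). The fused position
# is pulled towards the more reliable cursor: 11.6 cm, variance 0.8.
position, variance = integrate([10.0, 12.0], [4.0, 1.0])
```

The mutual bias of perceived hand and cursor positions reported in the abstract corresponds to each percept being pulled towards such a fused estimate.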
15. Sensorimotor Synergies: Fusion of Cutaneous Touch and Proprioception in the Perceived Hand Kinematics.
- Author
-
Moscatelli, Alessandro, Bianchi, Matteo, Serio, Alessandro, Bicchi, Antonio, and Ernst, Marc O.
- Published
- 2016
- Full Text
- View/download PDF
16. Digit Position and Force Synergies During Unconstrained Grasping.
- Author
-
Naceri, Abdeldjallil, Santello, Marco, Moscatelli, Alessandro, and Ernst, Marc O.
- Published
- 2016
- Full Text
- View/download PDF
17. The Influence of Motor Task on Tactile Suppression During Action.
- Author
-
Debats, Nienke B., Rohde, Marieke, Glowania, Catharina, Oppenborn, Anna, and Ernst, Marc O.
- Published
- 2016
- Full Text
- View/download PDF
18. Finding Home: Landmark Ambiguity in Human Navigation.
- Author
-
Jetzschke, Simon, Ernst, Marc O., Froehlich, Julia, and Boeddeker, Norbert
- Subjects
SPATIAL arrangement, NAVIGATION, WALKING, VIRTUAL reality, MAXIMUM likelihood statistics
- Abstract
Memories of places often include landmark cues, i.e., information provided by the spatial arrangement of distinct objects with respect to the target location. To study how humans combine landmark information for navigation, we conducted two experiments: participants were either provided with auditory landmarks while walking in a large sports hall or with visual landmarks while walking on a virtual-reality treadmill setup. We found that participants cannot reliably locate their home position due to ambiguities in the spatial arrangement when only one or two uniform landmarks provide cues with respect to the target. With three visual landmarks that look alike, the task is solved without ambiguity, while audio landmarks need to play three unique sounds for similar performance. This reduction in ambiguity through integration of landmark information from 1, 2, and 3 landmarks is well modeled using a probabilistic approach based on maximum likelihood estimation. Unlike any deterministic model of human navigation (based e.g., on distance or angle information), this probabilistic model predicted both the precision and the accuracy of the human homing performance. To further examine how landmark cues are integrated, we introduced systematic conflicts in the visual landmark configuration between training of the home position and tests of the homing performance. The participants integrated the spatial information from each landmark near-optimally to reduce spatial variability. When the conflict becomes large, this integration breaks down and precision is sacrificed for accuracy. That is, participants again return closer to the home position, because they start ignoring the deviant third landmark. Relying on two instead of three landmarks, however, goes along with responses that are scattered over a larger area, thus leading to higher variability.
To model the breakdown of integration with increasing conflict, the probabilistic model based on a simple Gaussian distribution used for Experiment 1 needed a slight extension in the form of a mixture of Gaussians. All parameters for the Mixture Model were fixed based on the homing performance in the baseline condition, which contained a single landmark (the 1-Landmark Condition). This way, we found that the Mixture Model could predict the integration performance and its breakdown with no additional free parameters. Overall, these data suggest that humans use similar optimal probabilistic strategies in visual and auditory navigation, integrating landmark information to improve homing precision and balancing homing precision with homing accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
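Result 18 models the breakdown of integration under large landmark conflict with a mixture of Gaussians. One simple way to illustrate that behavior (an invented parameterization, not the authors' fitted model) is a mixture in which each landmark cue is either valid (narrow Gaussian) or an outlier (broad Gaussian); the deviant cue keeps some pull under small conflict but is effectively ignored when the conflict grows:

```python
import math

def robust_fuse(cues, sigma, outlier_sd=30.0, p_out=0.1, iters=20):
    """Illustrative mixture-of-Gaussians fusion (invented parameters,
    not the paper's fitted model). Each cue is 'valid' with sd=sigma or
    an outlier with sd=outlier_sd; responsibilities down-weight a cue
    that conflicts with the consensus, so integration breaks down
    gracefully for large conflicts."""
    estimate = sum(cues) / len(cues)
    for _ in range(iters):  # EM-style alternation
        weights = []
        for c in cues:
            valid = (1 - p_out) * math.exp(-0.5 * ((c - estimate) / sigma) ** 2) / sigma
            outlier = p_out * math.exp(-0.5 * ((c - estimate) / outlier_sd) ** 2) / outlier_sd
            r = valid / (valid + outlier)  # responsibility of the 'valid' component
            weights.append(r / sigma**2 + (1 - r) / outlier_sd**2)
        estimate = sum(w * c for w, c in zip(weights, cues)) / sum(weights)
    return estimate

# Small conflict: the deviant third landmark still pulls the estimate.
near = robust_fuse([0.0, 0.0, 2.0], sigma=1.0)
# Large conflict: the deviant landmark is effectively ignored.
far = robust_fuse([0.0, 0.0, 15.0], sigma=1.0)
```

With a plain single-Gaussian MLE the estimate would keep following the deviant landmark no matter how large the conflict; the mixture reproduces the observed trade of precision for accuracy.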
19. Noise, multisensory integration, and previous response in perceptual disambiguation.
- Author
-
Parise, Cesare V. and Ernst, Marc O.
- Subjects
COMPUTATIONAL biology, NEUROSCIENCES, DECISION making, MOTION perception (Vision), VISUAL perception
- Abstract
Sensory information about the state of the world is generally ambiguous. Understanding how the nervous system resolves such ambiguities to infer the actual state of the world is a central quest for sensory neuroscience. However, the computational principles of perceptual disambiguation are still poorly understood: what drives perceptual decision-making between multiple equally valid solutions? Here we investigate how humans gather and combine sensory information, within and across modalities, to disambiguate motion perception in an ambiguous audiovisual display, where two moving stimuli could appear to either stream through or bounce off each other. By combining psychophysical classification tasks with reverse correlation analyses, we identified the particular spatiotemporal stimulus patterns that elicit a stream or a bounce percept, respectively. From that, we developed and tested a computational model for uni- and multisensory perceptual disambiguation that tightly replicates human performance. Specifically, disambiguation relies on knowledge of prototypical bouncing events that contain characteristic patterns of motion energy in the dynamic visual display. Next, the visual information is linearly integrated with auditory cues and prior knowledge about the history of recent perceptual interpretations. What is more, we demonstrate that perceptual decision-making with ambiguous displays is systematically driven by noise, whose random patterns not only promote alternation, but also provide signal-like information that biases perception in a highly predictable fashion. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
20. Modulation frequency as a cue for auditory speed perception.
- Author
-
Senna, Irene, Parise, Cesare V., and Ernst, Marc O.
- Subjects
MOTION perception (Vision) ,VISUAL perception ,SENSORY perception ,AUDITORY adaptation ,PHYSIOLOGICAL adaptation - Abstract
Unlike vision, the mechanisms underlying auditory motion perception are poorly understood. Here we describe an auditory motion illusion revealing a novel cue to auditory speed perception: the temporal frequency of amplitude modulation (AM-frequency), typical for rattling sounds. Naturally, corrugated objects sliding across each other generate rattling sounds whose AM-frequency tends to directly correlate with speed. We found that AM-frequency modulates auditory speed perception in a highly systematic fashion: moving sounds with higher AM-frequency are perceived as moving faster than sounds with lower AM-frequency. Even more interestingly, sounds with higher AM-frequency also induce stronger motion aftereffects. This reveals the existence of specialized neural mechanisms for auditory motion perception, which are sensitive to AM-frequency. Thus, in spatial hearing, the brain successfully capitalizes on the AM-frequency of rattling sounds to estimate the speed of moving objects. This tightly parallels previous findings in motion vision, where spatio-temporal frequency of moving displays systematically affects both speed perception and the magnitude of the motion aftereffects. Such an analogy with vision suggests that motion detection may rely on canonical computations, with similar neural mechanisms shared across the different modalities. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
21. Multidigit force control during unconstrained grasping in response to object perturbations.
- Author
-
Naceri, Abdeldjallil, Moscatelli, Alessandro, Haschke, Robert, Ritter, Helge, Santello, Marco, and Ernst, Marc O.
- Subjects
PREHENSION (Physiology) ,ROBOT hands ,GRIP strength ,NEUROPHYSIOLOGY ,ALGORITHMS - Abstract
Because of the complex anatomy of the human hand, in the absence of external constraints, a large number of postures and force combinations can be used to attain a stable grasp. Motor synergies provide a viable strategy to solve this problem of motor redundancy. In this study, we exploited the technical advantages of an innovative sensorized object to study unconstrained hand grasping within the theoretical framework of motor synergies. Participants were required to grasp, lift, and hold the sensorized object. During the holding phase, we repetitively applied external disturbance forces and torques and recorded the spatiotemporal distribution of grip forces produced by each digit. We found that the time to reach the maximum grip force during each perturbation was roughly equal across fingers, consistent with a synchronous, synergistic stiffening across digits. We further evaluated this hypothesis by comparing the force distribution of human grasping vs. robotic grasping, where the control strategy was set by the experimenter. We controlled the global hand stiffness of the robotic hand and found that this control algorithm produced a force pattern qualitatively similar to human grasping performance. Our results suggest that the nervous system uses a default whole hand synergistic control to maintain a stable grasp regardless of the number of digits involved in the task, their position on the objects, and the type and frequency of external perturbations. NEW & NOTEWORTHY: We studied hand grasping using a sensorized object allowing unconstrained finger placement. During object perturbation, the time to reach the peak force was roughly equal across fingers, consistent with a synergistic stiffening across fingers. Force distribution of a robotic grasping hand, where the control algorithm is based on global hand stiffness, was qualitatively similar to human grasping.
This suggests that the central nervous system uses a default whole hand synergistic control to maintain a stable grasp. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
22. Perceptual attraction in tool use: evidence for a reliability-based weighting mechanism.
- Author
-
Debats, Nienke B., Ernst, Marc O., and Heuer, Herbert
- Subjects
GEOGRAPHICAL perception ,ATTRACTION (Physics) ,RELIABILITY in engineering ,TESTING ,MECHANISM (Philosophy) - Abstract
Humans are well able to operate tools whereby their hand movement is linked, via a kinematic transformation, to a spatially distant object moving in a separate plane of motion. An everyday example is controlling a cursor on a computer monitor. Despite these separate reference frames, the perceived positions of the hand and the object were found to be biased toward each other. We propose that this perceptual attraction is based on the principles by which the brain integrates redundant sensory information of single objects or events, known as optimal multisensory integration. That is, 1) sensory information about the hand and the tool are weighted according to their relative reliability (i.e., inverse variances), and 2) the unisensory reliabilities sum up in the integrated estimate. We assessed whether perceptual attraction is consistent with optimal multisensory integration model predictions. We used a cursor-control tool-use task in which we manipulated the relative reliability of the unisensory hand and cursor position estimates. The perceptual biases shifted according to these relative reliabilities, with an additional bias due to contextual factors that were present in experiment 1 but not in experiment 2. The biased position judgments’ variances were, however, systematically larger than the predicted optimal variances. Our findings suggest that the perceptual attraction in tool use results from a reliability-based weighting mechanism similar to optimal multisensory integration, but that certain boundary conditions for optimality might not be satisfied. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
23. Statistically Optimal Multisensory Cue Integration: A Practical Tutorial.
- Author
-
Rohde, Marieke, van Dam, Loes C. J., and Ernst, Marc O.
- Subjects
PERCEPTUAL motor learning ,SENSORY perception ,JUDGMENT (Psychology) ,COGNITIVE ability ,PSYCHOPHYSICS ,BAYESIAN analysis - Abstract
Humans combine redundant multisensory estimates into a coherent multimodal percept. Experiments in cue integration have shown for many modality pairs and perceptual tasks that multisensory information is fused in a statistically optimal manner: observers take the unimodal sensory reliability into consideration when performing perceptual judgments. They combine the senses according to the rules of Maximum Likelihood Estimation to maximize overall perceptual precision. This tutorial explains in an accessible manner how to design optimal cue integration experiments and how to analyse the results from these experiments to test whether humans follow the predictions of the optimal cue integration model. The tutorial is meant for novices in multisensory integration and requires very little training in formal models and psychophysical methods. For each step in the experimental design and analysis, rules of thumb and practical examples are provided. We also publish Matlab code for an example experiment on cue integration and a Matlab toolbox for data analysis that accompanies the tutorial online. This way, readers can learn about the techniques by trying them out themselves. We hope to provide readers with the tools necessary to design their own experiments on optimal cue integration and enable them to take part in explaining when, why and how humans combine multisensory information optimally. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
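The tutorial above centers on Maximum Likelihood Estimation, in which each cue is weighted by its relative reliability (inverse variance) and the fused estimate attains the highest possible precision. A minimal sketch of that fusion rule (the function name and example values are illustrative, not the tutorial's published Matlab code):

```python
import numpy as np

def mle_fuse(estimates, variances):
    """Fuse redundant unimodal estimates by inverse-variance weighting (MLE).
    Returns the combined estimate and its (reduced) variance."""
    v = np.asarray(variances, dtype=float)
    w = (1.0 / v) / np.sum(1.0 / v)          # reliability-proportional weights
    fused = float(np.dot(w, estimates))       # weighted mean
    fused_var = 1.0 / np.sum(1.0 / v)         # reliabilities add up
    return fused, fused_var

# Example: visual estimate 10.0 (variance 1.0), haptic estimate 14.0 (variance 4.0)
est, var = mle_fuse([10.0, 14.0], [1.0, 4.0])
# weights 0.8 and 0.2 -> est = 10.8; fused variance 0.8, below either unimodal variance
```

Note that the fused variance (0.8) is smaller than the more reliable cue's variance alone (1.0), which is the empirical signature of optimal integration that the tutorial's experiments are designed to test.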
24. The role of vibration in tactile speed perception.
- Author
-
Dallmann, Chris J., Ernst, Marc O., and Moscatelli, Alessandro
- Subjects
SENSORY perception ,SKIN diseases ,PSYCHOPHYSICS ,VIBROTACTILE stimulation ,AVERSIVE stimuli - Abstract
The relative motion between the surface of an object and our fingers produces patterns of skin deformation such as stretch, indentation, and vibrations. In this study, we hypothesized that motion-induced vibrations are combined with other tactile cues for the discrimination of tactile speed. Specifically, we hypothesized that vibrations provide a critical cue to tactile speed on surfaces lacking individually detectable features like dots or ridges. Thus masking vibrations unrelated to slip motion should impair the discriminability of tactile speed, and the effect should be surface-dependent. To test this hypothesis, we measured the precision of participants in discriminating the speed of moving surfaces having either a fine or a ridged texture, while adding masking vibratory noise in the working range of the fast-adapting mechanoreceptive afferents. Vibratory noise significantly reduced the precision of speed discrimination, and the effect was much stronger on the fine-textured than on the ridged surface. On both surfaces, masking vibrations at intermediate frequencies of 64 Hz (65-μm peak-to-peak amplitude) and 128 Hz (10 μm) had the strongest effect, followed by high-frequency vibrations of 256 Hz (1 μm) and low-frequency vibrations of 32 Hz (50 and 25 μm). These results are consistent with our hypothesis that slip-induced vibrations contribute to the discrimination of tactile speed. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
25. Coordination of multi-digit positions and forces during unconstrained grasping in response to object perturbations.
- Author
-
Naceri, Abdeldjallil, Moscatelli, Alessandro, Santello, Marco, and Ernst, Marc O.
- Published
- 2014
- Full Text
- View/download PDF
26. Multi-digit Position and Force Coordination in Three- and Four-Digit Grasping.
- Author
-
Naceri, Abdeldjallil, Moscatelli, Alessandro, Santello, Marco, and Ernst, Marc O.
- Published
- 2014
- Full Text
- View/download PDF
27. Computational Aspects of Softness Perception.
- Author
-
Di Luca, Massimiliano and Ernst, Marc O.
- Published
- 2014
- Full Text
- View/download PDF
28. Mapping Shape to Visuomotor Mapping: Learning and Generalisation of Sensorimotor Behaviour Based on Contextual Information.
- Author
-
van Dam, Loes C. J. and Ernst, Marc O.
- Subjects
VISUOMOTOR coordination ,PSYCHOLOGICAL feedback ,ASSOCIATIVE learning ,BAYESIAN analysis ,MATHEMATICAL mappings - Abstract
Humans can learn and store multiple visuomotor mappings (dual-adaptation) when feedback for each is provided alternately. Moreover, learned context cues associated with each mapping can be used to switch between the stored mappings. However, little is known about the associative learning between cue and required visuomotor mapping, and how learning generalises to novel but similar conditions. To investigate these questions, participants performed a rapid target-pointing task while we manipulated the offset between visual feedback and movement end-points. The visual feedback was presented with horizontal offsets of different amounts, dependent on the target's shape. Participants thus needed to use different visuomotor mappings between target location and required motor response depending on the target shape in order to “hit” it. The target shapes were taken from a continuous set of shapes, morphed between spiky and circular shapes. After training, we tested participants' performance, without feedback, on different target shapes that had not been learned previously. We compared two hypotheses. First, we hypothesised that participants could (explicitly) extract the linear relationship between target shape and visuomotor mapping and generalise accordingly. Second, using previous findings of visuomotor learning, we developed an (implicit) Bayesian learning model that predicts generalisation that is more consistent with categorisation (i.e. use one mapping or the other). The experimental results show that, although learning the associations requires explicit awareness of the cues’ role, participants apply the mapping corresponding to the trained shape that is most similar to the current one, consistent with the Bayesian learning model. Furthermore, the Bayesian learning model predicts that learning should slow down with increased numbers of training pairs, which was confirmed by the present results.
In short, we found a good correspondence between the Bayesian learning model and the empirical results indicating that this model poses a possible mechanism for simultaneously learning multiple visuomotor mappings. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
29. Enabling Unconstrained Omnidirectional Walking Through Virtual Environments: An Overview of the CyberWalk Project.
- Author
-
Frissen, Ilja, Campos, Jennifer L., Sreenivasa, Manish, and Ernst, Marc O.
- Published
- 2013
- Full Text
- View/download PDF
30. Introducing the shape-length illusion.
- Author
-
Plaisier, Myrthe A. and Ernst, Marc O.
- Published
- 2013
- Full Text
- View/download PDF
31. Two Hands Perceive Better Than One.
- Author
-
Plaisier, Myrthe A. and Ernst, Marc O.
- Published
- 2012
- Full Text
- View/download PDF
32. Active Movement Reduces the Tactile Discrimination Performance.
- Author
-
Vitello, Marco P., Fritschi, Michael, and Ernst, Marc O.
- Published
- 2012
- Full Text
- View/download PDF
33. Multimodale Objektwahrnehmung.
- Author
-
Ernst, Marc O. and Rohde, Marieke
- Published
- 2012
- Full Text
- View/download PDF
34. Human Haptic Perception and the Design of Haptic-Enhanced Virtual Environments.
- Author
-
Bresciani, Jean-Pierre, Drewing, Knut, and Ernst, Marc O.
- Abstract
This chapter presents an overview of interesting scientific findings related to human haptic perception and discusses the usability of these scientific findings for the design and development of virtual environments including haptic rendering. The first section of the chapter deals with pure haptic perception, whereas the second and third sections are devoted to the integration of kinesthetic information with other sensory inputs like vision and audition. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
35. Multi-modal VR Systems.
- Author
-
Fritschi, Michael, Esen, Hasan, Buss, Martin, and Ernst, Marc O.
- Abstract
This chapter presents novel multi-modal and integrated systems developed in the laboratories of the Institute of Automatic Control Engineering, Technische Universität München. First, kinesthetic, tactile, visual and acoustic hardware used for multi-modal systems are introduced individually. Then the integration of the hardware into multi-modal VR systems and chosen applications are explained. The kinesthetic-tactile integrated systems are evaluated. The objective of the evaluations has been the study of the psychophysical correlation between the tactile and the kinesthetic portion of haptic information. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
36. Haptic perception in interaction with other senses.
- Author
-
Helbig, Hannah B. and Ernst, Marc O.
- Abstract
Human perception is inherently multisensory: we perceive the world simultaneously with multiple senses. While strolling the farmers' market, for example, we might become aware of the presence of a delicious fruit by its characteristic smell. We might use our senses of vision and touch to identify the fruit by its typical size and shape and touch it to select only that one with the distinctive soft texture that signals 'ripe'. When we take a bite of the fruit, we taste its characteristic flavour and hear a slight smacking sound which confirms that the fruit we perceive with our senses of vision, touch, audition, smell and taste is a ripe, delicious peach. That is, in the natural environment the information delivered by our sense of touch is combined with information gathered by each of the other senses to create a robust percept. Combining information from multiple systems is essential because no information-processing system, neither technical nor biological, is powerful enough to provide a precise and accurate sensory estimate under all conditions. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
37. Natural auditory scene statistics shapes human spatial hearing.
- Author
-
Parise, Cesare V., Knorre, Katharina, and Ernst, Marc O.
- Subjects
AUDITORY scene analysis ,SENSORY perception ,SPATIAL ability ,HEARING ,EAR testing - Abstract
Human perception, cognition, and action are laced with seemingly arbitrary mappings. In particular, sound has a strong spatial connotation: Sounds are high and low, melodies rise and fall, and pitch systematically biases perceived sound elevation. The origins of such mappings are unknown. Are they the result of physiological constraints, do they reflect natural environmental statistics, or are they truly arbitrary? We recorded natural sounds from the environment, analyzed the elevation-dependent filtering of the outer ear, and measured frequency-dependent biases in human sound localization. We find that auditory scene statistics reveals a clear mapping between frequency and elevation. Perhaps more interestingly, this natural statistical mapping is tightly mirrored in both ear-filtering properties and in perceived sound location. This suggests that both sound localization behavior and ear anatomy are fine-tuned to the statistics of natural auditory scenes, likely providing the basis for the spatial connotation of human hearing. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
38. The Duration of Uncertain Times: Audiovisual Information about Intervals Is Integrated in a Statistically Optimal Fashion.
- Author
-
Hartcher-O'Brien, Jess, Di Luca, Massimiliano, and Ernst, Marc O.
- Subjects
AUDIOVISUAL materials ,INFORMATION resources ,HEALTH policy ,MEDICAL statistics ,MENTAL health ,SENSORY perception - Abstract
Often multisensory information is integrated in a statistically optimal fashion where each sensory source is weighted according to its precision. This integration scheme is statistically optimal because it theoretically results in unbiased perceptual estimates with the highest precision possible. There is a current lack of consensus about how the nervous system processes multiple sensory cues to elapsed time. In order to shed light upon this, we adopt a computational approach to pinpoint the integration strategy underlying duration estimation of audio/visual stimuli. One of the assumptions of our computational approach is that the multisensory signals redundantly specify the same stimulus property. Our results clearly show that despite claims to the contrary, perceived duration is the result of an optimal weighting process, similar to that adopted for estimates of space. That is, participants weight the audio and visual information to arrive at the most precise, single duration estimate possible. The work also disentangles how different integration strategies (i.e. considering the time of onset/offset of signals) might alter the final estimate. As such we provide the first concrete evidence of an optimal integration strategy in human duration estimates. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
39. Knowing Each Random Error of Our Ways, but Hardly Correcting for It: An Instance of Optimal Performance.
- Author
-
van Dam, Loes C. J. and Ernst, Marc O.
- Subjects
SENSORIMOTOR cortex ,FALLIBILITY ,ERRORS & omissions insurance ,NEUROSCIENCES ,COMPUTATIONAL neuroscience ,COGNITIVE science - Abstract
Random errors are omnipresent in sensorimotor tasks due to perceptual and motor noise. The question is, are humans aware of their random errors on an instance-by-instance basis? The appealing answer would be ‘no’ because it seems intuitive that humans would otherwise immediately correct for the errors online, thereby increasing sensorimotor precision. However, here we show the opposite. Participants pointed to visual targets with varying degrees of feedback. After movement completion, participants indicated whether they believed they landed left or right of target. Surprisingly, participants' left/right-discriminability was well above chance, even without visual feedback. Only when forced to correct for the error after movement completion did participants lose knowledge about the remaining error, indicating that random errors can only be accessed offline. When correcting, participants applied the optimal correction gain, a weighting factor between perceptual and motor noise, minimizing end-point variance. Together these results show that humans optimally combine direct information about sensorimotor noise in the system (the current random error), with indirect knowledge about the variance of the perceptual and motor noise distributions. Yet, they only appear to do so offline after movement completion, not while the movement is still in progress, suggesting that during movement proprioceptive information is less precise. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
40. Touching Curvature and Feeling Size: a Contrast Illusion.
- Author
-
Plaisier, Myrthe A. and Ernst, Marc O.
- Subjects
TOUCH ,PROPRIOCEPTION ,FINGERS ,CURVATURE - Abstract
We know that our eyes can be deceiving. Here we demonstrate that we should not always trust our sense of touch either. Previous studies have shown that when pinching an object between thumb and index finger, we can under many circumstances accurately perceive its size. In contrast, the current results show that the local curvature at the areas of contact between the object and the fingers causes systematic under- or overestimation of the object's size. This is rather surprising given that local curvature is not directly related to the object's size. We suggest an explanation in terms of a contrast between the finger separation and an inferred relationship between local curvature and size. This study provides the first demonstration of an illusory haptic size percept caused by local curvature in a pinch grip. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
41. Cross-correlation between Auditory and Visual Signals Promotes Multisensory Integration.
- Author
-
Parise, Cesare V., Harrar, Vanessa, Ernst, Marc O., and Spence, Charles
- Subjects
AUDITORY perception ,CROSS correlation ,VISUAL perception ,SENSORY evaluation ,PROBLEM solving - Abstract
Humans are equipped with multiple sensory channels that provide both redundant and complementary information about the objects and events in the world around them. A primary challenge for the brain is therefore to solve the 'correspondence problem', that is, to bind those signals that likely originate from the same environmental source, while keeping separate those unisensory inputs that likely belong to different objects/events. Whether multiple signals have a common origin or not must, however, be inferred from the signals themselves through a causal inference process. Recent studies have demonstrated that cross-correlation, that is, the similarity in temporal structure between unimodal signals, represents a powerful cue for solving the correspondence problem in humans. Here we provide further evidence for the role of the temporal correlation between auditory and visual signals in multisensory integration. Capitalizing on the well-known fact that sensitivity to crossmodal conflict is inversely related to the strength of coupling between the signals, we measured sensitivity to crossmodal spatial conflicts as a function of the cross-correlation between the temporal structures of the audiovisual signals. Observers' performance was systematically modulated by the cross-correlation, with lower sensitivity to crossmodal conflict being measured for correlated as compared to uncorrelated audiovisual signals. These results therefore provide support for the claim that cross-correlation promotes multisensory integration. A Bayesian framework is proposed to interpret the present results, whereby stimulus correlation is represented on the prior distribution of expected crossmodal co-occurrence. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
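The study above uses the similarity in temporal structure between auditory and visual signals (their cross-correlation) as a cue for solving the correspondence problem. A hypothetical sketch of how such a correlation index might be computed (the function and signal names are illustrative assumptions, not the authors' analysis pipeline):

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Peak of the normalized cross-correlation between two equal-length
    signals: a simple index of shared temporal structure across modalities
    (1.0 means identical structure at some lag)."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.max(np.correlate(a, b, mode="full")))

# Toy audiovisual signals: one auditory stream sharing the visual temporal
# structure (plus noise), one statistically independent of it.
rng = np.random.default_rng(0)
visual = rng.standard_normal(500)
audio_corr = visual + 0.1 * rng.standard_normal(500)  # shared structure
audio_uncorr = rng.standard_normal(500)               # independent stream
# the correlated pair yields a far higher peak, favoring binding
```

On the abstract's account, signal pairs with a high index like this one would be more strongly coupled, and hence less sensitive to crossmodal spatial conflict.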
42. Changes of Hand Switching Costs during Bimanual Sequential Learning.
- Author
-
Trapp, Sabrina, Lepsien, Jóran, Sehm, Bernhard, Villringer, Arno, Ragert, Patrick, and Ernst, Marc O.
- Subjects
TASK performance ,REACTION time ,FINGERS ,HAND ,MOVEMENT sequences ,EYE-hand coordination ,MOTOR ability - Abstract
Many tasks in our daily life demand not only the use of different fingers of one hand in a serial fashion, but also to alternate from one hand to the other. Here, we investigated performance in a bimanual serial reaction time task (SRTT) with particular emphasis on learning-related changes in reaction time (RT) for consecutive button presses for homologous index- and middle fingers. The bimanual SRTT consisted of sequential button presses either with the left or right index- and middle-finger to a series of visual letters displayed on a computer screen. Each letter was assigned a specific button press with one of four fingers. Two outcome measures were investigated: (a) global sequence learning as defined by the time needed to complete a 15-letter SRTT sequence and (b) changes in hand switch costs across learning. We found that bimanual SRTT resulted in a global decrease in RT during the time course of learning that persisted for at least two weeks. Furthermore, RT to a button press showed an increase when the previous button press was associated with another hand as opposed to the same hand. This increase in RT was defined as switch costs. Hand switch costs significantly decreased during the time course of learning, and remained stable over a time of approximately two weeks. This study provides evidence for modulations of switch costs during bimanual sequence learning, a finding that might have important implications for theories of bimanual coordination and learning. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
43. Tactile Motion Adaptation Reduces Perceived Speed but Shows No Evidence of Direction Sensitivity.
- Author
-
McIntyre, Sarah, Holcombe, Alex O., Birznieks, Ingvars, Seizova-Cajic, Tatjana, and Ernst, Marc O.
- Subjects
TOUCH ,MOTION ,SPEED ,HAND ,STIMULUS & response (Biology) ,PHYSIOLOGICAL adaptation - Abstract
Introduction: While the directionality of tactile motion processing has been studied extensively, tactile speed processing and its relationship to direction is little-researched and poorly understood. We investigated this relationship in humans using the 'tactile speed aftereffect' (tSAE), in which the speed of motion appears slower following prolonged exposure to a moving surface. Method: We used psychophysical methods to test whether the tSAE is direction sensitive. After adapting to a ridged moving surface with one hand, participants compared the speed of test stimuli on the adapted and unadapted hands. We varied the direction of the adapting stimulus relative to the test stimulus. Results: Perceived speed of the surface moving at 81 mm/s was reduced by about 30% regardless of the direction of the adapting stimulus (when adapted in the same direction, mean reduction = 23 mm/s, SD = 11; with opposite direction, mean reduction = 26 mm/s, SD = 9). In addition to a large reduction in perceived speed due to adaptation, we also report that this effect is not direction sensitive. Conclusions: Tactile motion is susceptible to speed adaptation. This result complements previous reports of reliable direction aftereffects when using a dynamic test stimulus, as together they describe how perception of a moving stimulus in touch depends on the immediate history of stimulation. Given that the tSAE is not direction sensitive, we argue that peripheral adaptation does not explain it, because primary afferents are direction sensitive with friction-creating stimuli like ours (thus motion in their preferred direction should result in greater adaptation, and if perceived speed were critically dependent on these afferents' response intensity, the tSAE should be direction sensitive). The adaptation that reduces perceived speed therefore seems to be of central origin. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
44. Objective Fidelity Evaluation in Multisensory Virtual Environments: Auditory Cue Fidelity in Flight Simulation.
- Author
-
Meyer, Georg F., Li Ting Wong, Timson, Emma, Perfect, Philip, White, Mark D., and Ernst, Marc O.
- Subjects
VIRTUAL reality ,KINEMATICS ,SIMULATION methods & models ,CASE studies ,PROMPTS (Psychology) ,TASK analysis - Abstract
We argue that objective fidelity evaluation of virtual environments, such as flight simulation, should be human-performance-centred and task-specific rather than measure the match between simulation and physical reality. We show how principled experimental paradigms and behavioural models to quantify human performance in simulated environments that have emerged from research in multisensory perception provide a framework for the objective evaluation of the contribution of individual cues to human performance measures of fidelity. We present three examples in a flight simulation environment as a case study: Experiment 1: Detection and categorisation of auditory and kinematic motion cues; Experiment 2: Performance evaluation in a target-tracking task; Experiment 3: Transferrable learning of auditory motion cues. We show how the contribution of individual cues to human performance can be robustly evaluated for each task and that the contribution is highly task dependent. The same auditory cues that can be discriminated and are optimally integrated in experiment 1 do not contribute to target-tracking performance in an in-flight refuelling simulation without training (experiment 2). In experiment 3, however, we demonstrate that the auditory cue leads to significant, transferrable, performance improvements with training. We conclude that objective fidelity evaluation requires a task-specific analysis of the contribution of individual cues. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
45. Is "Circling" Behavior in Humans Related to Postural Asymmetry?
- Author
-
Bestaven, Emma, Guillaud, Etienne, Cazalets, Jean-René, and Ernst, Marc O.
- Subjects
HUMAN behavior research ,POSTURE ,WALKING ,BIOMECHANICS research ,SYMMETRY (Biology) ,HUMAN attitude & movement - Abstract
In attempting to walk rectilinearly in the absence of visual landmarks, persons will gradually turn in a circle to eventually become lost. The aim of the present study was to provide insights into the possible underlying mechanisms of this behavior. For each subject (N = 15) six trajectories were monitored during blindfolded walking in a large enclosed area to suppress external cues and ground irregularities that may elicit unexpected changes in direction. There was a substantial variability from trial to trial for a given subject and between subjects, who could veer either very early or relatively late. Of the total number of trials, 50% of trajectories terminated on the left side, 39% on the right side and 11% were defined as "straight". For each subject, we established a "turning score" that reflected his/her preferential side of veering. The turning score was found to be unrelated to any evident biomechanical asymmetry or functional dominance (eye, hand…). Posturographic analysis, used to assess if there was a relationship between functional postural asymmetry and veering, revealed that the mean position of the center of foot pressure during balance tests was correlated with the turning score. Finally, we established that the mean position of the center of pressure was correlated with perceived verticality assessed by a subjective verticality test. Together, our results suggest that veering is related to a "sense of straight ahead" that could be shaped by vestibular inputs. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
46. The Effects of Rhythmic Sensory Cues on the Temporal Dynamics of Human Gait.
- Author
-
Sejdić, Ervin, Fu, Yingying, Pak, Alison, Fairley, Jillian A., Chau, Tom, and Ernst, Marc O.
- Subjects
AUDITORY perception ,GAIT in humans ,WALKING ,MUSCULOSKELETAL system ,REHABILITATION ,RHYTHM - Abstract
Walking is a complex, rhythmic task performed by the locomotor system. However, natural gait rhythms can be influenced by metronomic auditory stimuli, a phenomenon of particular interest in neurological rehabilitation. In this paper, we examined the effects of auditory, visual and tactile rhythmic cues on the temporal dynamics associated with human gait. Data were collected from fifteen healthy adults in two sessions. Each session consisted of five 15-minute trials. In the first trial of each session, participants walked at their preferred walking speed. In subsequent trials, participants were asked to walk to a metronomic beat, provided through visual, auditory, tactile or all three cues (simultaneously and in sync), the pace of which was set to the preferred walking speed of the first trial. From the collected data, we extracted several parameters: gait speed, mean stride interval, stride interval variability, scaling exponent and maximum Lyapunov exponent. The extracted parameters showed that rhythmic sensory cues affect the temporal dynamics of human gait. The auditory rhythmic cue had the greatest influence on the gait parameters, while the visual cue had no statistically significant effect on the scaling exponent. These results suggest that visual rhythmic cues could be considered an alternative cueing modality in rehabilitation without concern of adversely altering the statistical persistence of walking. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
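The scaling exponent named in the abstract above is conventionally obtained by detrended fluctuation analysis (DFA) of the stride-interval series. The following is a minimal illustrative sketch of DFA (the window sizes and the test signal are assumptions, not the study's settings); uncorrelated noise yields an exponent near 0.5, while persistent gait series yield values above 0.5:

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))               # integrated profile of the series
    fluctuations = []
    for n in scales:
        n_windows = len(y) // n
        f2 = []
        for i in range(n_windows):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)       # linear detrend within each window
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))  # RMS fluctuation F(n)
    slope, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return slope

# White noise gives alpha near 0.5; long-range persistence gives alpha > 0.5
rng = np.random.default_rng(0)
alpha = dfa_alpha(rng.standard_normal(4096))
```

In this framing, "statistical persistence of walking" corresponds to alpha staying above 0.5 under cueing, rather than collapsing toward the uncorrelated value.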
47. The Rubber Hand Illusion: Feeling of Ownership and Proprioceptive Drift Do Not Go Hand in Hand.
- Author
-
Rohde, Marieke, Di Luca, Massimiliano, and Ernst, Marc O.
- Subjects
ILLUSION (Philosophy) ,PERCEPTUAL motor learning ,ROBOTS ,MIRRORS ,PROPRIOCEPTION ,PHYSICS experiments ,QUESTIONNAIRES ,FREQUENCIES of oscillating systems - Abstract
In the Rubber Hand Illusion, the feeling of ownership of a rubber hand displaced from a participant's real occluded hand is evoked by synchronously stroking both hands with paintbrushes. A change of perceived finger location towards the rubber hand (proprioceptive drift) has been reported to correlate with this illusion. To measure the time course of proprioceptive drift during the Rubber Hand Illusion, we regularly interrupted stroking (performed by robot arms) to measure perceived finger location. Measurements were made by projecting a probe dot into the field of view (using a semi-transparent mirror) and asking participants if the dot is to the left or to the right of their invisible hand (Experiment 1) or to adjust the position of the dot to that of their invisible hand (Experiment 2). We varied both the measurement frequency (every 10 s, 40 s, 120 s) and the mode of stroking (synchronous, asynchronous, just vision). Surprisingly, with frequent measurements, proprioceptive drift occurs not only in the synchronous stroking condition but also in the two control conditions (asynchronous stroking, just vision). Proprioceptive drift in the synchronous stroking condition is never higher than in the just vision condition. Only continuous exposure to asynchronous stroking prevents proprioceptive drift and thus replicates the differences in drift reported in the literature. By contrast, complementary subjective ratings (questionnaire) show that the feeling of ownership requires synchronous stroking and is not present in the asynchronous stroking condition. Thus, subjective ratings and drift are dissociated. We conclude that different mechanisms of multisensory integration are responsible for proprioceptive drift and the feeling of ownership. Proprioceptive drift relies on visuoproprioceptive integration alone, a process that is inhibited by asynchronous stroking, the most common control condition in Rubber Hand Illusion experiments. 
This dissociation implies that conclusions about feelings of ownership cannot be drawn from measuring proprioceptive drift alone. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
48. Tactile suppression of displacement.
- Author
-
Ziat, Mounia, Hayward, Vincent, Chapman, C. Elaine, Ernst, Marc O., and Lenay, Charles
- Subjects
TOUCH ,BRAILLE ,WRITTEN communication ,TACTILE adaptation ,VISUAL perception - Abstract
In vision, the discovery of the phenomenon of saccadic suppression of displacement has made important contributions to the understanding of the stable world problem. Here, we report a similar phenomenon in the tactile modality. When scanning a single Braille dot with two fingers of the same hand, participants were asked to decide whether the dot was stationary or whether it was displaced from one location to another. The stimulus was produced by refreshable Braille devices whose dots can be swiftly raised and recessed. In some conditions, the dot was stationary. In others, a displacement was created by monitoring the participant's finger position and switching the dot activation when it was not touched by either finger. The dot displacement was either 2.5 mm or 5 mm. We found that in certain cases, displaced dots were felt to be stationary. If the displacement was orthogonal to the finger movements, tactile suppression occurred effectively when it was 2.5 mm, but when the displacement was 5 mm, the participants easily detected it. If the displacement was medial-lateral, the suppression effect occurred as well, but less often when the apparent movement of the dot opposed the movement of the finger. In such cases, the stimulus appeared sooner than it could be predicted from finger movement, supporting a predictive rather than a postdictive differential processing hypothesis. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
49. Within- and Cross-Modal Distance Information Disambiguate Visual Size-Change Perception.
- Author
-
Battaglia, Peter W., Di Luca, Massimiliano, Ernst, Marc O., Schrater, Paul R., Machulla, Tonja, and Kersten, Daniel
- Subjects
VISUAL perception ,SIZE perception ,SENSES ,PRIOR learning ,COGNITION - Abstract
Perception is fundamentally underconstrained because different combinations of object properties can generate the same sensory information. To disambiguate sensory information into estimates of scene properties, our brains incorporate prior knowledge and additional "auxiliary" (i.e., not directly relevant to the desired scene property) sensory information to constrain perceptual interpretations. For example, knowing the distance to an object helps in perceiving its size. The literature contains few demonstrations of the use of prior knowledge and auxiliary information in combined visual and haptic disambiguation, and almost no examination of haptic disambiguation of vision beyond "bistable" stimuli. Previous studies have reported that humans integrate multiple unambiguous sensations to perceive single, continuous object properties, like size or position. Here we test whether humans use visual and haptic information, individually and jointly, to disambiguate size from distance. We presented participants with a ball moving in depth with a changing diameter. Because no unambiguous distance information is available under monocular viewing, participants rely on prior assumptions about the ball's distance to disambiguate their size percept. Presenting auxiliary binocular and/or haptic distance information augments participants' prior distance assumptions and improves their size judgment accuracy, though binocular cues were trusted more than haptic cues. Our results suggest both visual and haptic distance information disambiguate size perception, and we interpret these results in the context of probabilistic perceptual reasoning. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
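The probabilistic reasoning described in the abstract above can be sketched as a Gaussian prior over distance sharpened by an auxiliary distance cue, after which size follows from the retinal angle. This is an illustrative sketch only; the function and all values are hypothetical, not the study's model or data:

```python
def posterior(prior_mean, prior_var, cue_mean, cue_var):
    """Product of two Gaussians: prior over distance times an auxiliary distance cue."""
    w_prior, w_cue = 1.0 / prior_var, 1.0 / cue_var
    mean = (w_prior * prior_mean + w_cue * cue_mean) / (w_prior + w_cue)
    return mean, 1.0 / (w_prior + w_cue)

retinal_angle = 0.02                      # radians, illustrative
d_mean, d_var = posterior(prior_mean=1.0, prior_var=0.25,
                          cue_mean=2.0, cue_var=0.25)
size_estimate = retinal_angle * d_mean    # small-angle approximation: size ≈ angle × distance
```

Because retinal angle alone confounds size and distance, any information that tightens the distance posterior (here, the auxiliary cue halves its variance) directly improves the size estimate.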
50. Visually Guided Haptic Search.
- Author
-
Plaisier, Myrthe A., Kappers, Astrid M. L., Tiest, Wouter M. Bergmann, and Ernst, Marc O.
- Published
- 2010
- Full Text
- View/download PDF