38 results for "Murray, Micah M."
Search Results
2. Neonatal Multisensory Processing in Preterm and Term Infants Predicts Sensory Reactivity and Internalizing Tendencies in Early Childhood
- Author
- Maitre, Nathalie L., Key, Alexandra P., Slaughter, James C., Yoder, Paul J., Neel, Mary Lauren, Richard, Céline, Wallace, Mark T., and Murray, Micah M.
- Published
- 2020
3. Randomized controlled trial protocol to improve multisensory neural processing, language and motor outcomes in preterm infants
- Author
- Neel, Mary Lauren, Yoder, Paul, Matusz, Pawel J., Murray, Micah M., Miller, Ashley, Burkhardt, Stephanie, Emery, Lelia, Hague, Kaleigh, Pennington, Caitlin, Purnell, Jessica, Lightfoot, Megan, and Maitre, Nathalie L.
- Published
- 2019
4. The COGs (context, object, and goals) in multisensory processing
- Author
- ten Oever, Sanne, Romei, Vincenzo, van Atteveldt, Nienke, Soto-Faraco, Salvador, Murray, Micah M., and Matusz, Pawel J.
- Published
- 2016
5. How single-trial electrical neuroimaging contributes to multisensory research
- Author
- Gonzalez Andino, Sara L., Murray, Micah M., Foxe, John J., and Grave de Peralta Menendez, Rolando
- Published
- 2005
6. Auditory Enhancement of Illusory Contour Perception.
- Author
- Tivadar, Ruxandra I., Gaglianese, Anna, and Murray, Micah M.
- Subjects
- SOUNDS, NOISE, ELECTROPHYSIOLOGY, KNOWLEDGE gap theory, SENSORY perception
- Abstract
Illusory contours (ICs) are borders that are perceived in the absence of contrast gradients. Until recently, IC processes were considered exclusively visual in nature and presumed to be unaffected by information from other senses. Electrophysiological data in humans indicate that sounds can enhance IC processes. Despite cross-modal enhancement being observed at the neurophysiological level, to date there is no evidence of direct amplification of behavioural performance in IC processing by sounds. We addressed this knowledge gap. Healthy adults (n = 15) discriminated instances when inducers were arranged to form an IC from instances when no IC was formed (NC). Inducers were low-contrast and masked, and there was continuous background acoustic noise throughout a block of trials. On half of the trials, i.e., independently of IC vs NC, a 1000-Hz tone was presented synchronously with the inducer stimuli. Sound presence improved the accuracy of indicating when an IC was presented, but had no impact on performance with NC stimuli (significant IC presence/absence × Sound presence/absence interaction). There was no evidence that this was due to general alerting or to a speed–accuracy trade-off (no main effect of sound presence on accuracy rates and no comparable significant interaction on reaction times). Moreover, sound presence increased sensitivity and reduced bias on the IC vs NC discrimination task. These results demonstrate that multisensory processes augment mid-level visual functions, exemplified by IC processes. Aside from the impact on neurobiological and computational models of vision, our findings may prove clinically beneficial for low-vision or sight-restored patients. [ABSTRACT FROM AUTHOR]
- Published
- 2021
7. Mental Rotation of Digitally-Rendered Haptic Objects.
- Author
- Tivadar, Ruxandra I., Rouillard, Tom, Chappaz, Cédrick, Knebel, Jean-François, Turoman, Nora, Anaflous, Fatima, Roche, Jean, Matusz, Pawel J., and Murray, Micah M.
- Subjects
- VISION disorders, BLINDNESS, MENTAL rotation, TOUCH, SNOEZELEN
- Abstract
Sensory substitution is an effective means to rehabilitate many visual functions after visual impairment or blindness. Tactile information, for example, is particularly useful for functions such as reading, mental rotation, shape recognition, or exploration of space. Extant haptic technologies typically rely on real physical objects or pneumatically driven renderings and thus provide a limited library of stimuli to users. New developments in digital haptic technologies now make it possible to actively simulate an unprecedented range of tactile sensations. We provide a proof-of-concept for a new type of technology (hereafter haptic tablet) that renders haptic feedback by modulating the friction of a flat screen through ultrasonic vibrations of varying shapes to create the sensation of texture when the screen is actively explored. We reasoned that participants should be able to create mental representations of letters presented in normal and mirror-reversed haptic form without the use of any visual information and to manipulate such representations in a mental rotation task. Healthy sighted, blindfolded volunteers were trained to discriminate between two letters (either L and P, or F and G; counterbalanced across participants) on a haptic tablet. They then tactually explored all four letters in normal or mirror-reversed form at different rotations (0°, 90°, 180°, and 270°) and indicated letter form (i.e., normal or mirror-reversed) by pressing one of two mouse buttons. We observed the typical effect of rotation angle on object discrimination performance (i.e., greater deviation from 0° resulted in worse performance) for trained letters, consistent with mental rotation of these haptically-rendered objects. We likewise observed generally slower and less accurate performance with mirror-reversed compared to prototypically oriented stimuli. 
Our findings extend existing research in multisensory object recognition by indicating that a new technology simulating active haptic feedback can support the generation and spatial manipulation of mental representations of objects. Thus, such haptic tablets can offer a new avenue to mitigate visual impairments and train skills dependent on mental object-based representations and their spatial manipulation. [ABSTRACT FROM AUTHOR]
- Published
- 2019
8. Multisensory Processes: A Balancing Act across the Lifespan.
- Author
- Murray, Micah M., Lewkowicz, David J., Amedi, Amir, and Wallace, Mark T.
- Subjects
- PERCEPTUAL motor learning, SENSORIMOTOR integration, AGING, BRAIN function localization, DYNAMICAL systems
- Abstract
Multisensory processes are fundamental in scaffolding perception, cognition, learning, and behavior. How and when stimuli from different sensory modalities are integrated rather than treated as separate entities is poorly understood. We review how the relative reliance on stimulus characteristics versus learned associations dynamically shapes multisensory processes. We illustrate the dynamism in multisensory function across two timescales: one long term that operates across the lifespan and one short term that operates during the learning of new multisensory relations. In addition, we highlight the importance of task contingencies. We conclude that these highly dynamic multisensory processes, based on the relative weighting of stimulus characteristics and learned associations, provide both stability and flexibility to brain functions over a wide range of temporal scales. [ABSTRACT FROM AUTHOR]
- Published
- 2016
9. The multisensory function of the human primary visual cortex.
- Author
- Murray, Micah M., Thelen, Antonia, Thut, Gregor, Romei, Vincenzo, Martuzzi, Roberto, and Matusz, Pawel J.
- Subjects
- VISUAL cortex, BRAIN imaging, PERCEPTUAL motor learning, NEOCORTEX, HEMODYNAMICS, BRAIN mapping
- Abstract
It has been nearly 10 years since Ghazanfar and Schroeder (2006) proposed that the neocortex is essentially multisensory in nature. However, it is only recently that sufficient and hard evidence that supports this proposal has accrued. We review evidence that activity within the human primary visual cortex plays an active role in multisensory processes and directly impacts behavioural outcome. This evidence emerges from a full palette of human brain imaging and brain mapping methods with which multisensory processes are quantitatively assessed by taking advantage of particular strengths of each technique as well as advances in signal analyses. Several general conclusions about multisensory processes in primary visual cortex of humans are supported relatively solidly. First, haemodynamic methods (fMRI/PET) show that there is both convergence and integration occurring within primary visual cortex. Second, primary visual cortex is involved in multisensory processes during early post-stimulus stages (as revealed by EEG/ERP/ERFs as well as TMS). Third, multisensory effects in primary visual cortex directly impact behaviour and perception, as revealed by correlational (EEG/ERPs/ERFs) as well as more causal measures (TMS/tACS). While the provocative claim of Ghazanfar and Schroeder (2006) that the whole of neocortex is multisensory in function has yet to be demonstrated, this can now be considered established in the case of the human primary visual cortex. [ABSTRACT FROM AUTHOR]
- Published
- 2016
10. The role of auditory cortices in the retrieval of single-trial auditory-visual object memories.
- Author
- Matusz, Pawel J., Thelen, Antonia, Amrein, Sarah, Geiser, Eveline, Anken, Jacques, and Murray, Micah M.
- Subjects
- AUDITORY cortex, AUDITORY perception, PERCEPTUAL motor learning, EVOKED potentials (Electrophysiology), SELF-congruence
- Abstract
Single-trial encounters with multisensory stimuli affect both memory performance and early-latency brain responses to visual stimuli. Whether and how auditory cortices support memory processes based on single-trial multisensory learning is unknown and may differ qualitatively and quantitatively from comparable processes within visual cortices due to purported differences in memory capacities across the senses. We recorded event-related potentials (ERPs) as healthy adults (n = 18) performed a continuous recognition task in the auditory modality, discriminating initial (new) from repeated (old) sounds of environmental objects. Initial presentations were either unisensory or multisensory; the latter entailed synchronous presentation of a semantically congruent or a meaningless image. Repeated presentations were exclusively auditory, thus differing only according to the context in which the sound was initially encountered. Discrimination abilities (indexed by d') were increased for repeated sounds that were initially encountered with a semantically congruent image versus sounds initially encountered with either a meaningless or no image. Analyses of ERPs within an electrical neuroimaging framework revealed that early stages of auditory processing of repeated sounds were affected by prior single-trial multisensory contexts. These effects followed from significantly reduced activity within a distributed network, including the right superior temporal cortex, suggesting an inverse relationship between brain activity and behavioural outcome on this task. The present findings demonstrate how auditory cortices contribute to long-term effects of multisensory experiences on auditory object discrimination. We propose a new framework for the efficacy of multisensory processes to impact both current multisensory stimulus processing and unisensory discrimination abilities later in time. [ABSTRACT FROM AUTHOR]
- Published
- 2015
11. Top-down control and early multisensory processes: chicken vs. egg.
- Author
- De Meo, Rosanna, Murray, Micah M., Clarke, Stephanie, and Matusz, Pawel J.
- Subjects
- PERCEPTUAL motor learning, BEHAVIORAL research, AUDITORY perception, VISUAL perception, TRANSCRANIAL magnetic stimulation
- Abstract
The article deals with the role of early-latency multisensory interactions (eMSI) as a marker of bottom-up multisensory processes that facilitate perception and behavior independently of top-down attentional processes. Topics discussed include the interaction of auditory perception with visual perception, the results of studies that compared attended and unattended multisensory stimuli, and the effect of transcranial magnetic stimulation (TMS)-driven visual cortex activity on behavior.
- Published
- 2015
12. The Efficacy of Single-Trial Multisensory Memories.
- Author
- Thelen, Antonia and Murray, Micah M.
- Subjects
- VISUAL perception, AUDITORY perception, SENSES, MEMORY research
- Abstract
This review article summarizes evidence that multisensory experiences at one point in time have long-lasting effects on subsequent unisensory visual and auditory object recognition. The efficacy of single-trial exposure to task-irrelevant multisensory events is its ability to modulate memory performance and brain activity to unisensory components of these events presented later in time. Object recognition (either visual or auditory) is enhanced if the initial multisensory experience had been semantically congruent and can be impaired if this multisensory pairing was either semantically incongruent or entailed meaningless information in the task-irrelevant modality, when compared to objects encountered exclusively in a unisensory context. Processes active during encoding cannot straightforwardly explain these effects; performance on all initial presentations was indistinguishable despite leading to opposing effects with stimulus repetitions. Brain responses to unisensory stimulus repetitions differ during early processing stages (~100 ms post-stimulus onset) according to whether or not they had been initially paired in a multisensory context. Plus, the network exhibiting differential responses varies according to whether or not memory performance is enhanced or impaired. The collective findings we review indicate that multisensory associations formed via single-trial learning exert influences on later unisensory processing to promote distinct object representations that manifest as differentiable brain networks whose activity is correlated with memory performance. These influences occur incidentally, despite many intervening stimuli, and are distinguishable from the encoding/learning processes during the formation of the multisensory associations. The consequences of multisensory interactions thus persist over time to impact memory retrieval and object discrimination. [ABSTRACT FROM AUTHOR]
- Published
- 2013
13. Auditory spatio-temporal brain dynamics and their consequences for multisensory interactions in humans
- Author
- Murray, Micah M. and Spierer, Lucas
- Subjects
- AUDITORY cortex, BRAIN function localization, PERCEPTUAL motor learning, AUDITORY perception, HUMAN information processing, SENSES
- Abstract
Abstract: Recent multisensory research has emphasized the occurrence of early, low-level interactions in humans. As such, it is proving increasingly necessary to also consider the kinds of information likely extracted from the unisensory signals that are available at the time and location of these interaction effects. This review addresses current evidence regarding how the spatio-temporal brain dynamics of auditory information processing likely curtails the information content of multisensory interactions observable in humans at a given latency and within a given brain region. First, we consider the time course of signal propagation as a limitation on when auditory information (of any kind) can impact the responsiveness of a given brain region. Next, we overview the dual pathway model for the treatment of auditory spatial and object information ranging from rudimentary to complex environmental stimuli. These dual pathways are considered an intrinsic feature of auditory information processing, which are not only partially distinct in their associated brain networks, but also (and perhaps more importantly) manifest only after several tens of milliseconds of cortical signal processing. This architecture of auditory functioning would thus pose a constraint on when and in which brain regions specific spatial and object information are available for multisensory interactions. We then separately consider evidence regarding mechanisms and dynamics of spatial and object processing with a particular emphasis on when discriminations along either dimension are likely performed by specific brain regions. We conclude by discussing open issues and directions for future research. [Copyright Elsevier]
- Published
- 2009
14. The costs of crossing paths and switching tasks between audition and vision
- Author
- Murray, Micah M., De Santis, Laura, Thut, Gregor, and Wylie, Glenn R.
- Subjects
- COGNITION, NEURAL circuitry, SENSORY evaluation, BIOLOGICAL neural networks
- Abstract
Abstract: Switching from one functional or cognitive operation to another is thought to rely on executive/control processes. The efficacy of these processes may depend on the extent of overlap between neural circuitry mediating the different tasks; more effective task preparation (and by extension smaller switch costs) is achieved when this overlap is small. We investigated the performance costs associated with switching tasks and/or switching sensory modalities. Participants discriminated either the identity or spatial location of objects that were presented either visually or acoustically. Switch costs between tasks were significantly smaller when the sensory modality of the task switched versus when it repeated. This was the case irrespective of whether the pre-trial cue informed participants only of the upcoming task, but not sensory modality (Experiment 1) or whether the pre-trial cue was informative about both the upcoming task and sensory modality (Experiment 2). In addition, in both experiments switch costs between the senses were positively correlated when the sensory modality of the task repeated across trials and not when it switched. The collective evidence supports the independence of control processes mediating task switching and modality switching and also the hypothesis that switch costs reflect competitive interference between neural circuits. [Copyright Elsevier]
- Published
- 2009
15. Plasticity in representations of environmental sounds revealed by electrical neuroimaging
- Author
- Murray, Micah M., Camen, Christian, Spierer, Lucas, and Clarke, Stephanie
- Subjects
- EVOKED potentials (Electrophysiology), REACTION time, ELECTROENCEPHALOGRAPHY, ELECTROPHYSIOLOGY
- Abstract
Abstract: The rapid and precise processing of environmental sounds contributes to communication functions as well as both object recognition and localization. Plasticity in (accessing) the neural representations of environmental sounds is likewise essential for an adaptive organism, in particular humans, and can be indexed by repetition priming. How the brain achieves such plasticity with representations of environmental sounds is presently unresolved. Electrical neuroimaging of 64-channel auditory evoked potentials (AEPs) in humans identified the spatio-temporal brain mechanisms of repetition priming involving sounds of environmental objects. Subjects performed an ‘oddball’ target detection task, based on the semantic category of stimuli (living vs. man-made objects). Repetition priming effects were observed behaviorally as a speeding of reaction times and electrophysiologically as a suppression of the strength of responses to repeated sound presentations over the 156–215 ms post-stimulus period. These effects of plasticity were furthermore localized, using statistical analyses of a distributed linear inverse solution, to the left middle temporal gyrus and superior temporal sulcus (BA22), which have been implicated in associating sounds with their abstract representations and actions. These effects are subsequent to and occur in different brain regions from what has been previously identified as the earliest discrimination of auditory object categories. Plasticity in associative-semantic, rather than perceptual-discriminative functions, may underlie repetition priming of sounds of objects. We present a multi-stage mechanism of auditory object processing akin to what has been described for visual object processing and which also provides a framework for accessing multisensory object representations. [Copyright Elsevier]
- Published
- 2008
16. Occipital Transcranial Magnetic Stimulation Has Opposing Effects on Visual and Auditory Stimulus Detection: Implications for Multisensory Interactions.
- Author
- Romei, Vincenzo, Murray, Micah M., Merabet, Lotfi B., and Thut, Gregor
- Subjects
- TRANSCRANIAL magnetic stimulation, VISUAL cortex, VISUAL evoked response, AUDITORY evoked response, BRAIN function localization
- Abstract
Multisensory interactions occur early in time and in low-level cortical areas, including primary cortices. To test current models of early auditory-visual (AV) convergence in unisensory visual brain areas, we studied the effect of transcranial magnetic stimulation (TMS) of visual cortex on behavioral responses to unisensory (auditory or visual) or multisensory (simultaneous auditory-visual) stimulus presentation. Single-pulse TMS was applied over the occipital pole at short delays (30-150 ms) after external stimulus onset. Relative to TMS over a control site, reaction times (RTs) to unisensory visual stimuli were prolonged by TMS at 60-75 ms poststimulus onset (visual suppression effect), confirming stimulation of functional visual cortex. Conversely, RTs to unisensory auditory stimuli were significantly shortened when visual cortex was stimulated by TMS at the same delays (beneficial interaction effect of auditory stimulation and occipital TMS). No TMS effect on RTs was observed for AV stimulation. The beneficial interaction effect of combined unisensory auditory and TMS-induced visual cortex stimulation matched and was correlated with the RT-facilitation after external multisensory AV stimulation without TMS, suggestive of multisensory interactions between the stimulus-evoked auditory and TMS-induced visual cortex activities. A follow-up experiment showed that auditory input enhances excitability within visual cortex itself (using phosphene-induction via TMS as a measure) over a similarly early time-window (75-120 ms). The collective data support a mechanism of early auditory-visual interactions that is mediated by auditory-driven sensitivity changes in visual neurons that coincide in time with the initial volleys of visual input. [ABSTRACT FROM AUTHOR]
- Published
- 2007
17. Auditory–visual multisensory interactions attenuate subsequent visual responses in humans
- Author
- Meylan, Raphaël V. and Murray, Micah M.
- Subjects
- VISUAL cortex, BRAIN imaging, SENSORIMOTOR integration, SENSE organs
- Abstract
Abstract: Effects of multisensory interactions on how subsequent sensory inputs are processed remain poorly understood. We investigated whether multisensory interactions between rudimentary visual and auditory stimuli (flashes and beeps) affect later visual processing. A 2×3 design varied the number of flashes (1 or 2) with the number of beeps (0, 1, or 2) presented on each trial, such that ‘2F1B’ refers to the presentation of 2 flashes with 1 beep. Beeps, when present, were synchronous with the first flash, and pairs of stimuli within a trial were separated by 52 ms ISI. Subjects indicated the number of flashes presented. Electrical neuroimaging of 128-channel event-related potentials assessed both the electric field strength and topography. Isolation of responses to a visual stimulus that was preceded by a multisensory event was achieved by calculating the difference between the 2F1B and 1F1B conditions, and responses to a visual stimulus preceded by a unisensory event were isolated by calculating the difference between the 2F0B and 1F0B conditions (MUL and VIS, respectively). Comparison of MUL and VIS revealed that the treatment of visual information was significantly attenuated ∼160 ms after the onset of the second flash when it was preceded by a multisensory event. Source estimations further indicated that this attenuation occurred within low-level visual cortices. Multisensory interactions are ongoing in low-level visual cortices and affect incoming sensory processing. These data provide evidence that multisensory interactions are not restricted in time and can dramatically influence the treatment of subsequent stimuli, opening new lines of multisensory research. [Copyright Elsevier]
- Published
- 2007
18. How single-trial electrical neuroimaging contributes to multisensory research.
- Author
- Gonzalez Andino, Sara L., Murray, Micah M., Foxe, John J., and Grave de Peralta Menendez, Rolando
- Subjects
- BRAIN, MEDICAL imaging systems, SENSES, NEUROPHYSIOLOGY, ELECTROENCEPHALOGRAPHY
- Abstract
This study details a method to statistically determine, on a millisecond scale and for individual subjects, those brain areas whose activity differs between experimental conditions, using single-trial scalp-recorded EEG data. To do this, we non-invasively estimated local field potentials (LFPs) using the ELECTRA distributed inverse solution and applied non-parametric statistical tests at each brain voxel and for each time point. This yields a spatio-temporal activation pattern of differential brain responses. The method is illustrated here in the analysis of auditory-somatosensory (AS) multisensory interactions in four subjects. Differential multisensory responses were temporally and spatially consistent across individuals, with onset at ~50 ms and superposition within areas of the posterior superior temporal cortex that have traditionally been considered auditory in their function. The close agreement of these results with previous investigations of AS multisensory interactions suggests that the present approach constitutes a reliable method for studying multisensory processing with the temporal and spatial resolution required to elucidate several existing questions in this field. In particular, the present analyses permit a more direct comparison between human and animal studies of multisensory interactions and can be extended to examine correlation between electrophysiological phenomena and behavior. [ABSTRACT FROM AUTHOR]
- Published
- 2005
19. The brain uses single-trial multisensory memories to discriminate without awareness
- Author
- Murray, Micah M., Foxe, John J., and Wylie, Glenn R.
- Subjects
- MEMORY, BRAIN, MAGNETIC resonance imaging, AUDITORY perception
- Abstract
Abstract: Multisensory experiences enhance perceptions and facilitate memory retrieval processes, even when only unisensory information is available for accessing such memories. Using fMRI, we identified human brain regions involved in discriminating visual stimuli according to past multisensory vs. unisensory experiences. Subjects performed a completely orthogonal task, discriminating repeated from initial image presentations intermixed within a continuous recognition task. Half of initial presentations were multisensory, and all repetitions were exclusively visual. Despite only single-trial exposures to initial image presentations, accuracy in indicating image repetitions was significantly improved by past auditory–visual multisensory experiences over images only encountered visually. Similarly, regions within the lateral–occipital complex, areas typically associated with visual object recognition processes, were more active to visual stimuli with multisensory than unisensory pasts. Additional differential responses were observed in the anterior cingulate and frontal cortices. Multisensory experiences are registered by the brain even when of no immediate behavioral relevance and can be used to categorize memories. These data reveal the functional efficacy of multisensory processing. [Copyright Elsevier]
- Published
- 2005
20. The role of multisensory memories in unisensory object discrimination
- Author
- Lehmann, Sandra and Murray, Micah M.
- Subjects
- MEMORY, EVOKED potentials (Electrophysiology), BRAIN research, VISUAL perception
- Abstract
Abstract: Past multisensory experiences can influence current unisensory processing and memory performance. Repeated images are better discriminated if initially presented as auditory–visual pairs, rather than only visually. An experience's context thus plays a role in how well repetitions of certain aspects are later recognized. Here, we investigated factors during the initial multisensory experience that are essential for generating improved memory performance. Subjects discriminated repeated versus initial image presentations intermixed within a continuous recognition task. Half of initial presentations were multisensory, and all repetitions were only visual. Experiment 1 examined whether purely episodic multisensory information suffices for enhancing later discrimination performance by pairing visual objects with either tones or vibrations. We could therefore also assess whether effects can be elicited with different sensory pairings. Experiment 2 examined semantic context by manipulating the congruence between auditory and visual object stimuli within blocks of trials. Relative to images only encountered visually, accuracy in discriminating image repetitions was significantly impaired by auditory–visual, yet unaffected by somatosensory–visual multisensory memory traces. By contrast, this accuracy was selectively enhanced for visual stimuli with semantically congruent multisensory pasts and unchanged for those with semantically incongruent multisensory pasts. The collective results reveal opposing effects of purely episodic versus semantic information from auditory–visual multisensory events. Nonetheless, both types of multisensory memory traces are accessible for processing incoming stimuli and indeed result in distinct visual object processing, leading to either impaired or enhanced performance relative to unisensory memory traces. We discuss these results as supporting a model of object-based multisensory interactions. [Copyright Elsevier]
- Published
- 2005
21. Rapid discrimination of visual and multisensory memories revealed by electrical neuroimaging
- Author
- Murray, Micah M., Michel, Christoph M., Grave de Peralta, Rolando, Ortigue, Stephanie, Brunet, Denis, Gonzalez Andino, Sara, and Schnider, Armin
- Subjects
- MULTISENSOR data fusion, NEUROLOGY, DIAGNOSTIC imaging, VISUAL perception
- Abstract
Though commonly held that multisensory experiences enrich our memories and that memories influence ongoing sensory processes, their neural mechanisms remain unresolved. Here, electrical neuroimaging shows that auditory–visual multisensory experiences alter subsequent processing of unisensory visual stimuli during the same block of trials at early stages poststimulus onset and within visual object recognition areas. We show this with a stepwise analysis of scalp-recorded event-related potentials (ERPs) that statistically tested (1) ERP morphology and amplitude, (2) global electric field power, (3) topographic stability of and changes in the electric field configuration, and (4) intracranial distributed linear source estimations. Subjects performed a continuous recognition task, discriminating repeated vs. initial image presentations. Corresponding, but task-irrelevant, sounds accompanied half of the initial presentations during a given block of trials. On repeated presentations within a block of trials, only images appeared, yielding two situations: the image's prior presentation was only visual or with a sound. Image repetitions that had been accompanied by sounds yielded improved memory performance accuracy (old or new discrimination) and were differentiated as early as ∼60–136 ms from images that had not been accompanied by sounds through generator changes in areas of the right lateral–occipital complex (LOC). It thus appears that unisensory percepts trigger multisensory representations associated with them. The collective data support the hypothesis that perceptual or memory traces for multisensory auditory–visual events involve a distinct cortical network that is rapidly activated by subsequent repetition of just the unisensory visual component. [Copyright Elsevier]
- Published
- 2004
- Full Text
- View/download PDF
22. A multisensory perspective on object memory.
- Author
-
Matusz, Pawel J., Wallace, Mark T., and Murray, Micah M.
- Subjects
- *MOTOR ability, *MOTOR learning, *STIMULUS & response (Psychology), *NEUROPSYCHOLOGY, *BRAIN research - Abstract
Traditional studies of memory and object recognition involved objects presented within a single sensory modality (i.e., purely visual or purely auditory objects). However, in naturalistic settings, objects are often evaluated and processed in a multisensory manner. This raises the question of how object representations that combine information from the different senses are created and utilised by memory functions. Here we review research that has demonstrated that a single multisensory exposure can influence memory for both visual and auditory objects. In an old/new object discrimination task, objects that were presented initially with a task-irrelevant stimulus in another sense were better remembered compared to stimuli presented alone, most notably when the two stimuli were semantically congruent. The brain discriminates between these two types of object representations within the first 100 ms post-stimulus onset, indicating early “tagging” of objects/events by the brain based on the nature of their initial presentation context. Interestingly, the specific brain networks supporting the improved object recognition vary based on a variety of factors, including the effectiveness of the initial multisensory presentation and the sense that is task-relevant. We specify the requisite conditions for multisensory contexts to improve object discrimination following single exposures, and the individual differences that exist with respect to these improvements. Our results shed light on how memory operates on the multisensory nature of object representations as well as how the brain stores and retrieves memories of objects. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
23. Determinants of the efficacy of single-trial multisensory learning.
- Author
-
Thelen, Antonia and Murray, Micah M.
- Subjects
- *OBJECT recognition (Computer vision), *AUDITORY perception, *MEMORY - Abstract
Single-trial multisensory learning has been reliably shown to impact the later ability to discriminate images. The present study had the following three aims: (1) to determine if single-trial multisensory learning would elicit corresponding effects on auditory discrimination, (2) to determine if there were links between the impact of multisensory learning on auditory discrimination and its impact on visual discrimination within individual participants, and (3) to determine the bases of inter-individual differences in the efficacy of single-trial multisensory learning. On two sessions separated by one week, participants discriminated initial from repeated presentations of either images or sounds during a continuous recognition task. Half of the initial presentations were auditory–visual multisensory pairings (semantically congruent, semantically incongruent, or meaningless). The remaining half of initial presentations was unisensory. Half of the repeated stimuli were presented in an identical manner to their initial encounter, and the remaining half were presented in the complementary manner (i.e., those initially presented in a unisensory manner were now presented as multisensory pairs and vice versa). The results show that the efficacy of single-trial multisensory learning across the senses varies according to an individual's propensity to exhibit repetition priming with sounds (i.e., faster RTs and higher accuracy for repeated vs. initial unisensory sound presentations). Individuals exhibiting such priming also showed facilitative effects of single-trial multisensory learning on both the auditory and visual discrimination tasks. Those who did not exhibit such priming did not benefit from single-trial multisensory learning. Single-trial multisensory learning is therefore an effective tool across the senses. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
24. Electrical neuroimaging of memory discrimination based on single-trial multisensory learning.
- Author
-
Thelen, Antonia, Cappe, Céline, and Murray, Micah M.
- Subjects
EVOKED potentials (Electrophysiology) ,VISUAL evoked potentials ,PSYCHOPHYSICS - Abstract
Multisensory experiences influence subsequent memory performance and brain responses. Studies have thus far concentrated on semantically congruent pairings, leaving unresolved the influence of stimulus pairing and memory sub-types. Here, we paired images with unique, meaningless sounds during a continuous recognition task to determine if purely episodic, single-trial multisensory experiences can incidentally impact subsequent visual object discrimination. Psychophysics and electrical neuroimaging analyses of visual evoked potentials (VEPs) compared responses to repeated images either paired or not with a meaningless sound during initial encounters. Recognition accuracy was significantly impaired for images initially presented as multisensory pairs and could not be explained in terms of differential attention or transfer of effects from encoding to retrieval. VEP modulations occurred at 100–130 and 270–310 ms and stemmed from topographic differences indicative of network configuration changes within the brain. Distributed source estimations localized the earlier effect to regions of the right posterior superior temporal gyrus (STG) and the later effect to regions of the middle temporal gyrus (MTG). Responses in these regions were stronger for images previously encountered as multisensory pairs. Only the later effect correlated with performance such that greater MTG activity in response to repeated visual stimuli was linked with greater performance decrements. The present findings suggest that brain networks involved in this discrimination may critically depend on whether multisensory events facilitate or impair later visual memory performance. More generally, the data support models whereby effects of multisensory interactions persist to incidentally affect subsequent behavior as well as visual processing during its initial stages. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
25. The context-contingent nature of cross-modal activations of the visual cortex.
- Author
-
Matusz, Pawel J., Retsa, Chrysa, and Murray, Micah M.
- Subjects
- *VISUAL cortex, *OCCIPITAL lobe, *BRAIN imaging, *NEUROSCIENCES, *BRAIN research - Abstract
Real-world environments are nearly always multisensory in nature. Processing in such situations confers perceptual advantages, but its automaticity remains poorly understood. Automaticity has been invoked to explain the activation of visual cortices by laterally-presented sounds. This has been observed even when the sounds were task-irrelevant and spatially uninformative about subsequent targets. An auditory-evoked contralateral occipital positivity (ACOP) at ~ 250 ms post-sound onset has been postulated as the event-related potential (ERP) correlate of this cross-modal effect. However, the spatial dimension of the stimuli was nevertheless relevant in virtually all prior studies where the ACOP was observed. By manipulating the implicit predictability of the location of lateralised sounds in a passive auditory paradigm, we tested the automaticity of cross-modal activations of visual cortices. 128-channel ERP data from healthy participants were analysed within an electrical neuroimaging framework. The timing, topography, and localisation resembled previous characterisations of the ACOP. However, the cross-modal activations of visual cortices by sounds were critically dependent on whether the sound location was (un)predictable. Our results are the first direct evidence that this particular cross-modal process is not (fully) automatic; instead, it is context-contingent. More generally, the present findings provide novel insights into the importance of context-related factors in controlling information processing across the senses, and call for a revision of current models of automaticity in cognitive sciences. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
26. Single-trial multisensory memories affect later auditory and visual object discrimination.
- Author
-
Thelen, Antonia, Talsma, Durk, and Murray, Micah M.
- Subjects
- *PERCEPTUAL motor learning, *MEMORY, *AUDITORY perception, *VISUAL perception, *GENERALIZATION, *PROBABILITY theory - Abstract
Multisensory memory traces established via single-trial exposures can impact subsequent visual object recognition. This impact appears to depend on the meaningfulness of the initial multisensory pairing, implying that multisensory exposures establish distinct object representations that are accessible during later unisensory processing. Multisensory contexts may be particularly effective in influencing auditory discrimination, given the purportedly inferior recognition memory in this sensory modality. The possibility of this generalization and the equivalence of effects when memory discrimination was performed in the visual vs. auditory modality were the focus of this study. First, we demonstrate that visual object discrimination is affected by the context of prior multisensory encounters, replicating and extending previous findings by controlling for the probability of multisensory contexts during initial as well as repeated object presentations. Second, we provide the first evidence that single-trial multisensory memories impact subsequent auditory object discrimination. Auditory object discrimination was enhanced when initial presentations entailed semantically congruent multisensory pairs and was impaired after semantically incongruent multisensory encounters, compared to sounds that had been encountered only in a unisensory manner. Third, the impact of single-trial multisensory memories upon unisensory object discrimination was greater when the task was performed in the auditory vs. visual modality. Fourth, there was no evidence for correlation between effects of past multisensory experiences on visual and auditory processing, suggestive of largely independent object processing mechanisms between modalities. We discuss these findings in terms of the conceptual short term memory (CSTM) model and predictive coding. Our results suggest differential recruitment and modulation of conceptual memory networks according to the sensory task at hand.
[ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
27. Auditory–somatosensory multisensory interactions in humans: Dissociating detection and spatial discrimination
- Author
-
Sperdin, Holger F., Cappe, Céline, and Murray, Micah M.
- Subjects
- *SOMATOSENSORY evoked potentials, *PERCEPTUAL motor learning, *DISCRIMINATION learning, *REACTION time, *SPATIAL behavior, *SELECTIVITY (Psychology), *ATTENTION - Abstract
Abstract: Simple reaction times (RTs) to auditory–somatosensory (AS) multisensory stimuli are facilitated over their unisensory counterparts both when stimuli are delivered to the same location and when separated. In two experiments we addressed the possibility that top-down and/or task-related influences can dynamically impact the spatial representations mediating these effects and the extent to which multisensory facilitation will be observed. Participants performed a simple detection task in response to auditory, somatosensory, or simultaneous AS stimuli that in turn were either spatially aligned or misaligned by lateralizing the stimuli. Additionally, we also informed the participants that they would be retrogradely queried (one-third of trials) regarding the side where a given stimulus in a given sensory modality was presented. In this way, we sought to have participants attending to all possible spatial locations and sensory modalities, while nonetheless having them perform a simple detection task. Experiment 1 provided no cues prior to stimulus delivery. Experiment 2 included spatially uninformative cues (50% of trials). In both experiments, multisensory conditions significantly facilitated detection RTs with no evidence for differences according to spatial alignment (though general benefits of cuing were observed in Experiment 2). Facilitated detection occurs even when attending to spatial information. Performance with probes, quantified using sensitivity (d′), was impaired following multisensory trials in general and significantly more so following misaligned multisensory trials. This indicates that spatial information is not available, despite being task-relevant. The collective results support a model wherein early AS interactions may result in a loss of spatial acuity for unisensory information. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
28. Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study
- Author
-
Molholm, Sophie, Ritter, Walter, Murray, Micah M., Javitt, Daniel C., Schroeder, Charles E., and Foxe, John J.
- Subjects
- *SENSORIMOTOR integration, *ELECTROPHYSIOLOGY - Abstract
Integration of information from multiple senses is fundamental to perception and cognition, but when and where this is accomplished in the brain is not well understood. This study examined the timing and topography of cortical auditory–visual interactions using high-density event-related potentials (ERPs) during a simple reaction-time (RT) task. Visual and auditory stimuli were presented alone and simultaneously. ERPs elicited by the auditory and visual stimuli when presented alone were summed (‘sum’ ERP) and compared to the ERP elicited when they were presented simultaneously (‘simultaneous’ ERP). Divergence between the ‘simultaneous’ and ‘sum’ ERP indicated auditory–visual (AV) neural response interactions. There was a surprisingly early right parieto-occipital AV interaction, consistent with the finding of an earlier study [J. Cogn. Neurosci. 11 (1999) 473]. The timing of onset of this effect (46 ms) was essentially simultaneous with the onset of visual cortical processing, as indexed by the onset of the visual C1 component, which is thought to represent the earliest cortical visual evoked potential. The coincident timing of the early AV interaction and C1 strongly suggests that AV interactions can affect early visual sensory processing. Additional AV interactions were found within the time course of sensory processing (up to 200 ms post stimulus onset). In total, this system of AV effects over the scalp was suggestive of both activity unique to multisensory processing, and the modulation of ‘unisensory’ activity. RTs to the stimuli when presented simultaneously were significantly faster than when they were presented alone. This RT facilitation could not be accounted for by probability summation, as evidenced by violation of the ‘race’ model, providing compelling evidence that auditory–visual neural interactions give rise to this RT effect. [Copyright © Elsevier]
- Published
- 2002
- Full Text
- View/download PDF
29. Selective integration of auditory-visual looming cues by humans
- Author
-
Cappe, Céline, Thut, Gregor, Romei, Vincenzo, and Murray, Micah M.
- Subjects
- *AUDITORY perception, *VISUAL perception, *MOTION perception (Vision), *PSYCHOLOGY of movement, *PSYCHOPHYSICS, *NEUROPHYSIOLOGY - Abstract
Abstract: An object's motion relative to an observer can confer ethologically meaningful information. Approaching or looming stimuli can signal threats/collisions to be avoided or prey to be confronted, whereas receding stimuli can signal successful escape or failed pursuit. Using movement detection and subjective ratings, we investigated the multisensory integration of looming and receding auditory and visual information by humans. While prior research has demonstrated a perceptual bias for unisensory and more recently multisensory looming stimuli, none has investigated whether there is integration of looming signals between modalities. Our findings reveal selective integration of multisensory looming stimuli. Performance was significantly enhanced for looming stimuli over all other multisensory conditions. Contrasts with static multisensory conditions indicate that only multisensory looming stimuli resulted in facilitation beyond that induced by the sheer presence of auditory-visual stimuli. Controlling for variation in physical energy replicated the advantage for multisensory looming stimuli. Finally, only looming stimuli exhibited a negative linear relationship between enhancement indices for detection speed and for subjective ratings. Maximal detection speed was attained when motion perception was already robust under unisensory conditions. The preferential integration of multisensory looming stimuli highlights that complex ethologically salient stimuli likely require synergistic cooperation between existing principles of multisensory integration. A new conceptualization of the neurophysiologic mechanisms mediating real-world multisensory perceptions and action is therefore supported. [Copyright © Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
30. Auditory–somatosensory multisensory interactions are spatially modulated by stimulated body surface and acoustic spectra
- Author
-
Tajadura-Jiménez, Ana, Kitagawa, Norimichi, Väljamäe, Aleksander, Zampini, Massimiliano, Murray, Micah M., and Spence, Charles
- Subjects
- *AUDITORY perception, *PERSONAL space, *BRAIN function localization, *BRAIN stimulation, *NEUROPSYCHOLOGY research, *BRAIN mapping - Abstract
Abstract: Previous research has provided inconsistent results regarding the spatial modulation of auditory–somatosensory interactions. The present study reports three experiments designed to investigate the nature of these interactions in the space close to the head. Human participants made speeded detection responses to unimodal auditory, somatosensory, or simultaneous auditory–somatosensory stimuli. In Experiment 1, electrocutaneous stimuli were presented to either earlobe, while auditory stimuli were presented from the same versus opposite sides, and from one of two distances (20 vs. 70 cm) from the participant's head. The results demonstrated a spatial modulation of auditory–somatosensory interactions when auditory stimuli were presented from close to the head. In Experiment 2, electrocutaneous stimuli were delivered to the hands, which were placed either close to or far from the head, while the auditory stimuli were again presented at one of two distances. The results revealed that the spatial modulation observed in Experiment 1 was specific to the particular body part stimulated (head) rather than to the region of space (i.e. around the head) where the stimuli were presented. The results of Experiment 3 demonstrate that sounds that contain high-frequency components are particularly effective in eliciting this auditory–somatosensory spatial effect. Taken together, these findings help to resolve inconsistencies in the previous literature and suggest that auditory–somatosensory multisensory integration is modulated by the stimulated body surface and acoustic spectra of the stimuli presented. [Copyright © Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
31. Towards understanding how we pay attention in naturalistic visual search settings.
- Author
-
Turoman, Nora, Tivadar, Ruxandra I., Retsa, Chrysa, Murray, Micah M., and Matusz, Pawel J.
- Subjects
- *VISUAL perception, *ATTENTION control, *TIME, *LARGE-scale brain networks - Abstract
Research on attentional control has largely focused on single senses and the importance of behavioural goals in controlling attention. However, everyday situations are multisensory and contain regularities, both likely influencing attention. We investigated how visual attentional capture is simultaneously impacted by top-down goals, the multisensory nature of stimuli, and the contextual factors of stimuli's semantic relationship and temporal predictability. Participants performed a multisensory version of the Folk et al. (1992) spatial cueing paradigm, searching for a target of a predefined colour (e.g. a red bar) within an array preceded by a distractor. We manipulated: 1) stimuli's goal-relevance via distractor's colour (matching vs. mismatching the target), 2) stimuli's multisensory nature (colour distractors appearing alone vs. with tones), 3) the relationship between the distractor sound and colour (arbitrary vs. semantically congruent) and 4) the temporal predictability of distractor onset. Reaction-time spatial cueing served as a behavioural measure of attentional selection. We also recorded 129-channel event-related potentials (ERPs), analysing the distractor-elicited N2pc component both canonically and using a multivariate electrical neuroimaging framework. Behaviourally, arbitrary target-matching distractors captured attention more strongly than semantically congruent ones, with no evidence for context modulating multisensory enhancements of capture. Notably, electrical neuroimaging analyses of the surface-level EEG revealed context-based influences on attention to both visual and multisensory distractors, both in how strongly they activated the brain and in the type of brain networks activated. For both processes, the context-driven brain response modulations occurred long before the N2pc time-window, with topographic (network-based) modulations at ∼30 ms, followed by strength-based modulations at ∼100 ms post-distractor onset.
Our results reveal that both stimulus meaning and predictability modulate attentional selection, and they interact while doing so. Meaning, in addition to temporal predictability, is thus a second source of contextual information facilitating goal-directed behaviour. More broadly, in everyday situations, attention is controlled by an interplay between one's goals, stimuli's perceptual salience, meaning and predictability. Our study calls for a revision of attentional control theories to account for the role of contextual and multisensory control. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
32. Auditory–somatosensory multisensory interactions in front and rear space
- Author
-
Zampini, Massimiliano, Torresan, Diego, Spence, Charles, and Murray, Micah M.
- Subjects
- *PERCEPTUAL motor learning, *SOMATOSENSORY evoked potentials, *SPATIAL behavior, *REACTION time - Abstract
Abstract: The information conveyed by our senses can be combined to facilitate perception and behaviour. One focus of recent research has been on the factors governing such facilitatory multisensory interactions. The spatial register of neuronal receptive fields (RFs) appears to be a prerequisite for multisensory enhancement. In terms of auditory–somatosensory (AS) interactions, facilitatory effects on simple reaction times and on brain responses have been demonstrated in caudo-medial auditory cortices, both when auditory and somatosensory stimuli are presented to the same spatial location and also when they are separated by 100° in frontal space. One implication is that these brain regions contain large spatial RFs. The present study further investigated this possibility and, in particular, the question of whether AS interactions are restricted to frontal space, since recent research has revealed some fundamental differences between the sensory processing of stimuli in front and rear space. Twelve participants performed a simple reaction time task to auditory, somatosensory, or simultaneous auditory–somatosensory stimuli. The participants placed one of their arms in front of them and the other behind their backs. Loudspeakers were placed close to each hand. Thus, there were a total of eight stimulus conditions – four unisensory and four multisensory – including all possible combinations of posture and loudspeaker location. A significant facilitation of reaction times (RTs), exceeding that predicted by probability summation, was obtained following multisensory stimulation, irrespective of whether the stimuli were in spatial register or not. These results are interpreted in terms of the likely RF organization of previously identified auditory–somatosensory brain regions. [Copyright © Elsevier]
- Published
- 2007
- Full Text
- View/download PDF
33. The COGs (context, object, and goals) in multisensory processing
- Author
-
ten Oever, Sanne, Romei, Vincenzo, van Atteveldt, Nienke, Soto-Faraco, Salvador, Murray, Micah M., and Matusz, Pawel J.
- Subjects
Multisensory, Attention, Perception, Brain, Brain mapping, Audio-visual, Context, Object, Goals, Top-down, Bottom-up, Humans, Cognitive psychology - Abstract
Our understanding of how perception operates in real-world environments has been substantially advanced by studying both multisensory processes and “top-down” control processes influencing sensory processing via activity from higher-order brain areas, such as attention, memory, and expectations. As the two topics have been traditionally studied separately, the mechanisms orchestrating real-world multisensory processing remain unclear. Past work has revealed that the observer’s goals gate the influence of many multisensory processes on brain and behavioural responses, whereas some other multisensory processes might occur independently of these goals. Consequently, other forms of top-down control beyond goal dependence are necessary to explain the full range of multisensory effects currently reported at the brain and the cognitive level. These forms of control include sensitivity to stimulus context as well as the detection of matches (or lack thereof) between a multisensory stimulus and categorical attributes of naturalistic objects (e.g. tools, animals). In this review we discuss and integrate the existing findings that demonstrate the importance of such goal-, object- and context-based top-down control over multisensory processing. We then put forward a few principles emerging from this literature review with respect to the mechanisms underlying multisensory processing and discuss their possible broader implications. This research was supported by grants from the Ministerio de Economia y Competitividad (PSI2013-42626-P), AGAUR Generalitat de Catalunya (2014SGR856), and the European Research Council (StG-2010 263145) to S.S-F, and the Swiss National Science Foundation (Grant #320030-149982 as well as the National Centre of Competence in Research project “SYNAPSY, The Synaptic Bases of Mental Disease” [Project 51AU40-125759]) and the Swiss Brain League (2014 Research Prize) to MMM. 
StO receives support from the Dutch Organisation for Scientific Research (Grant 406-11-068).
- Published
- 2016
34. The multisensory function of the human primary visual cortex
- Author
-
Murray, Micah M., Thelen, Antonia, Thut, Gregor, Romei, Vincenzo, Martuzzi, Roberto, and Matusz, Pawel J.
- Subjects
Vision, Visual Cortex, Neocortex, Multisensory, Cross-modal, Perception, Neuroimaging, Electroencephalography, Brain mapping, Afferent Pathways, Physical Stimulation, Humans, Neuroscience, Cognitive psychology - Abstract
It has been nearly 10 years since Ghazanfar and Schroeder (2006) proposed that the neocortex is essentially multisensory in nature. However, it is only recently that sufficient and hard evidence that supports this proposal has accrued. We review evidence that activity within the human primary visual cortex plays an active role in multisensory processes and directly impacts behavioural outcome. This evidence emerges from a full palette of human brain imaging and brain mapping methods with which multisensory processes are quantitatively assessed by taking advantage of particular strengths of each technique as well as advances in signal analyses. Several general conclusions about multisensory processes in primary visual cortex of humans are supported relatively solidly. First, haemodynamic methods (fMRI/PET) show that there is both convergence and integration occurring within primary visual cortex. Second, primary visual cortex is involved in multisensory processes during early post-stimulus stages (as revealed by EEG/ERPs/ERFs as well as TMS). Third, multisensory effects in primary visual cortex directly impact behaviour and perception, as revealed by correlational (EEG/ERPs/ERFs) as well as more causal measures (TMS/tACS). While the provocative claim of Ghazanfar and Schroeder (2006) that the whole of neocortex is multisensory in function has yet to be demonstrated, this can now be considered established in the case of the human primary visual cortex. © 2015 Elsevier Ltd. All rights reserved.
- Published
- 2016
- Full Text
- View/download PDF
35. Selective attention to sound features mediates cross-modal activation of visual cortices.
- Author
-
Retsa, Chrysa, Matusz, Pawel J., Schnupp, Jan W.H., and Murray, Micah M.
- Subjects
- *VISUAL cortex, *SELECTIVITY (Psychology), *AUDITORY evoked response - Abstract
Contemporary schemas of brain organization now include multisensory processes both in low-level cortices as well as at early stages of stimulus processing. Evidence has also accumulated showing that unisensory stimulus processing can result in cross-modal effects. For example, task-irrelevant and lateralised sounds can activate visual cortices; a phenomenon referred to as the auditory-evoked contralateral occipital positivity (ACOP). Some claim this is an example of automatic attentional capture in visual cortices. Other results, however, indicate that context may play a determinant role. Here, we investigated whether selective attention to spatial features of sounds is a determining factor in eliciting the ACOP. We recorded high-density auditory evoked potentials (AEPs) while participants selectively attended and discriminated sounds according to four possible stimulus attributes: location, pitch, speaker identity or syllable. Sound acoustics were held constant, and their location was always equiprobable (50% left, 50% right). The only manipulation was to which sound dimension participants attended. We analysed the AEP data from healthy participants within an electrical neuroimaging framework. The presence of sound-elicited activations of visual cortices depended on the to-be-discriminated, goal-based dimension. The ACOP was elicited only when participants were required to discriminate sound location, but not when they attended to any of the non-spatial features. These results provide a further indication that the ACOP is not automatic. Moreover, our findings showcase the interplay between task-relevance and spatial (un)predictability in determining the presence of the cross-modal activation of visual cortices. • Crossmodal facilitation of information processing is well documented. • Lateralised sounds activate contralateral visual cortices – the ACOP. • Selective attention to spatial features of sounds is a determinant of the ACOP. 
• ACOP followed from ipsilateral suppression. • The ACOP depends on both stimulus regularities and task-relevance. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
36. Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds
- Author
-
Cappe, Céline, Thut, Gregor, Murray, Micah M., and Romei, Vincenzo
- Subjects
Visual perception ,Sensory system ,Stimulus (physiology) ,Looming ,Psychophysics ,Humans ,Visual Cortex ,Multisensory ,Transcranial Magnetic Stimulation ,Sound ,Phosphene ,Visual cortex ,Acoustic Stimulation ,Auditory Perception ,Neuroscience ,Human - Abstract
Evidence of multisensory interactions within low-level cortices and at early post-stimulus latencies [1-6] has prompted a paradigm shift in conceptualizations of sensory organization [7-10]. However, the mechanisms of these interactions and their link to behavior remain largely unknown. One behaviorally salient stimulus is a rapidly approaching (looming) object, which can indicate a potential threat [11-13]. Based on findings from humans [14] and nonhuman primates [15, 16] suggesting selective multisensory (auditory-visual) integration of looming signals, we tested whether looming sounds would selectively modulate the excitability of visual cortex. We combined transcranial magnetic stimulation (TMS) over the occipital pole with psychophysics for "neurometric" and psychometric assays of changes in low-level visual cortex excitability (i.e., phosphene induction) and perception, respectively [17, 18]. Across three experiments we show that structured looming sounds considerably enhance visual cortex excitability relative to other sound categories and white-noise controls. The time course of this effect showed that modulation of visual cortex excitability started to differ between looming and stationary sounds for sound portions of very short duration (80 ms) that were significantly below (by 35 ms) the perceptual discrimination threshold. Visual perceptions are thus rapidly and efficiently boosted by sounds through early, preperceptual and stimulus-selective modulation of neuronal excitability within low-level visual cortex. © 2009 Elsevier Ltd. All rights reserved.
- Published
- 2009
- Full Text
- View/download PDF
37. Selective integration of auditory-visual looming cues by humans
- Author
-
Murray, Micah M., Cappe, Céline, Romei, Vincenzo, and Thut, Gregor
- Subjects
Adult ,Male ,Adolescent ,Movement ,Cognitive Neuroscience ,Motion Perception ,Experimental and Cognitive Psychology ,Cue ,Young Adult ,Behavioral Neuroscience ,Looming ,Perception ,Psychophysics ,Reaction Time ,Humans ,Motion perception ,Crossmodal ,Distance ,Multisensory ,Multisensory integration ,Cognition ,Acoustic Stimulation ,Facilitation ,Auditory Perception ,Visual Perception ,Female ,Cues ,Psychology ,Photic Stimulation ,Cognitive psychology ,Human - Abstract
An object's motion relative to an observer can confer ethologically meaningful information. Approaching or looming stimuli can signal threats/collisions to be avoided or prey to be confronted, whereas receding stimuli can signal successful escape or failed pursuit. Using movement detection and subjective ratings, we investigated the multisensory integration of looming and receding auditory and visual information by humans. While prior research has demonstrated a perceptual bias for unisensory and more recently multisensory looming stimuli, none has investigated whether there is integration of looming signals between modalities. Our findings reveal selective integration of multisensory looming stimuli. Performance was significantly enhanced for looming stimuli over all other multisensory conditions. Contrasts with static multisensory conditions indicate that only multisensory looming stimuli resulted in facilitation beyond that induced by the sheer presence of auditory-visual stimuli. Controlling for variation in physical energy replicated the advantage for multisensory looming stimuli. Finally, only looming stimuli exhibited a negative linear relationship between enhancement indices for detection speed and for subjective ratings. Maximal detection speed was attained when motion perception was already robust under unisensory conditions. The preferential integration of multisensory looming stimuli highlights that complex ethologically salient stimuli likely require synergistic cooperation between existing principles of multisensory integration. A new conceptualization of the neurophysiologic mechanisms mediating real-world multisensory perceptions and action is therefore supported. © 2008 Elsevier Ltd. All rights reserved.
- Published
- 2009
38. Occipital Transcranial Magnetic Stimulation Has Opposing Effects on Visual and Auditory Stimulus Detection: Implications for Multisensory Interactions
- Author
-
Romei, Vincenzo, Murray, Micah M., Merabet, Lotfi B., and Thut, Gregor
- Subjects
Adult ,Male ,Visual perception ,Stimulation ,Stimulus (physiology) ,Primary visual cortex ,Perception ,Reaction Time ,Humans ,Auditory ,Visual Cortex ,Crossmodal ,Multisensory ,General Neuroscience ,Transcranial Magnetic Stimulation ,Electric Stimulation ,Visual cortex ,Acoustic Stimulation ,TMS ,Auditory Perception ,Visual Perception ,Female ,Visual ,Psychology ,Neuroscience ,Photic Stimulation ,Psychomotor Performance ,Human - Abstract
Multisensory interactions occur early in time and in low-level cortical areas, including primary cortices. To test current models of early auditory-visual (AV) convergence in unisensory visual brain areas, we studied the effect of transcranial magnetic stimulation (TMS) of visual cortex on behavioral responses to unisensory (auditory or visual) or multisensory (simultaneous auditory-visual) stimulus presentation. Single-pulse TMS was applied over the occipital pole at short delays (30-150 ms) after external stimulus onset. Relative to TMS over a control site, reaction times (RTs) to unisensory visual stimuli were prolonged by TMS at 60-75 ms poststimulus onset (visual suppression effect), confirming stimulation of functional visual cortex. Conversely, RTs to unisensory auditory stimuli were significantly shortened when visual cortex was stimulated by TMS at the same delays (beneficial interaction effect of auditory stimulation and occipital TMS). No TMS effect on RTs was observed for AV stimulation. The beneficial interaction effect of combined unisensory auditory and TMS-induced visual cortex stimulation matched and was correlated with the RT facilitation after external multisensory AV stimulation without TMS, suggestive of multisensory interactions between the stimulus-evoked auditory and TMS-induced visual cortex activities. A follow-up experiment showed that auditory input enhances excitability within visual cortex itself (using phosphene induction via TMS as a measure) over a similarly early time window (75-120 ms). The collective data support a mechanism of early auditory-visual interactions that is mediated by auditory-driven sensitivity changes in visual neurons that coincide in time with the initial volleys of visual input. Copyright © 2007 Society for Neuroscience.
- Published
- 2007