99 results for "Jennifer M. Groh"
Search Results
2. Hearing in a world of light: why, where, and how visual and auditory information are connected by the brain
- Author
-
Jennifer M. Groh
- Subjects
Hearing, embodied cognition, interaction between vision and hearing, EMREO, multiplexing, coordinate transformation, Human anatomy, QM1-695
- Abstract
Keynote by Jenny Groh (Duke University) at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019. Video stream: https://vimeo.com/356576513 Abstract: Information about eye movements with respect to the head is required for reconciling visual and auditory space. This keynote presentation describes recent findings concerning how eye movements affect early auditory processing via motor processes in the ear (eye movement-related eardrum oscillations, or EMREOs). Computational efforts to understand how eye movements are factored into auditory processing to produce a reference frame aligned with visual space uncovered a second critical issue: sound location is not mapped but is instead rate (meter) coded in the primate brain, unlike visual space. Meter coding would appear to limit the representation of multiple simultaneous sounds. The second part of this presentation concerns how such a meter code could use fluctuating activity patterns to circumvent this limitation.
- Published
- 2019
- Full Text
- View/download PDF
3. Multiple objects evoke fluctuating responses in several regions of the visual pathway
- Author
-
Meredith N Schmehl, Valeria C Caruso, Yunran Chen, Na Young Jun, Shawn M Willett, Jeff T Mohl, Douglas A Ruff, Marlene Cohen, Akinori F Ebihara, Winrich A Freiwald, Surya T Tokdar, and Jennifer M Groh
- Subjects
neural representation, neural code, multiplexing, object vision, figure ground segregation, visual system, Medicine, Science, Biology (General), QH301-705.5
- Abstract
How neural representations preserve information about multiple stimuli is mysterious. Because tuning of individual neurons is coarse (e.g., visual receptive field diameters can exceed perceptual resolution), the populations of neurons potentially responsive to each individual stimulus can overlap, raising the question of how information about each item might be segregated and preserved in the population. We recently reported evidence for a potential solution to this problem: when two stimuli were present, some neurons in the macaque visual cortical areas V1 and V4 exhibited fluctuating firing patterns, as if they responded to only one individual stimulus at a time (Jun et al., 2022). However, whether such an information encoding strategy is ubiquitous in the visual pathway and thus could constitute a general phenomenon remains unknown. Here, we provide new evidence that such fluctuating activity is also evoked by multiple stimuli in visual areas responsible for processing visual motion (middle temporal visual area, MT), and faces (middle fundus and anterolateral face patches in inferotemporal cortex – areas MF and AL), thus extending the scope of circumstances in which fluctuating activity is observed. Furthermore, consistent with our previous results in the early visual area V1, MT exhibits fluctuations between the representations of two stimuli when these form distinguishable objects but not when they fuse into one perceived object, suggesting that fluctuating activity patterns may underlie visual object formation. Taken together, these findings point toward an updated model of how the brain preserves sensory information about multiple stimuli for subsequent processing and behavioral action.
- Published
- 2024
- Full Text
- View/download PDF
4. Coordinated multiplexing of information about separate objects in visual cortex
- Author
-
Na Young Jun, Douglas A Ruff, Lily E Kramer, Brittany Bowes, Surya T Tokdar, Marlene R Cohen, and Jennifer M Groh
- Subjects
noise correlations, variability, multiplexing, population coding, object vision, Medicine, Science, Biology (General), QH301-705.5
- Abstract
Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some time period, suggesting a form of neural multiplexing of different stimuli (Caruso et al., 2018). Here, we investigate (a) whether such coding fluctuations occur in early visual cortical areas; (b) how coding fluctuations are coordinated across the neural population; and (c) how coordinated coding fluctuations depend on the parsing of stimuli into separate vs. fused objects. We found coding fluctuations do occur in macaque V1 but only when the two stimuli form separate objects. Such separate objects evoked a novel pattern of V1 spike count (‘noise’) correlations involving distinct distributions of positive and negative values. This bimodal correlation pattern was most pronounced among pairs of neurons showing the strongest evidence for coding fluctuations or multiplexing. Whether a given pair of neurons exhibited positive or negative correlations depended on whether the two neurons both responded better to the same object or had different object preferences. Distinct distributions of spike count correlations based on stimulus preferences were also seen in V4 for separate objects but not when two stimuli fused to form one object. These findings suggest multiple objects evoke different response dynamics than those evoked by single stimuli, lending support to the multiplexing hypothesis and suggesting a means by which information about multiple objects can be preserved despite the apparent coarseness of sensory coding.
- Published
- 2022
- Full Text
- View/download PDF
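The bimodal correlation pattern described in entry 4 lends itself to a small simulation. Below is a minimal sketch in Python (not the authors' analysis code; neuron counts, rates, and switching probabilities are all invented) showing how trial-by-trial switching between two objects yields positive spike-count correlations for pairs with the same object preference and negative correlations for pairs with opposite preferences:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 40 neurons, each preferring object A or B, recorded
# over 200 dual-object trials. On each trial every neuron stochastically
# codes one object, and which object dominates varies across trials.
n_neurons, n_trials = 40, 200
prefers_A = rng.random(n_neurons) < 0.5
rate_pref, rate_nonpref = 30.0, 10.0               # spikes/s

dominant_A = rng.random(n_trials) < 0.5            # shared trial-wise fluctuation
codes_A = rng.random((n_neurons, n_trials)) < np.where(dominant_A, 0.8, 0.2)

rates = np.where(codes_A == prefers_A[:, None], rate_pref, rate_nonpref)
counts = rng.poisson(rates * 0.2)                  # counts in a 200 ms window

# Spike-count ("noise") correlations for every neuron pair
z = (counts - counts.mean(1, keepdims=True)) / counts.std(1, keepdims=True)
corr = (z @ z.T) / n_trials

i, j = np.triu_indices(n_neurons, k=1)
same_pref = prefers_A[i] == prefers_A[j]
print("same-preference pairs, mean r:      %+.2f" % corr[i, j][same_pref].mean())
print("different-preference pairs, mean r: %+.2f" % corr[i, j][~same_pref].mean())
```

Running this prints a clearly positive mean correlation for same-preference pairs and a clearly negative one for different-preference pairs, the qualitative signature the abstract reports.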
5. Individual similarities and differences in eye-movement-related eardrum oscillations (EMREOs)
- Author
-
Cynthia D King, Stephanie N Lovich, David LK Murphy, Rachel Landrum, David Kaylie, Christopher A Shera, and Jennifer M Groh
- Subjects
Article
- Abstract
A unique type of low-frequency otoacoustic emission (OAE) time-locked to the onset (and offset) of saccadic eye movements was recently discovered in our laboratory (Gruters et al., 2018). The specific underlying mechanisms that generate these eye-movement-related eardrum oscillations (termed EMREOs) and their possible role in auditory perception are unknown. Clues to both the drivers of EMREOs and their purpose can be gleaned by examining responses in normal hearing human subjects. Do EMREOs occur in all individuals with normal hearing? If so, what components of the response occur most consistently? Understanding which attributes of EMREOs are similar across participants and which show more variability will provide the groundwork for future comparisons with individuals with hearing abnormalities affecting the ear’s various motor components. Here we report that in subjects with normal hearing thresholds and normal middle ear function, all ears show (a) measurable EMREOs, (b) a phase reversal for contra- versus ipsilaterally-directed saccades, (c) a large peak in the signal occurring soon after saccade onset, (d) an additional large peak time-locked to saccade offset and (e) evidence that saccade duration is encoded in the signal. We interpret the attributes of EMREOs that are most consistent across subjects as the ones that are most likely to play an essential role in their function. The individual differences likely reflect normal variation in individuals’ auditory system anatomy and physiology, much like traditional measures of auditory function such as auditory-evoked OAEs, tympanometry and auditory-evoked potentials. Future work will compare subjects with different types of auditory dysfunction to population data from normal hearing subjects.
- Published
- 2023
6. Visual Signals in the Mammalian Auditory System
- Author
-
Meredith N. Schmehl and Jennifer M. Groh
- Subjects
Auditory Cortex, Mammals, Inferior colliculus, genetic structures, Sensory processing, Computer science, medicine.medical_treatment, Sensation, Sense Organs, Multisensory integration, Sensory system, Auditory cortex, eye diseases, Ophthalmology, medicine.anatomical_structure, Saccade, Visual Perception, medicine, Biological neural network, Animals, Auditory system, Visual Pathways, Neurology (clinical), Neuroscience
- Abstract
Coordination between different sensory systems is a necessary element of sensory processing. Where and how signals from different sense organs converge onto common neural circuitry have become topics of increasing interest in recent years. In this article, we focus specifically on visual–auditory interactions in areas of the mammalian brain that are commonly considered to be auditory in function. The auditory cortex and inferior colliculus are two key points of entry where visual signals reach the auditory pathway, and both contain visual- and/or eye movement–related signals in humans and other animals. The visual signals observed in these auditory structures reflect a mixture of visual modulation of auditory-evoked activity and visually driven responses that are selective for stimulus location or features. These key response attributes also appear in the classic visual pathway but may play a different role in the auditory pathway: to modify auditory rather than visual perception. Finally, while this review focuses on two particular areas of the auditory pathway where this question has been studied, robust descending as well as ascending connections within this pathway suggest that undiscovered visual signals may be present at other stages as well.
- Published
- 2021
- Full Text
- View/download PDF
7. Coordinated multiplexing of information about separate objects in visual cortex
- Author
-
Brittany S. Bowes, Lily E. Kramer, Marlene R. Cohen, Douglas A. Ruff, Surya T. Tokdar, Jennifer M. Groh, and Na Young Jun
- Subjects
genetic structures, Sensory system, Biology, Stimulus (physiology), Multiplexing, General Biochemistry, Genetics and Molecular Biology, Correlation, 03 medical and health sciences, 0302 clinical medicine, Time-division multiplexing, medicine, Animals, Visual Cortex, 030304 developmental biology, Neurons, 0303 health sciences, General Immunology and Microbiology, General Neuroscience, Brain, General Medicine, Visual cortex, medicine.anatomical_structure, Receptive field, Macaca, Neuroscience, 030217 neurology & neurosurgery, Coding (social sciences)
- Abstract
Sensory receptive fields are large enough that they can contain more than one perceptible stimulus. How, then, can the brain encode information about each of the stimuli that may be present at a given moment? We recently showed that when more than one stimulus is present, single neurons can fluctuate between coding one vs. the other(s) across some time period, suggesting a form of neural multiplexing of different stimuli (Caruso et al., 2018). Here we investigate (a) whether such coding fluctuations occur in early visual cortical areas; (b) how coding fluctuations are coordinated across the neural population; and (c) how coordinated coding fluctuations depend on the parsing of stimuli into separate vs. fused objects. We found coding fluctuations do occur in macaque V1 but only when the two stimuli form separate objects. Such separate objects evoked a novel pattern of V1 spike count (“noise”) correlations involving distinct distributions of positive and negative values. This bimodal correlation pattern was most pronounced among pairs of neurons showing the strongest evidence for coding fluctuations or multiplexing. Whether a given pair of neurons exhibited positive or negative correlations depended on whether the two neurons both responded better to the same object or had different object preferences. Distinct distributions of spike count correlations based on stimulus preferences were also seen in V4 for separate objects but not when two stimuli fused to form one object. These findings suggest multiple objects evoke different response dynamics than those evoked by single stimuli, lending support to the multiplexing hypothesis and suggesting a means by which information about multiple objects can be preserved despite the apparent coarseness of sensory coding.
Significance Statement: How the brain separates information about multiple objects despite overlap in the neurons responsive to each item is not well understood. Here we show that some neurons in V1 exhibit coding fluctuations in response to two objects, and that these coding fluctuations are coordinated at the population level in ways that are not observed for single objects. Broadly similar results were obtained in V4. These response dynamics lend support to the hypothesis that information about individual objects may be multiplexed across the neural population, preserving information about each item despite the coarseness of sensory coding.
- Published
- 2022
- Full Text
- View/download PDF
8. Parametric information about eye movements is sent to the ears
- Author
-
Stephanie N Lovich, Cynthia D King, David LK Murphy, Rachel Landrum, Christopher A Shera, and Jennifer M Groh
- Abstract
Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect neural pathways from the ear through auditory cortex and beyond, but how these signals contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in eye movement-related eardrum oscillations (EMREOs), pressure changes recorded in the ear canal that occur in conjunction with simultaneous eye movements. We show that EMREOs contain parametric information about horizontal and vertical eye displacement as well as initial/final eye position with respect to the head. The parametric information in the horizontal and vertical directions combines linearly, allowing accurate prediction of the EMREOs associated with oblique eye movements from their respective horizontal and vertical components. Target location can also be inferred from the EMREO signals recorded during eye movements to those targets. We hypothesize that the thus-far unknown mechanism underlying EMREOs could impose a two-dimensional eye-movement related transfer function on any incoming sound, permitting subsequent processing stages to compute the positions of sounds in relation to the visual scene.
- Published
- 2022
- Full Text
- View/download PDF
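The linearity claim in entry 8 (oblique EMREOs predicted from horizontal and vertical components) corresponds to a simple linear model, p(t) = dx·h(t) + dy·v(t). The sketch below, with invented placeholder waveforms rather than measured EMREO data, shows how per-degree basis waveforms could be estimated by least squares and then used to predict an oblique movement:

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented per-degree basis waveforms standing in for real EMREO components
t = np.linspace(0.0, 0.1, 200)                              # 100 ms, in seconds
true_h = np.sin(2 * np.pi * 30 * t) * np.exp(-t / 0.03)    # horizontal component
true_v = np.cos(2 * np.pi * 30 * t) * np.exp(-t / 0.03)    # vertical component

# Simulate 100 saccades with random displacements and noisy ear-canal recordings
disp = rng.uniform(-15, 15, size=(100, 2))                  # (dx, dy) in degrees
recordings = disp @ np.vstack([true_h, true_v]) + rng.normal(0, 0.5, (100, t.size))

# Least-squares estimate of the basis waveforms across saccades
basis_hat, *_ = np.linalg.lstsq(disp, recordings, rcond=None)

# Predict the EMREO for a new oblique saccade from its components alone
dx, dy = 12 * np.cos(np.radians(30)), 12 * np.sin(np.radians(30))
prediction = dx * basis_hat[0] + dy * basis_hat[1]
truth = dx * true_h + dy * true_v
print("rms prediction error:", np.sqrt(np.mean((prediction - truth) ** 2)))
```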
9. Author response: Coordinated multiplexing of information about separate objects in visual cortex
- Author
-
Na Young Jun, Douglas A Ruff, Lily E Kramer, Brittany Bowes, Surya T Tokdar, Marlene R Cohen, and Jennifer M Groh
- Published
- 2022
- Full Text
- View/download PDF
10. Monkeys and humans implement causal inference to simultaneously localize auditory and visual stimuli
- Author
-
John M. Pearson, Jennifer M. Groh, and Jeff T. Mohl
- Subjects
Adult, Male, Visual perception, Computer science, Physiology, media_common.quotation_subject, Sensory system, Task (project management), Thinking, 03 medical and health sciences, 0302 clinical medicine, Perception, Saccades, Animals, Humans, Sound Localization, Eye-Tracking Technology, media_common, 030304 developmental biology, 0303 health sciences, General Neuroscience, Eye movement, Saccadic masking, Behavioral modeling, Space Perception, Causal inference, Saccade, Auditory Perception, Visual Perception, Female, Psychology, Neuroscience, 030217 neurology & neurosurgery, Cognitive psychology, Research Article
- Abstract
The environment is sampled by multiple senses, which are woven together to produce a unified perceptual state. However, optimally unifying such signals requires assigning particular signals to the same or different underlying objects or events. Many prior studies (especially in animals) have assumed fusion of cross-modal information, whereas recent work in humans has begun to probe the appropriateness of this assumption. Here we present results from a novel behavioral task in which both monkeys and humans localized visual and auditory stimuli and reported their perceived sources through saccadic eye movements. When the locations of visual and auditory stimuli were widely separated, subjects made two saccades, while when the two stimuli were presented at the same location they made only a single saccade. Intermediate levels of separation produced mixed response patterns: a single saccade to an intermediate position on some trials or separate saccades to both locations on others. The distribution of responses was well described by a hierarchical causal inference model that accurately predicted both the explicit “same vs. different” source judgements and the biases in localization of the source(s) under each of these conditions. The results from this task are broadly consistent with prior work in humans across a wide variety of analogous tasks, extending the study of multisensory causal inference to non-human primates and to a natural behavioral task with both a categorical assay of the number of perceived sources and a continuous report of the perceived position of the stimuli.
Author Summary: We experience the world through multiple sensory systems, which interact to shape perception. To do so, the brain must first determine which pieces of sensory input arise from the same source and which have nothing to do with one another. To probe how the brain accomplishes this causal inference, we developed a naturalistic paradigm that provides a behavioral report both of the number of perceived stimuli and their locations. We tested performance on this task in both humans and monkeys, and we found that both species perform causal inference in a similar manner. By providing this cross-species comparison at the behavioral level, our paradigm lays the groundwork for future experiments using neuronal recording techniques that may be impractical or impossible in human subjects.
- Published
- 2020
- Full Text
- View/download PDF
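The hierarchical causal inference model in entry 10 belongs to a well-studied family; a minimal sketch of that family (a Körding-style formulation with invented noise and prior parameters, which may differ from the paper's actual model) computes the posterior probability of a single source and so predicts one saccade versus two:

```python
import numpy as np

sigma_v, sigma_a = 2.0, 8.0     # visual/auditory localization noise (deg), invented
sigma_p, p_common = 15.0, 0.5   # prior spread over locations; prior P(one source)

def likelihood_common(xv, xa):
    """P(xv, xa | one source), with the source location integrated out."""
    var = sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2 + sigma_a**2 * sigma_p**2
    quad = (xv - xa)**2 * sigma_p**2 + xv**2 * sigma_a**2 + xa**2 * sigma_v**2
    return np.exp(-quad / (2 * var)) / (2 * np.pi * np.sqrt(var))

def likelihood_separate(xv, xa):
    """P(xv, xa | two independent sources)."""
    def marg(x, sig):
        v = sig**2 + sigma_p**2
        return np.exp(-x**2 / (2 * v)) / np.sqrt(2 * np.pi * v)
    return marg(xv, sigma_v) * marg(xa, sigma_a)

def p_one_source(xv, xa):
    l1, l2 = likelihood_common(xv, xa), likelihood_separate(xv, xa)
    return l1 * p_common / (l1 * p_common + l2 * (1 - p_common))

# Nearby cues favor a single source (one saccade); distant cues favor two.
print(p_one_source(1.0, 3.0))    # high
print(p_one_source(1.0, 25.0))   # low
```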
11. Making Space: How the Brain Knows Where Things Are
- Author
-
Jennifer M. Groh
- Published
- 2014
12. Converting neural signals from place codes to rate codes.
- Author
-
Jennifer M. Groh
- Published
- 2001
- Full Text
- View/download PDF
13. Editor's evaluation: Microsaccades as a marker not a cause for attention-related modulation
- Author
-
Jennifer M Groh
- Published
- 2021
- Full Text
- View/download PDF
14. Author response for 'Multiple sounds degrade the frequency representation in monkey inferior colliculus'
- Author
-
Shawn M. Willett and Jennifer M. Groh
- Published
- 2021
- Full Text
- View/download PDF
15. Editor's evaluation: Corticofugal regulation of predictive coding
- Author
-
Jennifer M Groh
- Published
- 2021
- Full Text
- View/download PDF
16. Decision letter: Corticofugal regulation of predictive coding
- Author
-
Jennifer M Groh
- Published
- 2021
- Full Text
- View/download PDF
17. Author response for 'Multiple sounds degrade the frequency representation in monkey inferior colliculus'
- Author
-
Jennifer M. Groh and Shawn M. Willett
- Subjects
Inferior colliculus, Computer science, Speech recognition, Representation (systemics)
- Published
- 2021
- Full Text
- View/download PDF
18. Analyzing second order stochasticity of neural spiking under stimuli-bundle exposure
- Author
-
Surya T. Tokdar, Jennifer M. Groh, Azeem Zaman, Valeria C. Caruso, Chris Glynn, Shawn M. Willett, and Jeff T. Mohl
- Subjects
FOS: Computer and information sciences, Statistics and Probability, Quantitative Biology::Neurons and Cognition, Computer science, Stochastic process, Spike train, Stimulus (physiology), Bayesian inference, Statistics - Applications, Article, Synthetic data, Point process, symbols.namesake, Modeling and Simulation, Bundle, symbols, Applications (stat.AP), Statistics, Probability and Uncertainty, Biological system, Gaussian process
- Abstract
Conventional analysis of neuroscience data involves computing average neural activity over a group of trials and/or a period of time. This approach may be particularly problematic when assessing the response patterns of neurons to more than one simultaneously presented stimulus. In such cases, the brain must represent each individual component of the stimuli bundle, but trial-and-time-pooled averaging methods are fundamentally unequipped to address the means by which multi-item representation occurs. We introduce and investigate a novel statistical analysis framework that relates the firing pattern of a single cell, exposed to a stimuli bundle, to the ensemble of its firing patterns under each constituent stimulus. Existing statistical tools focus on what may be called "first order stochasticity" in trial-to-trial variation in the form of unstructured noise around a fixed firing rate curve associated with a given stimulus. Our analysis is based upon the theoretical premise that exposure to a stimuli bundle induces additional stochasticity in the cell's response pattern, in the form of a stochastically varying recombination of its single stimulus firing rate curves. We discuss challenges to statistical estimation of such "second order stochasticity" and address them with a novel dynamic admixture Poisson process (DAPP) model. DAPP is a hierarchical point process model that decomposes second order stochasticity into a Gaussian stochastic process and a random vector of interpretable features, and facilitates borrowing of information on the latter across repeated trials through latent clustering. We present empirical evidence of the utility of the DAPP analysis with synthetic and real neural recordings.
- Published
- 2021
- Full Text
- View/download PDF
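The premise behind the DAPP model in entry 18, that a dual-stimulus response is a stochastically varying recombination of the single-stimulus rate curves, can be illustrated with a short simulation (a sketch of the premise only, not the DAPP inference machinery; the rate curves and smoothness parameter are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

t = np.linspace(0, 1, 100)            # 1 s trial in 10 ms bins
rate_A = 40 * np.exp(-t / 0.5)        # invented rate curve, stimulus A alone
rate_B = 10 + 20 * t                  # invented rate curve, stimulus B alone

def simulate_dual_trial(smoothness=10.0):
    """One A+B trial: a smooth random weight alpha(t) mixes the two curves
    ("second order" stochasticity), then Poisson noise is added on top
    ("first order" stochasticity)."""
    drive = np.cumsum(rng.normal(0, 1, t.size)) / smoothness
    alpha = 1 / (1 + np.exp(-drive))
    rate = alpha * rate_A + (1 - alpha) * rate_B
    return rng.poisson(rate * (t[1] - t[0]))

totals = np.array([simulate_dual_trial().sum() for _ in range(200)])
# The trial-varying mixture makes counts overdispersed relative to Poisson:
print("variance/mean of trial totals:", totals.var() / totals.mean())
```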
19. Editor's evaluation: Distinct higher-order representations of natural sounds in human and ferret auditory cortex
- Author
-
Jennifer M. Groh
- Subjects
Computer science, Order (business), Speech recognition, Auditory cortex, Natural sounds
- Published
- 2021
- Full Text
- View/download PDF
20. Two models for transforming auditory signals from head-centered to eye-centered coordinates.
- Author
-
Jennifer M. Groh and D. L. Sparks
- Published
- 1992
- Full Text
- View/download PDF
21. Evidence for a system in the auditory periphery that may contribute to linking sounds and images in space
- Author
-
David LK Murphy, Cynthia D King, Stephanie N Lovich, Rachel E Landrum, Christopher A Shera, and Jennifer M Groh
- Subjects
Audio signal, genetic structures, business.industry, Computer science, Eye movement, Space (commercial competition), Auditory cortex, Signal, eye diseases, medicine.anatomical_structure, medicine, Auditory pathways, Computer vision, Artificial intelligence, business, Eardrum, Reference frame
- Abstract
Eye movements alter the relationship between the visual and auditory spatial scenes. Signals related to eye movements affect the brain’s auditory pathways from the ear through auditory cortex and beyond, but how these signals might contribute to computing the locations of sounds with respect to the visual scene is poorly understood. Here, we evaluated the information contained in the signals observed at the earliest processing stage, eye movement-related eardrum oscillations (EMREOs). We report that human EMREOs carry information about both horizontal and vertical eye displacement as well as initial/final eye position. We conclude that all of the information necessary to contribute to a suitable coordinate transformation of auditory spatial cues into a common reference frame with visual information is present in this signal. We hypothesize that the underlying mechanism causing EMREOs could impose a transfer function on any incoming sound signal, which could permit subsequent processing stages to compute the positions of sounds in relation to the visual scene.
- Published
- 2020
- Full Text
- View/download PDF
22. Multiple sounds degrade the frequency representation in monkey inferior colliculus
- Author
-
Jennifer M. Groh and Shawn M. Willett
- Subjects
Sound (medical instrument), Inferior colliculus, Frequency response, Frequency selectivity, Non human primate, Computer science, General Neuroscience, Representation (systemics), Stimulus (physiology), Macaca mulatta, Inferior Colliculi, Neural activity, Sound, medicine.anatomical_structure, Acoustic Stimulation, Receptive field, otorhinolaryngologic diseases, medicine, Animals, Sound Localization, Neuron, Representation (mathematics), Neural coding, Neuroscience
- Abstract
How we distinguish multiple simultaneous stimuli is uncertain, particularly given that such stimuli sometimes recruit largely overlapping populations of neurons. One commonly proposed hypothesis is that the sharpness of tuning curves might change to limit the number of stimuli driving any given neuron when multiple stimuli are present. To test this hypothesis, we recorded the activity of neurons in the inferior colliculus while monkeys made saccades to either one or two simultaneous sounds differing in frequency and spatial location. Although monkeys easily distinguished simultaneous sounds (∼90% correct performance), the frequency selectivity of inferior colliculus neurons on dual sound trials did not improve in any obvious way. Frequency selectivity was degraded on dual sound trials compared to single sound trials: neural response functions broadened, and frequency accounted for less of the variance in firing rate. These changes in neural firing led a maximum-likelihood decoder to perform worse on dual sound trials than on single sound trials. These results fail to support the hypothesis that changes in frequency response functions serve to reduce the overlap in the representation of simultaneous sounds. Instead, these results suggest that alternative possibilities, such as recent evidence of alternations in firing rate between the rates corresponding to each of the two stimuli, offer a more promising approach.
Graphic Abstract: How sensory representations encode multiple stimuli despite coarse coding is unknown. Using a maximum likelihood decoder operating on the spike count response patterns of monkey inferior colliculus neurons, we show a marked reduction in decoding accuracy when two sounds are presented compared to one. The decoding was inferior to the behavioral performance of the animals, and thus suggests the presence of alternative coding strategies.
- Published
- 2020
- Full Text
- View/download PDF
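A maximum-likelihood decoder of the kind used in entry 22 can be sketched in a few lines, assuming (as a simplification) independent Poisson spike counts and invented frequency tuning curves; the paper's decoder details may differ:

```python
import numpy as np

rng = np.random.default_rng(2)

freqs = np.array([420.0, 742.0, 1312.0, 2320.0])   # candidate frequencies (Hz)
n_neurons = 30
centers = rng.uniform(np.log(300), np.log(3000), n_neurons)
# tuning[i, j]: mean spike count of neuron i when frequency j is presented
tuning = 5 + 25 * np.exp(-(np.log(freqs)[None, :] - centers[:, None])**2 / 0.5)

def decode_ml(counts):
    """Return the frequency maximizing the Poisson log likelihood of counts."""
    loglik = counts @ np.log(tuning) - tuning.sum(axis=0)
    return freqs[np.argmax(loglik)]

# On a single-sound trial the decoder usually recovers the true frequency;
# broadened dual-sound responses would blur exactly this likelihood.
true_j = 2
counts = rng.poisson(tuning[:, true_j])
print("decoded:", decode_ml(counts), "Hz; true:", freqs[true_j], "Hz")
```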
23. Sounds and beyond: multisensory and other non-auditory signals in the inferior colliculus
- Author
-
Kurtis G Gruters and Jennifer M Groh
- Subjects
Communication, Sound Localization, auditory, multisensory, inferior colliculus, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571
- Abstract
The inferior colliculus (IC) is a major processing center situated mid-way along both the ascending and descending auditory pathways of the brain stem. Although it is fundamentally an auditory area, the IC also receives anatomical input from non-auditory sources. Neurophysiological studies corroborate that non-auditory stimuli can modulate auditory processing in the IC and even elicit responses independent of coincident auditory stimulation. In this article, we review anatomical and physiological evidence for multisensory and other non-auditory processing in the IC. Specifically, the contributions of signals related to vision, eye movements and position, somatosensation, and behavioral context to neural activity in the IC will be described. These signals are potentially important for localizing sound sources, attending to salient stimuli, distinguishing environmental from self-generated sounds, and perceiving and generating communication sounds. They suggest that the IC should be thought of as a node in a highly interconnected sensory, motor, and cognitive network dedicated to synthesizing a higher-order auditory percept rather than simply reporting patterns of air pressure detected by the cochlea. We highlight some of the potential pitfalls that can arise from experimental manipulations that may disrupt the normal function of this network, such as the use of anesthesia or the severing of connections from cortical structures that project to the IC. Finally, we note that the presence of these signals in the IC has implications for our understanding not just of the IC but also of the multitude of other regions within and beyond the auditory system that are dependent on signals that pass through the IC. Whatever the IC hears would seem to be passed both upward to thalamus and thence to auditory cortex and beyond, as well as downward via centrifugal connections to earlier areas of the auditory pathway such as the cochlear nucleus.
- Published
- 2012
- Full Text
- View/download PDF
24. Distribution of visual and saccade related information in the monkey inferior colliculus
- Author
-
David A Bulkin and Jennifer M Groh
- Subjects
auditory, multisensory, Vision, cross-modal, topographic maps, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571
- Abstract
The inferior colliculus (IC) is an essential stop early in the ascending auditory pathway. Though normally thought of as a predominantly auditory structure, recent work has uncovered a variety of non-auditory influences on firing rate in the IC. Here, we map the location within the IC of neurons that respond to the onset of a fixation-guiding visual stimulus. Visual/visuomotor associated activity was found throughout the IC (overall, 84 of 199 sites tested or 42%), but with a far reduced prevalence and strength along recording penetrations passing through the tonotopically organized region of the IC, putatively the central nucleus (11 of 42 sites tested, or 26%). These results suggest that visual information has only a weak effect on early auditory processing in core regions, but more strongly targets the modulatory shell regions of the IC.
- Published
- 2012
- Full Text
- View/download PDF
25. Effects of initial eye position on saccades evoked by microstimulation in the primate superior colliculus: implications for models of the SC read-out process
- Author
-
Jennifer M Groh
- Subjects
monkey, oculomotor, superior colliculus, Eye position, pulse-step generator, reference frame, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571, Neurology. Diseases of the nervous system, RC346-429
- Abstract
The motor layers of the superior colliculus (SC) are thought to specify saccade amplitude and direction, independent of initial eye position. However, recent evidence suggests that eye position can modulate the level of activity of SC motor neurons. In this study, we tested whether initial eye position has an effect on microstimulation-evoked saccade amplitude. High (>300 Hz) and low (<300 Hz) …
- Published
- 2011
- Full Text
- View/download PDF
26. Comparison of gain-like properties of eye position signals in inferior colliculus versus auditory cortex of primates
- Author
-
Joost X Maier and Jennifer M Groh
- Subjects
Auditory Cortex, inferior colliculus, Primate, Eye position, Gain field, Neurosciences. Biological psychiatry. Neuropsychiatry, RC321-571, Neurology. Diseases of the nervous system, RC346-429
- Abstract
We evaluated to what extent the influence of eye position in the auditory pathway of primates can be described as a gain field. We compared single unit activity in the inferior colliculus (IC), core auditory cortex (A1) and the caudomedial belt (CM) region of auditory cortex in primates, and found stronger evidence for gain field-like interactions in the IC than in auditory cortex. In the inferior colliculus, eye position signals showed both multiplicative and additive interactions with auditory responses, whereas in auditory cortex the effects were not as well predicted by a gain field model.
- Published
- 2010
- Full Text
- View/download PDF
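The gain-field comparison in entry 26 reduces to a compact equation: eye position can scale the auditory drive (multiplicative), shift the baseline (additive), or both. A toy sketch with illustrative parameter values, not values fitted to the recordings:

```python
def response(auditory_drive, eye_pos_deg, gain=0.02, offset=0.3, baseline=5.0):
    """Firing rate (spikes/s) under a simple gain-field model:
    the multiplicative term scales the evoked drive by eye position;
    the additive term shifts the rate independently of the sound."""
    multiplicative = auditory_drive * (1 + gain * eye_pos_deg)
    additive = offset * eye_pos_deg
    return baseline + multiplicative + additive

drive = 20.0                               # spikes/s evoked with eyes at center
for eye in (-20, 0, 20):                   # horizontal eye position (deg)
    print(f"eye at {eye:+d} deg -> {response(drive, eye):.1f} spikes/s")
```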
27. Space for Thought
- Author
-
Jennifer M. Groh
- Subjects
Theoretical physics, Computer science, Space (mathematics)
- Abstract
This Reflection concerns how the brain represents space and how such spatial representations may relate to our cognitive abilities. Space is central to how the brain encodes information, whether it concerns what we see, hear, or feel or how we move through our environment. Two different kinds of spatial signals have been observed in the brain: maps, in which different neurons are responsive to different locations of external stimuli, and meters, in which neurons are sensitive to a broad range of locations but can signal the position of a stimulus via an overall level of activity. These spatial codes may be recruited in the brain not only for processing the immediate spatial environment but also for thought and language. Evidence for this view comes from patterns of spatial sensory and motor metaphors in language and from brain-imaging studies suggesting a relationship between the neural substrates for language and those deployed for sensory and motor processing. Such parallels in functionality may have emerged in an evolutionary process of duplicating the brain’s primary sensory and motor areas and repurposing them for new tasks, i.e. our cognitive abilities.
- Published
- 2020
- Full Text
- View/download PDF
28. Hemisphere-specific properties of the ventriloquism aftereffect
- Author
-
Barbara G. Shinn-Cunningham, Norbert Kopčo, Jennifer M. Groh, I-Fan Lin, and Peter Lokša
- Subjects
0303 health sciences, Acoustics and Ultrasonics, Eye Movements, Computer science, Speech recognition, Brain, Stimulus (physiology), Illusions, Functional Laterality, Jasa Express Letters, 03 medical and health sciences, Young Adult, 0302 clinical medicine, Arts and Humanities (miscellaneous), Figural Aftereffect, Modulation (music), Speech Perception, Humans, Sound Localization, Cues, 030217 neurology & neurosurgery, 030304 developmental biology, Reference frame
- Abstract
Visual calibration of auditory space requires re-alignment of representations differing in (1) format (auditory hemispheric channels vs visual maps) and (2) reference frames (head-centered vs eye-centered). Here, a ventriloquism paradigm from Kopco, Lin, Shinn-Cunningham, and Groh [J. Neurosci. 29, 13809–13814 (2009)] was used to examine these processes in humans for ventriloquism induced within one spatial hemifield. Results show that (1) the auditory representation can be adapted even by aligned audio-visual stimuli, and (2) the spatial reference frame is primarily head-centered, with a weak eye-centered modulation. These results support the view that the ventriloquism aftereffect is driven by multiple spatially non-uniform, hemisphere-specific processes.
- Published
- 2019
29. Sensitivity and specificity of a Bayesian single trial analysis for time varying neural signals
- Author
-
Jeff T. Mohl, Surya T. Tokdar, Jennifer M. Groh, and Valeria C. Caruso
- Subjects
0303 health sciences, business.industry, Bayesian probability, Single stimulus, Pattern recognition, Stimulus (physiology), Correct response, Article, Synthetic data, 03 medical and health sciences, Neural activity, 0302 clinical medicine, Categorization, Quantitative Biology - Neurons and Cognition, FOS: Biological sciences, Neurons and Cognition (q-bio.NC), Artificial intelligence, Single trial, business, 030217 neurology & neurosurgery, 030304 developmental biology, Mathematics
- Abstract
We recently reported the existence of fluctuations in neural signals that may permit neurons to code multiple simultaneous stimuli sequentially across time. This required deploying a novel statistical approach to permit investigation of neural activity at the scale of individual trials. Here we present tests using synthetic data to assess the sensitivity and specificity of this analysis. We fabricated datasets to match each of several potential response patterns derived from single-stimulus response distributions. In particular, we simulated dual stimulus trial spike counts that reflected fluctuating mixtures of the single stimulus spike counts, stable intermediate averages, single stimulus winner-take-all, or response distributions that were outside the range defined by the single stimulus responses (such as summation or suppression). We then assessed how well the analysis recovered the correct response pattern as a function of the number of simulated trials and the difference between the simulated responses to each "stimulus" alone. We found excellent recovery of the mixture, intermediate, and outside categories (>97% correct), and good recovery of the single/winner-take-all category (>90% correct) when the number of trials was >20 and the single-stimulus response rates were 50 Hz and 20 Hz respectively. Both larger numbers of trials and greater separation between the single stimulus firing rates improved categorization accuracy. These results provide a benchmark, and guidelines for data collection, for use of this method to investigate coding of multiple items at the individual-trial time scale.
- Published
- 2019
- Full Text
- View/download PDF
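The fabrication step in entry 29 can be sketched directly from the single-stimulus Poisson rates (using the 50 and 20 spikes/s example rates from the abstract; the overdispersion check at the end is a crude stand-in for the full Bayesian model comparison, not the method itself):

```python
import numpy as np

rng = np.random.default_rng(3)

rate_a, rate_b = 50.0, 20.0        # single-stimulus rates from the test case
n_trials, window = 25, 1.0         # trials per condition; 1 s counting window

def fabricate(pattern):
    """Synthesize dual-stimulus spike counts under one response hypothesis."""
    if pattern == "mixture":       # whole trials drawn from A's or B's rate
        pick = rng.random(n_trials) < 0.5
        return rng.poisson(np.where(pick, rate_a, rate_b) * window)
    if pattern == "intermediate":  # stable average of the two rates
        return rng.poisson((rate_a + rate_b) / 2 * window, n_trials)
    if pattern == "outside":       # beyond the single-stimulus range (summation)
        return rng.poisson((rate_a + rate_b) * window, n_trials)

# Mixtures are strongly overdispersed relative to any fixed-rate Poisson:
for pattern in ("mixture", "intermediate", "outside"):
    counts = fabricate(pattern)
    print(pattern, "variance/mean =", round(counts.var() / counts.mean(), 2))
```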
30. Compensating for a shifting world: evolving reference frames of visual and auditory signals across three multimodal brain areas
- Author
-
Daniel S. Pages, Valeria C. Caruso, Jennifer M. Groh, and Marc A. Sommer
- Subjects
Superior Colliculi, Time Factors, genetic structures, Physiology, Computer science, Posterior parietal cortex, Sensory system, Stimulus (physiology), Biology, 03 medical and health sciences, 0302 clinical medicine, Stimulus modality, Parietal Lobe, Saccades, Animals, 030304 developmental biology, 0303 health sciences, General Neuroscience, Superior colliculus, Frontal eye fields, Macaca mulatta, eye diseases, Frontal Lobe, Acoustic Stimulation, Receptive field, Saccade, Auditory Perception, Visual Perception, Neuroscience, Photic Stimulation, 030217 neurology & neurosurgery, Research Article
- Abstract
Stimulus locations are detected differently by different sensory systems, but ultimately they yield similar percepts and behavioral responses. How the brain transcends initial differences to compute similar codes is unclear. We quantitatively compared the reference frames of two sensory modalities, vision and audition, across three interconnected brain areas involved in generating saccades, namely the frontal eye fields (FEF), lateral and medial parietal cortex (M/LIP), and superior colliculus (SC). We recorded from single neurons in head-restrained monkeys performing auditory- and visually-guided saccades from variable initial fixation locations, and evaluated whether their receptive fields were better described as eye-centered, head-centered, or hybrid (i.e. not anchored uniquely to head- or eye-orientation). We found a progression of reference frames across areas and across time, with considerable hybrid-ness and persistent differences between modalities during most epochs/brain regions. For both modalities, the SC was more eye-centered than the FEF, which in turn was more eye-centered than the predominantly hybrid M/LIP. In all three areas and temporal epochs from stimulus onset to movement, visual signals were more eye-centered than auditory signals. In the SC and FEF, auditory signals became more eye-centered at the time of the saccade than they were initially after stimulus onset, but only in the SC at the time of the saccade did the auditory signals become predominantly eye-centered. The results indicate that visual and auditory signals both undergo transformations, ultimately reaching the same final reference frame but via different dynamics across brain regions and time.
New and Noteworthy: Models for visual-auditory integration posit that visual signals are eye-centered throughout the brain, while auditory signals are converted from head-centered to eye-centered coordinates. We show instead that both modalities largely employ hybrid reference frames: neither fully head- nor eye-centered. Across three hubs of the oculomotor network (intraparietal cortex, frontal eye field and superior colliculus) visual and auditory signals evolve from hybrid to a common eye-centered format via different dynamics across brain areas and time.
- Published
- 2019
- Full Text
- View/download PDF
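One standard reference-frame test behind analyses like those in entry 30 compares tuning curves measured from two fixations: if the curves align when plotted relative to the eye, the code is eye-centered; if they align relative to the head, it is head-centered. A minimal sketch with invented Gaussian tuning (the paper's actual analysis is more elaborate and allows hybrid outcomes):

```python
import numpy as np

rng = np.random.default_rng(4)

targets_head = np.arange(-24.0, 25.0, 6.0)     # target locations re: head (deg)
fixations = (-6.0, 6.0)                        # two initial eye positions (deg)

def tuning(center, x):                         # invented Gaussian spatial tuning
    return 10 + 40 * np.exp(-(x - center)**2 / (2 * 8.0**2))

def classify(anchored_to):
    """Simulate tuning curves from both fixations, then classify the frame."""
    curves = []
    for fix in fixations:
        center = 5 + (fix if anchored_to == "eye" else 0)
        curves.append(tuning(center, targets_head) + rng.normal(0, 1, targets_head.size))
    r_head = np.corrcoef(curves[0], curves[1])[0, 1]          # aligned re: head
    shift = int((fixations[1] - fixations[0]) // 6)           # 12 deg = 2 bins
    r_eye = np.corrcoef(curves[0][:-shift], curves[1][shift:])[0, 1]  # re: eye
    return "eye-centered" if r_eye > r_head else "head-centered"

print(classify("eye"), "/", classify("head"))  # -> eye-centered / head-centered
```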
31. Hemisphere-Specific Properties of the Ventriloquism Aftereffect in Humans and Monkeys
- Author
-
Peter Lokša, Jennifer M. Groh, Barbara G. Shinn-Cunningham, I-Fan Lin, and Norbert Kopčo
- Subjects
0303 health sciences, business.industry, Calibration (statistics), Computer science, Representation (systemics), Space (commercial competition), Stimulus (physiology), 03 medical and health sciences, 0302 clinical medicine, Computer vision, Artificial intelligence, business, 030217 neurology & neurosurgery, 030304 developmental biology, Reference frame
- Abstract
Visual calibration of auditory space requires re-alignment of representations differing in 1) format (auditory hemispheric channels vs. visual maps) and 2) reference frames (head-centered vs. eye-centered). Here, a ventriloquism paradigm from Kopčo et al. (J Neurosci, 29, 13809-13814) was used to examine these processes in humans and monkeys for ventriloquism induced within one spatial hemifield. Results show that 1) the auditory representation is adapted even by aligned audio-visual stimuli, and 2) the spatial reference frame is primarily head-centered in humans but mixed in monkeys. These results support the view that the ventriloquism aftereffect is driven by multiple spatially non-uniform processes.
- Published
- 2019
- Full Text
- View/download PDF
32. Decision letter: Population rate-coding predicts correctly that human sound localization depends on sound intensity
- Author
-
Jennifer M. Groh, Heather Read, and Catherine E. Carr
- Subjects
Sound localization, education.field_of_study, Computer science, Acoustics, Population, education, Neural coding, Sound intensity
- Published
- 2019
- Full Text
- View/download PDF
33. Functional properties of circuits, cellular populations, and areas
- Author
-
Kenneth D. Harris, Jennifer M. Groh, James DiCarlo, Pascal Fries, Matthias Kaschube, Gilles Laurent, Jason N. MacLean, David A. McCormick, Gordon Pipa, John H. Reynolds, Andrew B. Schwartz, Terrence J. Sejnowski, Wolf Singer, and Martin Vinck
- Abstract
A central goal of systems neuroscience is to understand how the brain represents and processes information to guide behavior (broadly defined as encompassing perception, cognition, and observable outcomes of those mental states through action). These concepts have been central to research in this field for at least sixty years, and research efforts have taken a variety of approaches. At this Forum, our discussions focused on what is meant by “functional” and “inter-areal,” what new concepts have emerged over the last several decades, and how we need to update and refresh these concepts and approaches for the coming decade.
In this chapter, we consider some of the historical conceptual frameworks that have shaped consideration of neural coding and brain function, with an eye toward what aspects have held up well, what aspects need to be revised, and what new concepts may foster future work.
Conceptual frameworks need to be revised periodically lest they become counterproductive and actually blind us to the significance of novel discoveries. Take, for example, hippocampal place cells: their accidental discovery led to the generation of new conceptual frameworks linking phenomena (e.g., memory, spatial navigation, and sleep) that previously seemed disparate, revealing unimagined mechanistic connections. Progress in scientific understanding requires an iterative loop from experiment to model/theory and back. Without such periodic reassessment, fields of scientific inquiry risk becoming bogged down by the propagation of outdated frameworks, often across multiple generations of researchers. This not only limits the impact of the truly new and unexpected, it hinders the pace of progress.
- Published
- 2019
34. Hearing in a 'Moving' Visual World: Coordinate Transformations Along the Auditory Pathway
- Author
-
Shawn M. Willett, Jennifer M. Groh, and Ross K. Maddox
- Subjects
Sound localization, Inferior colliculus, genetic structures, business.industry, Computer science, Coordinate system, Multisensory integration, eye diseases, Sight, Position (vector), Eye tracking, Computer vision, sense organs, Artificial intelligence, business, Reference frame
- Abstract
This chapter reviews the literature on how auditory signals are transformed into a coordinate system that facilitates interactions with the visual system. Sound location is deduced from cues that depend on the position of the sound with respect to the head, but visual location is deduced from the pattern of light illuminating the retina, yielding an eye-centered code. Connecting sights and sounds originating from the same position in the physical world requires the brain to incorporate information about the position of the eyes with respect to the head. Eye position has been found to interact with auditory signals at all levels of the auditory pathway that have been tested but usually yields a code that is in a hybrid reference frame: neither head nor eye centered. Computing a coordinate transformation, in principle, may be easy, which could suggest that the looseness of the computational constraints may permit hybrid coding. A review of the behavioral literature addressing the effects of eye gaze on auditory spatial perception and a discussion of its consistency with physiological observations concludes the chapter.
- Published
- 2019
- Full Text
- View/download PDF
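The core arithmetic of the transformation reviewed in entry 34 is a subtraction: a sound's eye-centered direction is its head-centered direction minus the current eye-in-head position. A one-dimensional toy version (real transformations also involve vertical geometry and, neurally, gain-field-like mechanisms):

```python
def head_to_eye_centered(sound_re_head_deg: float, eye_in_head_deg: float) -> float:
    """Eye-centered azimuth of a sound, given head-centered azimuth and
    horizontal eye-in-head position (all in degrees, rightward positive)."""
    return sound_re_head_deg - eye_in_head_deg

# A sound 10 deg right of the head, with gaze 25 deg right, lies 15 deg
# to the left of the line of sight:
print(head_to_eye_centered(10.0, 25.0))   # -15.0
```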
35. Beyond the labeled line: variation in visual reference frames from intraparietal cortex to frontal eye fields and the superior colliculus
- Author
-
Daniel S. Pages, Jennifer M. Groh, Valeria C. Caruso, and Marc A. Sommer
- Subjects
0301 basic medicine, Supplementary eye field, Superior Colliculi, genetic structures, Physiology, media_common.quotation_subject, Biology, 050105 experimental psychology, 03 medical and health sciences, 0302 clinical medicine, Parietal Lobe, Cortex (anatomy), Saccades, medicine, Animals, Contrast (vision), 0501 psychology and cognitive sciences, Computer vision, media_common, Mathematics, Retina, business.industry, General Neuroscience, Superior colliculus, 05 social sciences, Electroencephalography, Frontal eye fields, Macaca mulatta, eye diseases, Electrophysiological Phenomena, Frontal Lobe, 030104 developmental biology, medicine.anatomical_structure, Receptive field, Fixation (visual), Visual Perception, Artificial intelligence, sense organs, business, Neuroscience, 030217 neurology & neurosurgery, Research Article, Reference frame
- Abstract
We accurately perceive the visual scene despite moving our eyes ~3 times per second, an ability that requires incorporation of eye position and retinal information. We assessed how this neural computation unfolds across three interconnected structures: frontal eye fields (FEF), intraparietal cortex (LIP/MIP), and the superior colliculus (SC). Single unit activity was assessed in head-restrained monkeys performing visually-guided saccades from different initial fixations. As previously shown, the receptive fields of most LIP/MIP neurons shifted to novel positions on the retina for each eye position, and these locations were not clearly related to each other in either eye- or head-centered coordinates (hybrid coordinates). In contrast, the receptive fields of most SC neurons were stable in eye-centered coordinates. In FEF, visual signals were intermediate between those patterns: around 60% were eye-centered, whereas the remainder showed changes in receptive field location, boundaries, or responsiveness that rendered the response patterns hybrid or occasionally head-centered. These results suggest that FEF may act as a transitional step in an evolution of coordinates between LIP/MIP and SC. The persistence across cortical areas of hybrid representations that do not provide unequivocal location labels in a consistent reference frame has implications for how these representations must be read out.
New & Noteworthy: How we perceive the world as stable using mobile retinas is poorly understood. We compared the stability of visual receptive fields across different fixation positions in three visuomotor regions. Irregular changes in receptive field position were ubiquitous in intraparietal cortex, evident but less common in the frontal eye fields, and negligible in the superior colliculus (SC), where receptive fields shifted reliably across fixations. Only the SC provides a stable labelled-line code for stimuli across saccades.
- Published
- 2017
- Full Text
- View/download PDF
36. Subjective experience of sensation in anorexia nervosa
- Author
-
Jennifer M. Groh, Cynthia M. Bulik, Jennifer E. Wildes, Nancy Zucker, Rhonda M. Merwin, and Ashley A. Moskovich
- Subjects
Adult, medicine.medical_specialty, Anorexia Nervosa, Adolescent, media_common.quotation_subject, Experimental and Cognitive Psychology, Article, Young Adult, Sensation, Body Image, medicine, Humans, Habituation, Young adult, Temperament, Psychiatry, Kinesthesis, media_common, Proprioception, Awareness, Middle Aged, medicine.disease, Psychiatry and Mental health, Clinical Psychology, Eating disorders, Anorexia nervosa (differential diagnoses), Case-Control Studies, Harm avoidance, Female, Self Report, Psychology, Clinical psychology
- Abstract
The nature of disturbance in body experience in anorexia nervosa (AN) remains poorly operationalized despite its prognostic significance. We examined the relationship of subjective reports of sensitivity to and behavioral avoidance of sensory experience (e.g., to touch, motion) to body image disturbance and temperament in adult women currently diagnosed with AN (n = 20), women with a prior history of AN who were weight restored (n = 15), and healthy controls with no eating disorder history (n = 24). Levels of sensitivity to sensation and attempts to avoid sensory experience were significantly higher in both clinical groups relative to healthy controls. Sensory sensitivity was associated with body image disturbance (r(56) = .51, p < .0001), indicating that body image disturbance increased with increased global sensitivity to sensation. Sensory sensitivity was also negatively and significantly correlated with lowest BMI (r(2) = -.32, p < .001), but not current BMI (r(2) = .03, p = .18), and to the temperament feature of harm avoidance in both clinical groups. We discuss how intervention strategies that address sensitization and habituation to somatic experience via conditioning exercises may provide a new manner in which to address body image disturbance in AN.
- Published
- 2013
- Full Text
- View/download PDF
37. Evidence for time division multiplexing: Single neurons may encode simultaneous stimuli by switching between activity patterns
- Author
-
Jennifer M. Groh, Valeria C. Caruso, Shawn M. Willett, Surya T. Tokdar, Rolando Estrada, Akinori F. Ebihara, Winrich A. Freiwald, Jeff T. Mohl, Chris Glynn, Jungah Lee, and Azeem Zaman
- Subjects
Inferior colliculus, 0303 health sciences, Visual perception, Interleaving, Local field potential, Biology, ENCODE, Inferotemporal cortex, 03 medical and health sciences, 0302 clinical medicine, Alternation (linguistics), Neuroscience, 030217 neurology & neurosurgery, 030304 developmental biology
- Abstract
How the brain preserves information about multiple simultaneous items is poorly understood. We report that single neurons can represent multiple different stimuli by interleaving different signals across time. We record single units in an auditory region, the inferior colliculus, while monkeys localize 1 or 2 simultaneous sounds. During dual-sound trials, we find that some neurons fluctuate between firing rates observed for each single sound, either on a whole-trial or on a sub-trial timescale. These fluctuations are correlated in pairs of neurons, can be predicted by the state of local field potentials prior to sound onset, and, in one monkey, can predict which sound will be reported first. We find corroborating evidence of fluctuating activity patterns in a separate data set involving responses of inferotemporal cortex neurons to multiple visual stimuli. Alternation between activity patterns corresponding to each of multiple items may therefore be a general strategy to enhance the brain processing capacity, potentially linking such disparate phenomena as variable neural firing, neural oscillations, and limits in attentional/memory capacity.
- Published
- 2017
- Full Text
- View/download PDF
38. Distribution of eye position information in the monkey inferior colliculus
- Author
-
David A. Bulkin and Jennifer M. Groh
- Subjects
Male, Sound localization, Inferior colliculus, Supplementary eye field, Auditory Pathways, Eye Movements, genetic structures, Physiology, Brain mapping, biology.animal, Animals, Primate, Sound Localization, Brain Mapping, Inferior Colliculi, biology, General Neuroscience, Eye movement, Articles, Haplorhini, eye diseases, Eye position, Female, sense organs, Neuroscience
- Abstract
The inferior colliculus (IC) is thought to have two main subdivisions, a central region that forms an important stop on the ascending auditory pathway and a surrounding shell region that may play a more modulatory role. In this study, we investigated whether eye position affects activity in both the central and shell regions. Accordingly, we mapped the location of eye position-sensitive neurons in six monkeys making spontaneous eye movements by sampling multiunit activity at regularly spaced intervals throughout the IC. We used a functional map based on auditory response patterns to estimate the anatomical location of recordings, in conjunction with structural MRI and histology. We found eye position-sensitive sites throughout the IC, including at 27% of sites in tonotopically organized recording penetrations (putatively the central nucleus). Recordings from surrounding tissue showed a larger proportion of sites indicating an influence of eye position (33–43%). When present, the magnitude of the change in activity due to eye position was often comparable to that seen for sound frequency. Our results indicate that the primary ascending auditory pathway is influenced by the position of the eyes. Because eye position is essential for visual-auditory integration, our findings suggest that computations underlying visual-auditory integration begin early in the ascending auditory pathway.
- Published
- 2012
- Full Text
- View/download PDF
39. The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing
- Author
-
David L. K. Murphy, Cole D. Jenson, Jennifer M. Groh, Kurtis G. Gruters, David W. Smith, and Christopher A. Shera
- Subjects
Male, Auditory Pathways, Tympanic Membrane, genetic structures, Acoustics and Ultrasonics, otoacoustic emissions, 02 engineering and technology, Audiology, 0302 clinical medicine, Hearing, 0202 electrical engineering, electronic engineering, information engineering, Multidisciplinary, 05 social sciences, Brain, Biological Sciences, saccade, reference frame, medicine.anatomical_structure, PNAS Plus, EMREO, Saccade, Female, 020201 artificial intelligence & image processing, Psychology, Eardrum, Adult, medicine.medical_specialty, Adolescent, Saccadic eye movement, 0206 medical engineering, Sensory system, Stimulus (physiology), 050105 experimental psychology, Young Adult, 03 medical and health sciences, Arts and Humanities (miscellaneous), otorhinolaryngologic diseases, Saccades, medicine, Animals, Humans, 0501 psychology and cognitive sciences, Ear canal, middle ear muscles, Eye movement, Macaca mulatta, 020601 biomedical engineering, eye diseases, Fixation (visual), sense organs, Binaural recording, Photic Stimulation, 030217 neurology & neurosurgery, Neuroscience
- Abstract
Significance: The peripheral hearing system contains several motor mechanisms that allow the brain to modify the auditory transduction process. Movements or tensioning of either the middle ear muscles or the outer hair cells modifies eardrum motion, producing sounds that can be detected by a microphone placed in the ear canal (e.g., as otoacoustic emissions). Here, we report a form of eardrum motion produced by the brain via these systems: oscillations synchronized with and covarying with the direction and amplitude of saccades. These observations suggest that a vision-related process modulates the first stage of hearing. In particular, these eye movement-related eardrum oscillations may help the brain connect sights and sounds despite changes in the spatial relationship between the eyes and the ears.

Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.
- Published
- 2018
- Full Text
- View/download PDF
40. A Rate Code for Sound Azimuth in Monkey Auditory Cortex: Implications for Human Neuroimaging Studies
- Author
-
Uri Werner-Reiss and Jennifer M. Groh
- Subjects
Sound localization, Sound, Acoustic stimulation, Auditory cortex, Neurons, Action potentials, Receptive field, Brain mapping, Functional laterality, Lateralization of brain function, Neuroimaging, Magnetic resonance imaging, Functional magnetic resonance imaging, Normal distribution, Analysis of variance, Time factors, Reaction time, Population, Animals, Macaca mulatta, Male, Female, General Neuroscience, Psychology, Neuroscience - Abstract
Is sound location represented in the auditory cortex of humans and monkeys? Human neuroimaging experiments have had only mixed success at demonstrating sound location sensitivity in primary auditory cortex. This is in apparent conflict with single-unit recording studies in monkeys and other animals, which have found stronger evidence for spatial sensitivity. Does this apparent discrepancy reflect a difference between humans and animals, or does it reflect differences in the sensitivity of the methods used for assessing the representation of sound location? The sensitivity of imaging methods such as functional magnetic resonance imaging depends on the following two key aspects of the underlying neuronal population: (1) what kind of spatial sensitivity individual neurons exhibit and (2) whether neurons with similar response preferences are clustered within the brain. To address this question, we conducted a single-unit recording study in monkeys. First, we investigated the nature of spatial sensitivity in individual auditory cortical neurons to determine whether they have receptive fields (place code) or monotonic (rate code) sensitivity to sound azimuth. Second, we tested how strongly the population of neurons favors contralateral locations. We report here that the majority of neurons show predominantly monotonic azimuthal sensitivity, forming a rate code for sound azimuth, but that at the population level the degree of contralaterality is modest. This suggests that the weakness of the evidence for spatial sensitivity in human neuroimaging studies of auditory cortex may be attributable to limited lateralization at the population level, despite what may be considerable spatial sensitivity in individual neurons.
- Published
- 2008
- Full Text
- View/download PDF
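The abstract above reasons that strong spatial tuning in single neurons can coexist with a nearly flat aggregate signal if opposing monotonic preferences are only modestly imbalanced. The toy population below makes that concrete; all numbers (the slopes, the baseline, and the 60/40 contralateral split) are assumptions chosen for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
azimuths = np.linspace(-90, 90, 7)  # sound azimuths (deg)

def monotonic_rate(az, slope, baseline=30.0):
    """A rate-coding neuron: firing varies monotonically with azimuth."""
    return np.clip(baseline + slope * az, 0.0, None)

# 1000 neurons with strong but opposing monotonic tuning; contralateral
# preferences outnumber ipsilateral ones only modestly (60% vs 40%).
slopes = rng.choice([+0.3, -0.3], size=1000, p=[0.6, 0.4])
population = np.array([monotonic_rate(azimuths, s) for s in slopes])

print("single neuron, rate vs azimuth:", population[0].round(1))
print("population sum vs azimuth:    ", population.sum(axis=0).round(-2))
```

A single neuron's rate spans roughly an order of magnitude across azimuth, while the summed response (a crude stand-in for a neuroimaging voxel) varies by only about twenty percent, mirroring the paper's argument for weak neuroimaging contrast despite strong single-neuron sensitivity.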
41. Seeing sounds: visual and auditory interactions in the brain
- Author
-
Jennifer M. Groh and David A. Bulkin
- Subjects
Auditory cortex, Visual cortex, Auditory pathways, Visual pathways, Modalities, Brain, Nerve net, Sensory maps and brain development, Sensory system, Sensory neuroscience, Perception, Auditory perception, Visual perception, Animals, Humans, General Neuroscience, Psychology, Cognitive psychology, Neuroscience - Abstract
Objects and events can often be detected by more than one sensory system. Interactions between sensory systems can offer numerous benefits for the accuracy and completeness of perception. Recent studies of visual-auditory interactions have highlighted the perceptual advantages of combining information from these two modalities and have suggested that predominantly unimodal brain regions play a role in multisensory processing.
- Published
- 2006
- Full Text
- View/download PDF
42. Representation of Eye Position in Primate Inferior Colliculus
- Author
-
Ryan R. Metzger, Jennifer M. Groh, and Kristin K. Porter
- Subjects
Inferior colliculus, Supplementary eye field, Eye movements, Eye position, Fixation (ocular), Visual perception, Visual fields, Sound localization, Acoustic stimulation, Photic stimulation, Reaction time, Physiology, Representation (systemics), Animals, Primate, Macaca mulatta, Male, General Neuroscience, Psychology, Neuroscience - Abstract
We studied the representation of eye-position information in the primate inferior colliculus (IC). Monkeys fixated visual stimuli at one of eight or nine locations along the horizontal meridian between −24 and 24° while sounds were presented from loudspeakers at locations within that same range. Approximately 40% of our sample of 153 neurons showed statistically significant sensitivity to eye position during either the presentation of an auditory stimulus or in the absence of sound (Bonferroni corrected P < 0.05). The representation for eye position was predominantly monotonic and favored contralateral eye positions. Eye-position sensitivity was more prevalent among neurons without sound-location sensitivity: about half of neurons that were insensitive to sound location were sensitive to eye position, whereas only about one-quarter of sound-location-sensitive neurons were also sensitive to eye position. Our findings suggest that sound location and eye position are encoded using independent but overlapping rate codes at the level of the IC. The use of a common format has computational advantages for integrating these two signals. The differential distribution of eye-position sensitivity and sound-location sensitivity suggests that this process has begun by the level of the IC but is not yet complete at this stage. We discuss how these signals might fit into Groh and Sparks' vector subtraction model for coordinate transformations.
- Published
- 2006
- Full Text
- View/download PDF
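The closing reference to Groh and Sparks' vector subtraction model can be stated in one line: a head-centered sound azimuth is converted to eye-centered coordinates by subtracting current eye position. The snippet below is a minimal rendering of that arithmetic only (the full model also specifies how rate-coded signals could implement the subtraction); the variable names are ours.

```python
def eye_centered_azimuth(sound_azimuth_head, eye_position_head):
    """Vector subtraction model, reduced to its core arithmetic:
    eye-centered location = head-centered location - eye position."""
    return sound_azimuth_head - eye_position_head

# A speaker fixed at +12 deg relative to the head lies 36 deg to the right
# of the line of sight when the monkey fixates 24 deg to the left:
assert eye_centered_azimuth(12.0, -24.0) == 36.0
```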
43. Long lasting attenuation by prior sounds in auditory cortex of awake primates
- Author
-
Kristin K. Porter, Uri Werner-Reiss, Jennifer M. Groh, and Abigail M. Underhill
- Subjects
Auditory cortex, Neurons, Action potentials, Acoustic stimulation, Auditory perception, Auditory threshold, Perception, Habituation, Wakefulness, Time factors, Reaction time, Animals, Haplorhini, General Neuroscience, Psychology, Neuroscience - Abstract
How the brain responds to sequences of sounds is a question of great relevance to a variety of auditory perceptual phenomena. We investigated how long the responses of neurons in the primary auditory cortex of awake monkeys are influenced by the previous sound. We found that responses to the second sound of a two-sound sequence were generally attenuated compared to the response that sound evoked when it was presented first. The attenuation remained evident at the population level even out to inter-stimulus intervals (ISIs) of 5 s, although it was of modest size for ISIs >2 s. Behavioral context (performance versus non-performance of a visual fixation task during sound presentation) did not influence the results. The long time course of the first sound’s influence suggests that, under natural conditions, neural responses in auditory cortex are rarely governed solely by the current sound.
- Published
- 2005
- Full Text
- View/download PDF
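One convenient way to summarize the kind of time course reported above is an exponential recovery of the second-sound response with inter-stimulus interval. The parameters below (attenuation depth 0.4, time constant 1.5 s) are invented for illustration and are not fits to the paper's data; they merely reproduce the qualitative pattern of attenuation that remains evident at 5 s but becomes modest beyond 2 s.

```python
import numpy as np

def second_sound_fraction(isi_s, depth=0.4, tau_s=1.5):
    """Illustrative recovery curve: fraction of the first-sound response
    evoked by the second sound at a given inter-stimulus interval (s).
    depth and tau_s are hypothetical, not fitted to the reported data."""
    return 1.0 - depth * np.exp(-isi_s / tau_s)

for isi in (0.5, 1.0, 2.0, 5.0):
    print(f"ISI = {isi:3.1f} s -> {second_sound_fraction(isi):.2f} of the first-sound response")
```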
44. Auditory Saccades From Different Eye Positions in the Monkey: Implications for Coordinate Transformations
- Author
-
Yale E. Cohen, Abigail M. Underhill, Ryan R. Metzger, O'Dhaniel A. Mullette-Gillman, and Jennifer M. Groh
- Subjects
Superior colliculus, Saccades, Eye movements, Sound localization, Acoustic stimulation, Photic stimulation, Touch, Reward, Physiology, Regression analysis, Animals, Macaca mulatta, General Neuroscience, Psychology - Abstract
Auditory spatial information arises in a head-centered coordinate frame, whereas the saccade command signals generated by the superior colliculus (SC) are thought to specify target locations in an eye-centered frame. However, auditory activity in the SC appears to be neither head- nor eye-centered but in a reference frame that is intermediate between both of these reference frames. This neurophysiological finding suggests that auditory saccades might not fully compensate for changes in initial eye position. Here, we investigated whether the accuracy of saccades to sounds is affected by initial eye position in rhesus monkeys. We found that, on average, a 12° horizontal shift in initial eye position produced only a 0.6 to 1.6° horizontal shift in the endpoints of auditory saccades made to targets at a range of locations along the horizontal meridian. This shift was similar in size to the modest influence of eye position on visual saccades. This virtually complete compensation for initial eye position implies that auditory activity in the SC is read out in a manner that is appropriate for generating accurate saccades to sounds.
- Published
- 2004
- Full Text
- View/download PDF
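The degree of compensation in the abstract above can be made explicit with a two-line calculation, compensation = 1 - (endpoint shift / eye-position shift), using the numbers the paper reports.

```python
eye_shift_deg = 12.0               # imposed change in initial eye position
endpoint_shifts_deg = (0.6, 1.6)   # reported range of auditory-saccade endpoint shifts

for shift in endpoint_shifts_deg:
    compensation = 1.0 - shift / eye_shift_deg
    print(f"endpoint shift {shift:.1f} deg -> {100 * compensation:.0f}% compensation")
# roughly 87-95% compensation, i.e., virtually complete
```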
45. Eye Position Affects Activity in Primary Auditory Cortex of Primates
- Author
-
Kristin A. Kelly, Abigail M. Underhill, Amanda S. Trause, Jennifer M. Groh, and Uri Werner-Reiss
- Subjects
Primates, Animals, Female, Auditory cortex, Auditory area, Auditory pathways, Visual pathways, Visual system, Visual perception, Action potentials, Sensory system, Fixation (ocular), Sound localization, Acoustic stimulation, Head movements, Auditory imagery, Multisensory integration, Reference frame, Biology, General Biochemistry, Genetics and Molecular Biology, General Agricultural and Biological Sciences, Neuroscience - Abstract
Background: Neurons in primary auditory cortex are known to be sensitive to the locations of sounds in space, but the reference frame for this spatial sensitivity has not been investigated. Conventional wisdom holds that the auditory and visual pathways employ different reference frames, with the auditory pathway using a head-centered reference frame and the visual pathway using an eye-centered reference frame. Reconciling these discrepant reference frames is therefore a critical component of multisensory integration.

Results: We tested the reference frame of neurons in the auditory cortex of primates trained to fixate visual stimuli at different orbital positions. We found that eye position altered the activity of about one third of the neurons in this region (35 of 113, or 31%). Eye position affected not only the responses to sounds (26 of 113, or 23%), but also the spontaneous activity (14 of 113, or 12%). Such effects were also evident when monkeys moved their eyes freely in the dark. Eye position and sound location interacted to produce a representation for auditory space that was neither head- nor eye-centered in reference frame.

Conclusions: Taken together with emerging results in both visual and other auditory areas, these findings suggest that neurons whose responses reflect complex interactions between stimulus position and eye position set the stage for the eventual convergence of auditory and visual information.
- Published
- 2003
- Full Text
- View/download PDF
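A standard way to diagnose the reference frame probed in this study is to collect tuning curves at several fixation positions and ask in which coordinates they superimpose: head-centered (plot against speaker azimuth) or eye-centered (plot against speaker azimuth minus eye position). The toy neuron below sketches that logic under assumed parameters; it is not the paper's analysis, and an intermediate frame of the kind reported would leave residual spread in both plots.

```python
import numpy as np

rng = np.random.default_rng(2)
eye_positions = [-12.0, 0.0, 12.0]                    # fixation positions (deg)
speakers = np.array([-24.0, -12.0, 0.0, 12.0, 24.0])  # azimuths re: head (deg)

def response(sound_head, eye, true_frame="eye"):
    """Toy neuron: monotonic rate code in its true frame, plus noise."""
    x = sound_head - eye if true_frame == "eye" else sound_head
    return 20.0 + 0.5 * x + rng.normal(0.0, 0.3)

def mean_spread(plot_frame):
    """Average spread of responses across fixations at matching abscissae."""
    pts = [(s if plot_frame == "head" else s - e, response(s, e))
           for e in eye_positions for s in speakers]
    xs = np.array([x for x, _ in pts])
    ys = np.array([y for _, y in pts])
    return np.mean([ys[xs == v].std() for v in np.unique(xs) if (xs == v).sum() > 1])

for frame in ("head", "eye"):
    print(f"plotted {frame}-centered: spread = {mean_spread(frame):.2f}")
# tuning curves superimpose (small spread) only in the neuron's true frame
```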
46. 1. Thinking about Space
- Author
-
Jennifer M. Groh
- Published
- 2014
- Full Text
- View/download PDF
47. 10. Thinking about Thinking
- Author
-
Jennifer M. Groh
- Published
- 2014
- Full Text
- View/download PDF
48. 3. Sensing Our Own Shape
- Author
-
Jennifer M. Groh
- Published
- 2014
- Full Text
- View/download PDF
49. 6. Moving with Maps and Meters
- Author
-
Jennifer M. Groh
- Published
- 2014
- Full Text
- View/download PDF
50. 9. Space and Memory
- Author
-
Jennifer M. Groh
- Published
- 2014
- Full Text
- View/download PDF