8 results for "Tyler S. Manning"
Search Results
2. Transformations of sensory information in the brain reflect a changing definition of optimality
- Author
Tyler S. Manning, Emma Alexander, Bruce G. Cumming, Gregory C. DeAngelis, Xin Huang, and Emily A. Cooper
- Subjects
Article - Abstract
Neurons throughout the brain modulate their firing rates lawfully in response to changes in sensory input. Theories of neural computation posit that these modulations reflect the outcome of a constrained optimization: neurons aim to efficiently and robustly represent sensory information under resource limitations. Our understanding of how this optimization varies across the brain, however, is still in its infancy. Here, we show that neural responses transform along the dorsal stream of the visual system in a manner consistent with a transition from optimizing for information preservation to optimizing for perceptual discrimination. Focusing on binocular disparity – the slight differences in how objects project to the two eyes – we re-analyze measurements characterizing the tuning curves of neurons in macaque monkey brain regions V1, V2, and MT, and compare these to measurements of the natural visual statistics of binocular disparity. The changes in tuning curve characteristics are computationally consistent with a shift in optimization goals from maximizing the information encoded about naturally occurring binocular disparities to maximizing the ability to support fine disparity discrimination. We find that a change towards tuning curves preferring larger disparities is a key driver of this shift. These results provide new insight into previously identified differences between disparity-selective regions of cortex and suggest these differences play an important role in supporting visually guided behavior. Our findings support a key re-framing of optimal coding in regions of the brain that contain sensory information, emphasizing the need to consider not just information preservation and neural resources, but also relevance to behavior.
Significance: A major role of the brain is to transform information from the sensory organs into signals that can be used to guide behavior. Neural activity is noisy and can consume large amounts of energy, so sensory neurons must optimize their information processing so as to limit energy consumption while maintaining key behaviorally relevant information. In this report, we re-examine classically defined brain areas in the visual processing hierarchy and ask whether neurons in these areas vary lawfully in how they represent sensory information. Our results suggest that neurons in these brain areas shift from being an optimal conduit of sensory information to optimally supporting perceptual discrimination during natural tasks.
- Published
- 2023
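The optimization contrast described in the abstract above can be illustrated with a toy computation. This is a minimal sketch, not the paper's analysis: the tuning-curve shapes, neuron counts, and the two hypothetical populations below are illustrative assumptions. Under Poisson spiking, a neuron's Fisher information about disparity s is f'(s)²/f(s), so a population whose preferred disparities cluster near zero (where natural disparities are most common) concentrates information there, while a population preferring larger disparities carries more information at large disparities.

```python
import numpy as np

def tuning_curve(s, pref, width=0.3, peak=20.0):
    """Mean firing rate (spikes/s) of a Gaussian disparity-tuned neuron."""
    return peak * np.exp(-0.5 * ((s - pref) / width) ** 2)

def population_fisher(s, prefs, width=0.3, peak=20.0):
    """Total Fisher information about disparity s, summed over the population.

    For Poisson noise, each neuron contributes f'(s)^2 / f(s).
    """
    f = tuning_curve(s[:, None], prefs[None, :], width, peak)
    df = f * (prefs[None, :] - s[:, None]) / width ** 2  # d f / d s
    return np.sum(df ** 2 / np.maximum(f, 1e-12), axis=1)

s = np.linspace(-2.0, 2.0, 401)  # disparity axis (deg), step 0.01

# Hypothetical "information-preserving" population: preferences packed near 0 deg
prefs_near = np.linspace(-0.75, 0.75, 25)
# Hypothetical "discrimination-oriented" population: preferences spread to larger disparities
prefs_far = np.linspace(-2.0, 2.0, 25)

J_near = population_fisher(s, prefs_near)
J_far = population_fisher(s, prefs_far)
```

With these toy populations, `J_near` exceeds `J_far` at zero disparity, while `J_far` dominates at large disparities, mirroring the trade-off the abstract describes.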
3. Perceptual Adaptation to Continuous Versus Intermittent Exposure to Spatial Distortions
- Author
Iona R. McLean, Tyler S. Manning, and Emily A. Cooper
- Subjects
Adult, Male, Depth Perception, Vision Disparity, Neurosciences, spectacles, adaptation, Biological Sciences, distortions, Ophthalmology & Optometry, Adaptation, Physiological, Medical and Health Sciences, Young Adult, Eyeglasses, Clinical Research, Humans, binocular disparity, Cues, Eye Disease and Disorders of Vision - Abstract
Purpose: To examine perceptual adaptation when people wear spectacles that produce unequal retinal image magnification. Methods: Two groups of 15 participants (10 male; mean age 25.6 ± 4.9 years) wore spectacles with a 3.8% horizontal magnifier over one eye. The continuous-wear group wore the spectacles for 5 hours straight. The intermittent-wear group wore them for five 1-hour intervals. To measure slant and shape distortions produced by the spectacles, participants adjusted visual stimuli until they appeared frontoparallel or equiangular, respectively. Adaptation was quantified as the difference in responses at the beginning and end of wearing the spectacles. Aftereffects were quantified as the difference before and after removing the spectacles. We hypothesized that intermittent wear may lead to visual cue reweighting, so we fit a cue combination model to the data and examined changes in the weights given to perspective and binocular disparity slant cues. Results: Both groups experienced significant shape adaptation and aftereffects. The continuous-wear group underwent significant slant adaptation and the intermittent group did not, but there was no significant difference between the groups, suggesting that the difference in adaptation was negligible. There was no evidence for cue reweighting in the intermittent-wear group, but unexpectedly, the weight given to binocular disparity cues for slant increased significantly in the continuous-wear group. Conclusions: We did not find strong evidence that adaptation to spatial distortions differed between the two groups. However, there may be differences in the cue-weighting strategies employed when spectacles are worn intermittently or continuously.
- Published
- 2022
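The cue combination model mentioned in the Methods above is, in its standard textbook form, a reliability-weighted average: each cue's weight is proportional to its inverse variance. The sketch below shows only that standard form; the function name and example numbers are illustrative, and the paper's fitted model and weights are not reproduced here.

```python
import numpy as np

def combine_cues(s_disparity, sigma_disparity, s_perspective, sigma_perspective):
    """Optimal linear combination of two slant-cue estimates (deg).

    Weights are proportional to reliability = 1 / variance; the combined
    estimate is more reliable than either cue alone.
    """
    r_d = 1.0 / sigma_disparity ** 2      # reliability of the disparity cue
    r_p = 1.0 / sigma_perspective ** 2    # reliability of the perspective cue
    w_d = r_d / (r_d + r_p)               # weight on the disparity cue
    s_hat = w_d * s_disparity + (1 - w_d) * s_perspective
    sigma_hat = np.sqrt(1.0 / (r_d + r_p))
    return s_hat, w_d, sigma_hat

# Illustrative numbers: disparity signals 10 deg of slant (sigma = 2),
# perspective signals 4 deg (sigma = 4); disparity gets weight 0.8.
s_hat, w_d, sigma_hat = combine_cues(10.0, 2.0, 4.0, 4.0)
```

Reweighting, in this framework, means the measured w_d departs from the value predicted by the cues' reliabilities, which is what the study tested for.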
4. A general framework for inferring Bayesian ideal observer models from psychophysical data
- Author
Tyler S. Manning, Benjamin N. Naecker, Iona R. McLean, Bas Rokers, Jonathan W. Pillow, and Emily A. Cooper
- Subjects
General Neuroscience ,General Medicine - Abstract
A central question in neuroscience is how sensory inputs are transformed into percepts. At this point, it is clear that this process is strongly influenced by prior knowledge of the sensory environment. Bayesian ideal observer models provide a useful link between data and theory that can help researchers evaluate how prior knowledge is represented and integrated with incoming sensory information. However, the statistical prior employed by a Bayesian observer cannot be measured directly, and must instead be inferred from behavioral measurements. Here, we review the general problem of inferring priors from psychophysical data, and the simple solution that follows from assuming a prior that is a Gaussian probability distribution. As our understanding of sensory processing advances, however, there is an increasing need for methods to flexibly recover the shape of Bayesian priors that are not well approximated by elementary functions. To address this issue, we describe a novel approach that applies to arbitrary prior shapes, which we parameterize using mixtures of Gaussian distributions. After incorporating a simple approximation, this method produces an analytical solution for psychophysical quantities that can be numerically optimized to recover the shapes of Bayesian priors. This approach offers advantages in flexibility, while still providing an analytical framework for many scenarios. We provide a MATLAB toolbox implementing key computations described herein.
- Published
- 2022
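For the Gaussian special case reviewed in the abstract above, the posterior has a simple closed form, which a short sketch can make concrete (the example numbers are illustrative assumptions; the paper's MATLAB toolbox and its API are not shown here). With a Gaussian prior N(mu0, sigma0²) and a Gaussian likelihood centered on the measurement x with variance sigma², the posterior mean is a precision-weighted average that "shrinks" the measurement toward the prior mean:

```python
import numpy as np

def posterior_gaussian(x, sigma, mu0, sigma0):
    """Posterior mean and s.d. for a Gaussian prior and Gaussian likelihood.

    The posterior is itself Gaussian; its mean weights the measurement x
    and the prior mean mu0 by their precisions (inverse variances).
    """
    precision = 1.0 / sigma ** 2 + 1.0 / sigma0 ** 2
    w = (1.0 / sigma ** 2) / precision     # weight on the measurement
    mu_post = w * x + (1 - w) * mu0
    sigma_post = np.sqrt(1.0 / precision)
    return mu_post, sigma_post

# Illustrative: a prior centered at 0 (e.g. a slow-speed prior) biases a
# noisy measurement of 5 toward 0.
mu_post, sigma_post = posterior_gaussian(x=5.0, sigma=1.0, mu0=0.0, sigma0=2.0)
```

The mixture-of-Gaussians approach the abstract describes generalizes this: because each mixture component yields a Gaussian posterior in closed form, arbitrary prior shapes can still be handled analytically.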
5. Humans make non-ideal inferences about world motion
- Author
Tyler S. Manning, Jonathan W. Pillow, Bas Rokers, and Emily A. Cooper
- Subjects
Ophthalmology, Sensory Systems - Published
- 2022
6. Estimating perceptual priors with finite experiments
- Author
Benjamin Naecker, Jonathan W. Pillow, Tyler S. Manning, Emily A. Cooper, Iona McLean, and Bas Rokers
- Subjects
Ophthalmology, Computer science, Perception, Prior probability, Pattern recognition, Artificial intelligence, Sensory Systems - Published
- 2021
7. Retinal stabilization reveals limited influence of extraretinal signals on heading tuning in the medial superior temporal area
- Author
Tyler S. Manning and Kenneth H. Britten
- Subjects
active vision, oculomotor, corollary discharge, reafference, retinal flow, 3D vision, smooth pursuit, eye movements, motion perception, sensory system, Fixation, Ocular, Pursuit, Smooth, Optic Flow, Retina, Visual Pathways, Visual cortex, Medial superior temporal area, Temporal Lobe, Macaca mulatta, Animals, Female, Cues, Orientation, Algorithms, Electrophysiological Phenomena, Photic Stimulation, Psychomotor Performance, Neurosciences, Neurology & Neurosurgery, Psychology and Cognitive Sciences, Medical and Health Sciences, Eye Disease and Disorders of Vision - Abstract
Heading perception in primates depends heavily on visual optic-flow cues. Yet during self-motion, heading percepts remain stable even though smooth-pursuit eye movements often distort optic flow. Electrophysiological studies have identified visual areas in monkey cortex, including the dorsal medial superior temporal area (MSTd), that signal the true heading direction during pursuit. According to theoretical work, self-motion can be represented accurately by compensating for these distortions in two ways: via retinal mechanisms or via extraretinal efference-copy signals, which predict the sensory consequences of movement. Psychophysical evidence strongly supports the efference-copy hypothesis, but physiological evidence remains inconclusive. Here we measured heading tuning in MSTd using a novel stimulus paradigm in which we stabilize the optic-flow stimulus on the retina during pursuit. This approach isolates the effects of extraretinal signals on neuronal heading preferences, since these signals remain active while the retinal stimulus is prevented from changing. Our results demonstrate a significant but small influence of extraretinal signals on the preferred heading directions of MSTd neurons. Under our stimulus conditions, which are rich in retinal cues, we find that retinal mechanisms dominate physiological corrections for pursuit eye movements, suggesting that extraretinal cues, such as predictive efference-copy mechanisms, have a limited role under naturalistic conditions.
Significance Statement: Sensory systems discount stimulation caused by the animal's own behavior. For example, eye movements cause irrelevant retinal signals that could interfere with motion perception. The visual system compensates for such self-generated motion, but how this happens is unclear. Two theoretical possibilities are a purely visual calculation or one using an internal signal of eye movements to compensate for their effects. Such a signal can be isolated by experimentally stabilizing the image on a moving retina, but this approach had not previously been adopted to study motion physiology. Using this method, we find that eye-movement signals have little influence on neural activity in visual cortex, while the feed-forward visual calculation has a strong effect and is likely important under real-world conditions.
- Published
- 2019
8. Motion Processing in Primates
- Author
Tyler S. Manning and Kenneth H. Britten
- Subjects
Biology, Computer science, Primate, Motion vision, Motion processing, Computer vision, Artificial intelligence - Abstract
The ability to see motion is critical to survival in a dynamic world. Decades of physiological research have established that motion perception is a distinct sub-modality of vision supported by a network of specialized structures in the nervous system. These structures are arranged hierarchically according to the spatial scale of the calculations they perform, with more local operations preceding those that are more global. The different operations serve distinct purposes, from the interception of small moving objects to the calculation of self-motion from image motion spanning the entire visual field. Each cortical area in the hierarchy has an independent representation of visual motion. These representations, together with computational accounts of their roles, provide clues to the functions of each area. Comparisons between neural activity in these areas and psychophysical performance can identify which representations are sufficient to support motion perception. Experimental manipulation of this activity can also define which areas are necessary for motion-dependent behaviors like self-motion guidance.
- Published
- 2017
Discovery Service for Jio Institute Digital Library