Eye to Action | Tracing Visual Computations from the Retina to Behavior

Summary
Every second, millions of bits of information enter our eyes. How does the brain identify task-relevant information in this gigantic input stream? The visual system tackles this key challenge by extensively processing the sensory input for behavioral relevance. Already in the retina, the front end of the visual system, neuronal circuits extract multiple features from the environment and form up to 40 channels to the brain. However, one of the basic principles underlying vision, namely how the brain processes these multiple channels from the eye, remains fundamentally unclear.
In this proposal, I will focus on the superior colliculus, an evolutionarily conserved retino-recipient brain area critically involved in visuomotor transformations, and solve the following problem: How do neuronal circuits in the mouse superior colliculus integrate retinal signals to drive behavior? I will implement an interdisciplinary approach that combines population imaging of neuronal activity in behaving mice with optogenetic manipulations, deep neural network modeling, and quantitative behavioral analysis. By modeling a defined brain area all the way from its retinal input to its role in behavior in an ecological setting, I will mechanistically dissect the behavioral relevance of retinal circuits.
My work will uncover general principles of how diverse retinal channels are represented in downstream targets, identify elemental convergence rules of feedforward retinal signals in postsynaptic neurons, and causally link retinal function to distinct behaviors. If successful, my proposal will reveal fundamental neuronal and computational mechanisms used by the visual system to convert a complex visual input into action. My approach can be adapted to other sensory modalities, guiding the design of innovative experiments and analyses. The identified mechanisms of efficient information processing will also contribute to the development of robust neuro-inspired artificial intelligence.
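To illustrate the deep neural network modeling component mentioned above, the sketch below shows one common way to build a stimulus-to-response encoding model in PyTorch: a shared convolutional core that extracts visual features, followed by a per-neuron readout that predicts firing rates of recorded superior colliculus neurons. This is a minimal sketch under assumed choices; the class name SCEncodingModel, the layer sizes, and the parameter n_neurons are illustrative assumptions, not the project's actual model.

# Minimal sketch (illustrative only): a convolutional encoding model mapping
# visual stimuli to predicted responses of superior colliculus neurons.
# Architecture, layer sizes, and names are assumptions, not the project's model.
import torch
import torch.nn as nn

class SCEncodingModel(nn.Module):
    def __init__(self, n_neurons: int):
        super().__init__()
        # Shared convolutional core: extracts visual features from stimulus frames
        self.core = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=9, padding=4),
            nn.ELU(),
            nn.Conv2d(16, 32, kernel_size=7, padding=3),
            nn.ELU(),
        )
        # Per-neuron readout: pools core features and maps them to firing rates
        self.readout = nn.Sequential(
            nn.AdaptiveAvgPool2d(8),
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, n_neurons),
            nn.Softplus(),  # keeps predicted rates non-negative
        )

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, 1, height, width) grayscale stimulus frames
        return self.readout(self.core(frames))

# Example: predict responses of 100 hypothetical neurons to a batch of stimuli
model = SCEncodingModel(n_neurons=100)
stimuli = torch.randn(4, 1, 64, 64)   # placeholder stimulus frames
predicted_rates = model(stimuli)      # shape: (4, 100)

In such models, the readout weights of each unit can be inspected or compared across neurons, which is one way encoding models are typically used to relate input channels to downstream responses.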
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101117156
Start date: 01-01-2025
End date: 31-12-2029
Total budget: 1 871 465,00 Euro; Public funding: 1 871 465,00 Euro
Cordis data

Status: SIGNED
Call topic: ERC-2023-STG
Update Date: 12-03-2024
Geographical location(s)