DynamicVision | Closing the loop in dynamic vision – from single photons to behaviour in extreme light environments

Summary
Driving along a tree-lined avenue, we have all experienced how the rapid succession of light and shade disrupts our vision. Such conditions push even synthetic sensors to their limits, but many animals master these challenges on a daily (and nightly) basis. Indeed, a high dynamic range of sensory information is a hallmark of natural environments. Explaining how sensory information is processed within the limited bandwidth of neural circuits is key to a central goal of neuroscience: understanding the neural control of behaviour in natural contexts. This question extends beyond the processing of dynamic input by nervous systems to the closed-loop nature of animal behaviour itself: as senses guide an animal’s movements, the movements in turn shape the sensory input. It necessitates a paradigm shift towards a holistic approach that considers dynamic inputs, neural processing and behavioural strategies in concert. I propose visually guided flight in nocturnal moths as uniquely suited for approaching this challenge. Probing the system in dim light, when vision operates at its limits, offers straightforward performance readouts for all stages of the control loop. To this end, we will design a novel imaging system to quantify the dynamics of natural visual environments from a flying insect’s perspective. We will then measure how dynamic tuning adjusts peripheral neurons to compensate for these spatiotemporal light variations, and how these signals are integrated with movement predictions in motion neurons to guide flight behaviour. Using a one-of-a-kind facility for large-scale animal tracking, we will record the moths’ flight behaviour with unprecedented precision to reveal the strategies that optimise sensory acquisition in these challenging light conditions. Combining all stages, this project will provide a coherent framework for studying the neural basis of natural behaviour in dynamic light environments, using a unique, ecologically impactful model to close the loop from sensing to acting.
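
To make the notions of "high dynamic range" and rapid light fluctuations concrete, the minimal sketch below (not part of the project description; the sampling rate, luminance values and synthetic data are purely illustrative assumptions) shows one common way such quantities could be computed from a luminance time series: the log-ratio of the brightest to the dimmest sample, and windowed Michelson contrast as a measure of how quickly the light level swings between sun and shade.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic luminance trace (cd/m^2): alternating sunlit and shaded patches
# along an avenue, sampled at 500 Hz for 10 s (values chosen for illustration).
fs = 500
t = np.arange(0, 10, 1 / fs)
luminance = np.where(np.sin(2 * np.pi * 2 * t) > 0, 5000.0, 5.0)
luminance = luminance * rng.lognormal(mean=0.0, sigma=0.1, size=t.size)

# Overall dynamic range in log10 units ("orders of magnitude" of light level).
dynamic_range = np.log10(luminance.max() / luminance.min())

# Michelson contrast within 200-ms windows captures how strongly the light
# level fluctuates on the timescale relevant for a flying insect.
win = int(0.2 * fs)
trimmed = luminance[: luminance.size // win * win].reshape(-1, win)
michelson = (trimmed.max(1) - trimmed.min(1)) / (trimmed.max(1) + trimmed.min(1))

print(f"dynamic range: {dynamic_range:.1f} log units")
print(f"median 200 ms Michelson contrast: {np.median(michelson):.2f}")
```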
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101116305
Start date: 01-09-2024
End date: 31-08-2029
Total budget: €1 500 000.00 (public funding: €1 500 000.00)
Cordis data

Status

SIGNED

Call topic

ERC-2023-STG

Update Date

12-03-2024
Structured mapping
Horizon Europe
  HORIZON.1 Excellent Science
    HORIZON.1.1 European Research Council (ERC)
      HORIZON.1.1.0 Cross-cutting call topics
        ERC-2023-STG ERC STARTING GRANTS
      HORIZON.1.1.1 Frontier science
        ERC-2023-STG ERC STARTING GRANTS