Vision-In-Flight | Neuromechanics of Insect Vision during Aerial Interactions with Applications in Visually Guided Systems

Summary
This project investigates how biological vision operates under the fastest and most challenging motion condition: flight. Specifically, we look beyond gaze stabilization and focus on directed gaze control, such as object tracking. Flying insects are ideal models for studying vision in flight because of their relatively simple nervous systems and the fixed optics of their compound eyes. Insect vision research has elucidated fundamental visual circuits through psychophysics, electrophysiology, computational modelling, and connectomics. However, we still know little about how insects use vision in free flight and which visual signals influence motor control during aerial interactions. This study aims to reveal how flying insects direct their gaze in flight to extract target information for guidance and to facilitate the execution of complex flight manoeuvres. To achieve this objective, we will advance three emerging techniques: 1) high-precision, insect-scale motion capture; 2) ultralight wireless neural telemetry; 3) virtual reality for freely flying insects. I was involved in developing the first two methods, and both still require significant development to suit this project. The third grew out of a successful ERC project that enabled virtual reality experiments with freely behaving animals, and it too requires further breakthroughs to accommodate this project. By advancing these techniques together, we can fully control the visual input of a freely flying insect while simultaneously recording the relevant visual signals. Although modern image sensors and image processing can sometimes surpass biological vision, today's machine vision systems still cannot exploit the tactical benefits of directed gaze control. Indeed, learning how to look is one of the best lessons a visually guided system can take from biology. This research informs the control of autonomous systems such as self-driving cars, unmanned aerial taxis, and robotic couriers, technologies that will revolutionize the coming era.
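To illustrate the closed-loop idea behind the third technique, the minimal sketch below shows how a head pose tracked by motion capture could be used to place a virtual target in a flying insect's field of view. All names and interfaces (target_direction_in_head_frame, head_rotation, the example values) are illustrative assumptions, not the project's actual software or hardware.

# Illustrative sketch only: one geometric step of a closed-loop VR arena for a
# freely flying insect, assuming the motion-capture system reports head pose.
import numpy as np


def target_direction_in_head_frame(head_position, head_rotation, target_position):
    """Return azimuth/elevation (radians) of a virtual target as seen from the
    insect's tracked head pose.

    head_position : (3,) array, head position in arena coordinates (metres)
    head_rotation : (3, 3) array, rotation from head frame to arena frame
    target_position : (3,) array, virtual target position in arena coordinates
    """
    offset_world = np.asarray(target_position) - np.asarray(head_position)
    offset_head = np.asarray(head_rotation).T @ offset_world  # world -> head frame
    azimuth = np.arctan2(offset_head[1], offset_head[0])
    elevation = np.arctan2(offset_head[2], np.linalg.norm(offset_head[:2]))
    return azimuth, elevation


if __name__ == "__main__":
    # Hypothetical example: insect at the arena centre, head yawed 30 degrees left.
    yaw = np.radians(30.0)
    head_rotation = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                              [np.sin(yaw),  np.cos(yaw), 0.0],
                              [0.0,          0.0,         1.0]])
    head_position = np.zeros(3)
    virtual_target = np.array([0.5, 0.0, 0.1])  # target 0.5 m ahead in the arena

    az, el = target_direction_in_head_frame(head_position, head_rotation, virtual_target)
    print(f"target appears at azimuth {np.degrees(az):.1f} deg, "
          f"elevation {np.degrees(el):.1f} deg in the head frame")

In a real arena this computation would sit inside a low-latency render loop, so that the displayed scene is updated with every new pose estimate while the neural telemetry records the corresponding visual responses.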
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/804315
Start date: 01-11-2018
End date: 31-10-2025
Total budget: EUR 1 499 968 (public funding: EUR 1 499 968)
Cordis data

Status: SIGNED
Call topic: ERC-2018-STG
Update Date: 27-04-2024
Structured mapping
Horizon 2020
H2020-EU.1. EXCELLENT SCIENCE
H2020-EU.1.1. EXCELLENT SCIENCE - European Research Council (ERC)
ERC-2018
ERC-2018-STG