Summary
Over the past two decades, we have witnessed impressive advancements in Robotics. Amongst the most disruptive developments was the demonstration of small Unmanned Aerial Vehicles (UAVs) equipped with onboard cameras conducting autonomous, vision-based flights without reliance on GPS, sparking booming interest in a plethora of use-cases, such as automated drone delivery, and infrastructure inspection and maintenance. This led to the emergence of new algorithms, advanced sensors, and miniaturized, powerful chips, opening exciting opportunities for automating single- as well as multi-UAV navigation. Current solutions, however, lack robustness and generality, struggling to perform outside very controlled settings, with onboard perception constituting the biggest impediment.
Having worked in this area for over a decade, we find it troubling that, despite dramatic progress, we still lack the technology to enable a UAV swarm to autonomously scan the seas for refugee dinghies or forest areas for wildfires, and to provide help in such dire situations. While, in principle, the core technology is the same across all use-cases, battling adverse conditions, such as wind, smoke, and degraded illumination, renders the latter use-cases extremely challenging, as they are time-critical and cannot be postponed until favorable conditions arise. Employing some of the currently most promising sensors, such as lidar, and advanced depth, thermal, event, and visual cameras, SkEyes proposes to address fundamental research questions to understand how to process and combine these heterogeneous sensing cues onboard a swarm of small UAVs. The goal is to achieve joint spatial understanding and scene awareness for effective autonomy in highly dynamic and realistic scenarios. Engaging such eyes in the sky, the focus is on robust, collaborative perception to enable intelligent UAV swarm navigation exhibiting adaptability in completing a mission in challenging conditions, at the push of a button.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101089328
Start date: 01-10-2023
End date: 30-09-2028
Total budget / Public funding: 2 346 219,00 Euro / 2 346 219,00 Euro
Cordis data
Status: SIGNED
Call topic: ERC-2022-COG
Update Date: 12-03-2024