IRE | Intelligent Robotic Endoscopes for Improved Healthcare Services

Summary
In Intelligent Robotic Endoscopes (IRE) for Improved Healthcare Services, we envision creating intelligent robotic solutions that extend current endoscope technology with robotic control based on learning from human operator data collected in current practice, coupled with novel biomechanical modelling techniques, sensory feedback, and a soft-robotics phantom for training.

The challenge with colonoscopy is that the success rate of cancer detection depends on the skill of the clinician who operates the endoscope. From a health and societal perspective, the number of colonoscopies is bound to increase, as they are the only way to screen patients for early cancer detection, and many European countries run national screening programmes. This is a large and growing market in need of improved technology.

IRE enables a new generation of intelligent robots that, through data, simulation, and learning, can interact with the interior of a living human while communicating with a human operator. The wide variation in human anatomy and the dynamic effects of human physiology make endoscope navigation a complicated task. The risks of entanglement, haemorrhage, and perforation create a critical and difficult environment for autonomous navigation, in which even trained human operators face challenges. We exploit one of the largest datasets of real-life colonoscopies, comprising more than 2,000 procedures, to learn safe navigation, combined with simulated training on a population of biomechanical models of the abdominal region.
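As a minimal sketch of what learning safe navigation from recorded operator data could look like, the example below trains a simple behaviour-cloning policy on placeholder state/action tensors. The dataset fields, dimensions, and network architecture are assumptions made for illustration only and do not describe the project's actual data format or learning pipeline.

```python
# Hypothetical sketch: behaviour cloning of endoscope steering from
# recorded operator data. Dimensions and fields are illustrative only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

STATE_DIM = 64    # e.g. encoded endoscope pose + image features (assumed)
ACTION_DIM = 4    # e.g. tip bend (2 axes), insertion, rotation (assumed)

# Simple feed-forward policy mapping the observed state to a steering action.
policy = nn.Sequential(
    nn.Linear(STATE_DIM, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, ACTION_DIM),
)

# Placeholder tensors standing in for states and expert actions extracted
# from recorded procedures; real data would come from the clinical dataset.
states = torch.randn(10_000, STATE_DIM)
expert_actions = torch.randn(10_000, ACTION_DIM)
loader = DataLoader(TensorDataset(states, expert_actions),
                    batch_size=256, shuffle=True)

optimiser = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    for s, a in loader:
        optimiser.zero_grad()
        loss = loss_fn(policy(s), a)   # imitate the operator's action
        loss.backward()
        optimiser.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```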

IRE accelerates the design and configuration of the robotic endoscope using digital twins and simulation, and the careful inclusion of clinicians will speed up integration. IRE will raise the level of autonomy by building upon simulation, imaging, and learning to improve interpretation and understanding of complex real-world environments, anticipating the effects of human motion and adapting and replanning to avoid entanglement.
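The sketch below illustrates, under assumed interfaces, how such anticipation and replanning might be organised as a control loop: a risk score from a stand-in for the digital-twin simulation triggers a more conservative re-plan. All function names, fields, and thresholds are hypothetical and not the project's actual software interfaces.

```python
# Hypothetical control-loop sketch: re-plan with safer motions whenever the
# predicted entanglement risk exceeds a threshold. Purely illustrative.
import random

RISK_THRESHOLD = 0.3          # assumed risk level that triggers a re-plan

def predict_entanglement(state):
    """Stand-in for a digital-twin / biomechanical simulation that
    anticipates loop formation from the current endoscope state."""
    return random.random()     # placeholder risk score in [0, 1]

def plan_path(state, conservative):
    """Stand-in planner: smaller advance steps when re-planning conservatively."""
    step = 0.02 if conservative else 0.05
    return [("advance", step)]

state = {"progress": 0.0}      # mock state estimate (pose, imaging features, ...)
while state["progress"] < 1.0:
    risk = predict_entanglement(state)
    # Re-plan with smaller, safer motions whenever the predicted risk is high.
    plan = plan_path(state, conservative=risk > RISK_THRESHOLD)
    for command, magnitude in plan:
        # Commands would be sent to the robotic endoscope actuators here;
        # in this mock loop they simply advance the recorded progress.
        state["progress"] += magnitude
    print(f"progress {state['progress']:.2f}, risk {risk:.2f}")
```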
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101135082
Start date: 01-03-2024
End date: 29-02-2028
Total budget: 6 189 085,00 Euro; Public funding: 6 189 082,00 Euro
Cordis data

Status: SIGNED
Call topic: HORIZON-CL4-2023-DIGITAL-EMERGING-01-01
Update Date: 12-03-2024
Geographical location(s)