Summary
Building on breakthrough research in the AI analysis of fluorescence and perfusion in cancer tissues, this project clinically validates the use of AI-driven imaging and decision support in real-time cancer surgery.
Cancer and healthy tissue have radically different local blood perfusion patterns. This perfusion can be captured using near-infrared video after systemic fluorophore (indocyanine green) injection. Analysis of the video can digitally identify regions of cancer by tracking perfusion over the initial seconds after dye administration and comparing the fluorescence signal in suspect areas with that in adjacent normal tissue within the same endolaparoscopic field of view. Application of AI methods (including computer vision and machine learning techniques) has enabled this differential classification to occur in real time, so that better, individualised surgical decisions can be taken during an operation.
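To make the approach concrete, the following is a minimal, illustrative Python sketch of this kind of differential perfusion analysis: extracting time-intensity curves from a suspect region and an adjacent normal-tissue reference region in the near-infrared video, deriving simple inflow features, and comparing them. All function names, feature choices and the threshold are hypothetical assumptions for illustration, not the project's actual algorithm.

import numpy as np

def roi_intensity_curve(frames: np.ndarray, roi: tuple) -> np.ndarray:
    """Mean fluorescence intensity per frame inside a rectangular ROI.

    frames: (T, H, W) array of near-infrared video frames after ICG injection.
    roi:    (y0, y1, x0, x1) bounds of the region of interest.
    """
    y0, y1, x0, x1 = roi
    return frames[:, y0:y1, x0:x1].mean(axis=(1, 2))

def perfusion_features(curve: np.ndarray, fps: float) -> dict:
    """Simple perfusion descriptors of a time-intensity curve:
    time-to-peak and maximum dye-inflow slope over the initial seconds."""
    t = np.arange(len(curve)) / fps
    slope = np.gradient(curve, t)
    return {"time_to_peak_s": float(t[int(np.argmax(curve))]),
            "max_inflow_slope": float(slope.max())}

def classify_region(frames, suspect_roi, reference_roi, fps=30.0,
                    slope_ratio_threshold=0.75):
    """Label the suspect ROI by comparing its inflow dynamics with an
    adjacent normal-tissue reference ROI in the same field of view.
    The threshold is illustrative, not a clinically validated value."""
    f_s = perfusion_features(roi_intensity_curve(frames, suspect_roi), fps)
    f_r = perfusion_features(roi_intensity_curve(frames, reference_roi), fps)
    ratio = f_s["max_inflow_slope"] / (f_r["max_inflow_slope"] + 1e-9)
    return "suspicious" if ratio < slope_ratio_threshold else "normal-like"

In the project itself, trained computer-vision and machine-learning models, rather than a fixed threshold as above, perform this differential classification in real time.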
In this project, we develop our existing AI research prototype into an operating-room-standard surgical tool and validate its performance, reliability, usability and acceptance in five leading cancer surgery centres across Europe (500 patients). The validation studies address (a) generalisability across clinics; (b) biopsy and tumour identification; and (c) optimised resection of large (>3 cm) rectal polyps, a key area of current surgical practice where the biggest clinical challenge is ensuring accurate patient selection for curative therapy.
Training and education, communication and dissemination will be delivered by IRCAD, Europe's leading surgical education organisation.
Legal, regulatory and liability research (co-led by the UCPH CeBIL Centre and PSU) and usability and acceptance research (led by the surgical professional organisation EAES) will identify and address all obstacles to widespread use of this technology in particular, and of real-time AI in the operating room in general. Draft clinical guidelines will be created for future EAES adoption.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101057321
Start date: 01-05-2022
End date: 30-04-2026
Total budget - Public funding: 5 978 718,75 Euro - 5 978 716,00 Euro
Cordis data
Status: SIGNED
Call topic: HORIZON-HLTH-2021-DISEASE-04-04
Update date: 09-02-2023