Summary
Exascale volumes of diverse data from distributed sources are continuously produced. Healthcare data stand out in the volume produced (more than 2,000 exabytes in 2020), heterogeneity (many media and acquisition methods), embedded knowledge (e.g. diagnostic reports) and commercial value. The supervised nature of deep learning models requires large labeled, annotated datasets, which prevents the models from extracting this knowledge and value. ExaMode addresses this by enabling easy and fast, weakly supervised knowledge discovery from exascale heterogeneous data provided by the partners, limiting human interaction. Its objectives include the development and release of extreme analytics methods and tools that are adopted in decision making by industry and hospitals. Deep learning naturally lends itself to building semantic representations of entities and relations in multimodal data. Knowledge discovery is performed via document-level semantic networks in text and the extraction of homogeneous features from heterogeneous images. The results are fused, aligned to medical ontologies, visualized and refined. The knowledge is then applied through a semantic middleware to compress, segment and classify images, and it is exploited in decision support and semantic knowledge management prototypes. ExaMode is relevant to ICT-12 in several respects: 1) Challenge: it extracts knowledge and value from heterogeneous, quickly growing data volumes. 2) Scope: the consortium develops and releases new methods and concepts for extreme-scale analytics that accelerate deep analysis, also via data compression, make precise predictions, support decision making and visualize multi-modal knowledge. 3) Impact: the multi-modal/media semantic middleware makes heterogeneous data management and analysis easier and faster, and it improves architectures for complex distributed systems with better tools that increase data throughput and access speed, as shown by tests in extreme analysis by industry and in hospitals.
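As a concrete illustration of the report-driven weak supervision mentioned above, the minimal Python sketch below derives image-level labels from a free-text diagnostic report by matching ontology concepts and filtering simple negations. The concept list, synonyms, negation cues and example report are hypothetical placeholders, not ExaMode's actual ontology, data or pipeline.

```python
# Minimal sketch of weak supervision from diagnostic reports: the report
# provides image-level labels without manual annotation. All concepts,
# synonyms and the example report are illustrative assumptions.
import re
from typing import Dict, List

# Hypothetical mapping from ontology concepts to surface-form synonyms.
CONCEPT_SYNONYMS: Dict[str, List[str]] = {
    "adenocarcinoma": ["adenocarcinoma"],
    "dysplasia": ["dysplasia", "dysplastic"],
    "inflammation": ["inflammation", "inflammatory infiltrate"],
}

NEGATION_CUES = ["no ", "without ", "negative for "]


def weak_labels(report: str) -> Dict[str, int]:
    """Derive image-level labels from a free-text diagnostic report.

    A concept is labeled 1 if any synonym occurs in the report and is not
    preceded by a simple negation cue; otherwise 0. A real system would use
    a proper concept recognizer and negation detector instead.
    """
    text = report.lower()
    labels: Dict[str, int] = {}
    for concept, synonyms in CONCEPT_SYNONYMS.items():
        found = 0
        for syn in synonyms:
            for match in re.finditer(re.escape(syn), text):
                # Look at a short window before the mention for negation cues.
                prefix = text[max(0, match.start() - 20):match.start()]
                if not any(cue in prefix for cue in NEGATION_CUES):
                    found = 1
        labels[concept] = found
    return labels


if __name__ == "__main__":
    report = ("Colon biopsy: low-grade dysplasia with chronic inflammation. "
              "Negative for adenocarcinoma.")
    # These weak labels would serve as multi-label targets for training an
    # image classifier on the corresponding whole-slide image.
    print(weak_labels(report))
```

In such a setup the report-derived labels replace manual annotations as training targets, which is what limits human interaction in the pipeline described above.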
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/825292
Start date: 01-01-2019
End date: 30-06-2023
Total budget - Public funding: 4 333 281,00 Euro - 4 333 281,00 Euro
Cordis data
Status: SIGNED
Call topic: ICT-12-2018-2020
Update date: 26-10-2022