SAGE

Summary
Worldwide data volumes are exploding, and islands of storage remote from compute will not scale. We will demonstrate the first instance of intelligent data storage, uniting data processing and storage as two sides of the same rich computational model. This will enable sophisticated, intention-aware data processing to be integrated within a storage system infrastructure, combined with the potential for Exabyte-scale deployment in future generations of extreme-scale HPC systems.

By enabling only the salient data to flow in and out of compute nodes, from a sea of devices spanning next-generation solid state to low-performance disk, the SAGE project demonstrates a vision of a new model of highly efficient and effective HPC and Big Data processing.
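The central idea, shipping computation to where the data lives so that only small results cross the interconnect, can be illustrated with a minimal sketch. The class and method names below are hypothetical and do not reflect the project's actual interfaces; the sketch only contrasts the conventional read-then-compute path with an in-storage (function-shipping) path.

```python
# Hypothetical sketch: contrast moving raw data to the compute node with
# shipping a reduction function to the storage layer, so that only the
# salient result crosses the interconnect. Not the SAGE API.

from typing import Callable, Iterable


class StorageObject:
    """A stored object that can also run a user-supplied function in place."""

    def __init__(self, records: Iterable[float]):
        self._records = list(records)

    def read_all(self) -> list[float]:
        # Conventional path: every record is moved to the compute node.
        return list(self._records)

    def compute_in_place(self, fn: Callable[[Iterable[float]], float]) -> float:
        # "Intelligent storage" path: the function runs next to the data
        # and only the (small) result is returned.
        return fn(self._records)


if __name__ == "__main__":
    obj = StorageObject(range(1_000_000))

    # Traditional: roughly a million values cross the network, then are reduced.
    total_remote = sum(obj.read_all())

    # Function shipping: a single scalar crosses the network.
    total_local = obj.compute_in_place(sum)

    assert total_remote == total_local
```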

Objectives
- Provide a next-generation, multi-tiered, object-based data storage system (hardware and enabling software) that supports future persistent storage media with integral computational capability within the storage hierarchy (see the sketch after this list).
- Significantly improve overall scientific output through advances in systemic data access performance and drastically reduced data movement.
- Provide a roadmap of technologies supporting data access for both Exascale/Exabyte computing and High Performance Data Analytics.
- Provide programming models, access methods and support tools, and validate their usability, including Big Data access and analysis methods.
- Co-design and validate the system on a smaller representative platform with the earth sciences, meteorology, clean energy, and physics communities.
- Project suitability for extreme scaling through simulation based on the evaluation results.
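To make the multi-tiered hierarchy in the first objective concrete, the following is a minimal, hypothetical sketch of an object store spread across tiers with a simple placement policy. The tier names, latencies and capacities are illustrative assumptions, not SAGE components.

```python
# Hypothetical sketch of multi-tier object placement: hotter objects land on
# faster, smaller tiers; colder objects go straight to the slowest tier.
# Tier names and numbers are illustrative only.

from dataclasses import dataclass, field


@dataclass
class Tier:
    name: str
    latency_us: float          # indicative access latency
    capacity_objects: int      # simplistic capacity limit
    objects: dict = field(default_factory=dict)


class TieredObjectStore:
    def __init__(self, tiers: list[Tier]):
        # Tiers are ordered fastest (smallest) to slowest (largest).
        self.tiers = tiers

    def put(self, key: str, value: bytes, hot: bool = False) -> None:
        # Hot data starts on the fastest tier with free capacity;
        # cold data is placed on the slowest tier first.
        candidates = self.tiers if hot else list(reversed(self.tiers))
        for tier in candidates:
            if len(tier.objects) < tier.capacity_objects:
                tier.objects[key] = value
                return
        raise RuntimeError("all tiers are full")

    def get(self, key: str) -> bytes:
        for tier in self.tiers:
            if key in tier.objects:
                return tier.objects[key]
        raise KeyError(key)


store = TieredObjectStore([
    Tier("nvram", latency_us=1.0, capacity_objects=2),
    Tier("ssd", latency_us=100.0, capacity_objects=8),
    Tier("disk", latency_us=10_000.0, capacity_objects=1_000),
])
store.put("checkpoint-0001", b"...", hot=True)
store.put("archive-2015", b"...", hot=False)
```

A real system would also migrate objects between tiers as access patterns change and expose this through the programming models mentioned above; the sketch shows initial placement only.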

Call Alignment: We address storage data access with optimised systems for converged Big Data and HPC use, through a co-design process with scientific partners and applications from many domains. System effectiveness and power efficiency are dramatically improved through minimised data transfer, together with extreme scaling and resilience.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/671500
Start date: 01-09-2015
End date: 31-08-2018
Total budget: EUR 7 882 531.25
Public funding: EUR 7 882 531.00
Cordis data


Status: CLOSED
Call topic: FETHPC-1-2014
Update date: 27-04-2024
Structured mapping
Horizon 2020
H2020-EU.1. EXCELLENT SCIENCE
H2020-EU.1.2. EXCELLENT SCIENCE - Future and Emerging Technologies (FET)
H2020-EU.1.2.2. FET Proactive
H2020-FETHPC-2014
FETHPC-1-2014 HPC Core Technologies, Programming Environments and Algorithms for Extreme Parallelism and Extreme Data Applications