HomE | Enabling Homomorphic Encryption of Deep Neural Network Models and Datasets in Production Environments

Summary
Deep learning (DL) is widely used to solve classification problems that were previously out of reach, such as face recognition, and many of its use cases carry clear privacy requirements. Homomorphic encryption (HE) enables operations on encrypted data, at the expense of a vast increase in data size. Current RAM sizes restrict the use of HE for DL to a severely reduced set of use cases. Recently emerged persistent memory (PMEM) technology offers larger-than-ever RAM capacities, but its performance is far from that of customary DRAM technologies. This project aims to spark a new class of system architectures for encrypted DL workloads by eliminating or dramatically reducing data movement across the memory/storage hierarchy and the network, supported by PMEM technology while overcoming its current severe performance limitations. HomE intends to be a first-time enabler of encrypted execution for large models that exceed DRAM footprints yet run local to accelerators, for hundreds of DL models running simultaneously, and for large datasets processed at high resolution and accuracy. In targeting these ground-breaking goals, HomE enters an unexplored field arising from the innovative convergence of several disciplines, where wide-ranging research is required to assess current and future feasibility. Its main challenge is to develop a methodology capable of breaking through existing software and hardware limitations. HomE proposes a holistic approach yielding highly impactful outcomes, including a novel, comprehensive performance characterisation, innovative optimisations of current technology, and pioneering hardware proposals. HomE can spawn a paradigm shift that will revolutionise the convergence of the machine learning and cryptography disciplines, filling a knowledge gap and opening new horizons such as DL training under HE, which is currently too demanding even for DRAM. Based on solid evidence, HomE will resolve the open question of whether PMEM is a practical enabler for encrypted DL workloads.
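
To make the HE idea above concrete, the sketch below is a minimal, purely illustrative example, not drawn from the project and not the lattice-based schemes typically used for DL inference: a toy Paillier cryptosystem in Python with deliberately insecure key sizes. It shows that arithmetic can be carried out directly on ciphertexts and that every ciphertext occupies roughly twice the modulus size regardless of how small the plaintext is, which is the data-size blow-up the summary refers to. The scheme choice, key sizes, and helper functions are assumptions made for illustration only.

# Toy, INSECURE Paillier example (illustration only, not the project's method).
# Paillier is additively homomorphic: multiplying ciphertexts modulo n^2
# adds the underlying plaintexts. Requires Python 3.8+ for pow(x, -1, m).
from math import gcd
import secrets

# Deliberately tiny primes for readability; real deployments use keys of
# thousands of bits, which is exactly where the data-size blow-up comes from.
p, q = 1_000_003, 1_000_033
n = p * q
n_sq = n * n
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lambda = lcm(p-1, q-1)
g = n + 1                                      # standard choice of generator
mu = pow(lam, -1, n)                           # precomputed decryption factor

def encrypt(m: int) -> int:
    """Encrypt m < n as c = g^m * r^n mod n^2 (the ciphertext lives mod n^2)."""
    r = secrets.randbelow(n - 1) + 1
    while gcd(r, n) != 1:                      # r must be coprime with n
        r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Decrypt via m = L(c^lambda mod n^2) * mu mod n, with L(x) = (x-1)//n."""
    return ((pow(c, lam, n_sq) - 1) // n) * mu % n

# Homomorphic property: the product of two ciphertexts decrypts to the sum
# of the two plaintexts, i.e. a server can add values it cannot read.
c1, c2 = encrypt(42), encrypt(58)
c_sum = (c1 * c2) % n_sq
assert decrypt(c_sum) == 100

# Ciphertext expansion: ~2x the modulus size, however small the plaintext.
print("plaintext bits: ", (100).bit_length())   # 7 bits
print("ciphertext bits:", c_sum.bit_length())   # ~80 bits even with this toy key

With production-grade parameters the modulus runs to thousands of bits, and the levelled schemes commonly used for encrypted DL inference expand data further still; this is the kind of growth that pushes encrypted DL workloads beyond DRAM capacities and motivates the PMEM-based architectures HomE investigates.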
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101043467
Start date: 01-09-2022
End date: 31-08-2027
Total budget: 2 680 195,00 Euro
Public funding: 2 680 195,00 Euro
Cordis data

Status: SIGNED
Call topic: ERC-2021-COG
Update Date: 09-02-2023
Structured mapping
Horizon Europe
  HORIZON.1 Excellent Science
    HORIZON.1.1 European Research Council (ERC)
      HORIZON.1.1.0 Cross-cutting call topics
        ERC-2021-COG ERC CONSOLIDATOR GRANTS
      HORIZON.1.1.1 Frontier science
        ERC-2021-COG ERC CONSOLIDATOR GRANTS