Summary
Advances in wearable displays and networked devices lead to the exciting possibility that humans can transcend the senses they were born with and learn to ‘see’ the world in radically new ways. Genuinely incorporating new signals into our sensory repertoire would transform our everyday experience, from social encounters to surgery, and advance us towards a technologically-enhanced ‘transhuman’ state. In contrast, current additions to sensory streams such as navigating with GPS are far from being incorporated into our natural perception: we interpret them effortfully, like words from a foreign menu, rather than feeling them directly. In this project, we use a ground-breaking new approach to test how new sensory signals can be incorporated into fundamental human experience. We train participants using new immersive virtual-reality paradigms developed in our lab, which give us unprecedented speed, control and flexibility. We test what is learned by comparing the predictions of different mathematical models with perceptual performance. This model-based approach uniquely shows when new signals are integrated into standard sensory processing. We compare neuroimaging data with model predictions to detect integration of newly-learned signals within brain circuits processing familiar signals. We test predictions that short-term changes to normal visual input can improve adult plasticity, and measure age-related changes in plasticity by testing 8- to 12-year-old children. In a wide-ranging design allowing for domain-general conclusions, we work across modalities (visual, auditory, tactile) and across two fundamental perceptual problems: judging spatial layout (‘where’ objects are) and material properties (‘what’ they are made of). The work will provide fundamental insights into computational and brain mechanisms underlying sensory learning, and a platform for transcending the limits of human perception.
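The model-based test described above typically rests on the standard maximum-likelihood cue-combination benchmark from the cue-integration literature: if a newly learned signal is genuinely fused with a familiar one, two-cue discrimination thresholds should fall below the best single-cue threshold, matching the reliability-weighted prediction. The sketch below illustrates that benchmark; it is a minimal illustration assuming independent Gaussian noise, and the function name and example threshold values are ours, not taken from the project.

```python
import numpy as np

def ml_integration_prediction(sigma_old: float, sigma_new: float) -> float:
    """Predicted discrimination threshold (noise SD) if the new signal is
    fused with the familiar one by reliability-weighted averaging.
    Under independent Gaussian noise the combined variance is
    sigma_c^2 = (sigma_old^2 * sigma_new^2) / (sigma_old^2 + sigma_new^2),
    which is always at or below the better single-cue variance."""
    var_old, var_new = sigma_old**2, sigma_new**2
    return np.sqrt(var_old * var_new / (var_old + var_new))

# Illustrative thresholds (arbitrary units) for a familiar visual cue and a
# newly trained augmented cue, each measured in a single-cue condition.
sigma_visual, sigma_augmented = 2.0, 3.0

predicted = ml_integration_prediction(sigma_visual, sigma_augmented)
print(f"Best single cue:        {min(sigma_visual, sigma_augmented):.2f}")
print(f"Integration prediction: {predicted:.2f}")
# If measured two-cue thresholds match the integration prediction rather
# than the best single cue, the new signal is being combined within normal
# sensory processing instead of being read out as a separate channel.
```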
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/820185
Start date: 01-05-2019
End date: 31-10-2025
Total budget: 1 955 952,50 EUR
Public funding: 1 955 952,00 EUR
Cordis data
Status: SIGNED
Call topic: ERC-2018-COG
Update Date: 27-04-2024
Geographical location(s)