Summary
Different materials, such as silk, soil, steel and soap, exhibit an astonishing variety of physical properties, appearances and behaviours. The material properties of objects and substances are central to practically every task we perform, from selecting and preparing food, to detecting slippery ground, to using tools effectively. Without touching a surface, we usually enjoy a vivid impression, through the sense of sight, of what it would feel like. Yet how we do so remains mysterious. Decades of research have focused on the visual recognition of objects, faces and scenes. By comparison, how we see, think about and interact with ‘stuff’ has been relatively neglected. STUFF addresses this major gap in our understanding.
Material perception poses unique and fascinating challenges. The image of a surface is a complex and ambiguous combination of lighting, shape, and material properties. How does the visual system disentangle these intermingled physical factors? Deformable materials like liquids and textiles move and change shape in complex yet lawful ways. How do we infer intrinsic properties like viscosity, compliance and elasticity from such ever-changing stimuli? And how do we reason about and predict their future behaviours as they interact with their surroundings? How do we adapt our own interactions with objects to take into consideration their hardness, density, friction and other physical characteristics, allowing us to pluck a raspberry without crushing it, or pick up wet soap without it slipping through our fingers?
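To make the ambiguity concrete, consider a minimal sketch using the standard Lambertian reflectance model (this illustration is ours, not a method from the project; the function name and parameter values are hypothetical). A single pixel's intensity is the product of the surface albedo, the light strength, and the cosine of the angle between the surface normal and the light direction, so very different material/lighting combinations can yield identical images:

```python
import numpy as np

def lambertian_intensity(albedo, light_strength, normal, light_dir):
    """Pixel intensity under the Lambertian model:
    I = albedo * light_strength * max(0, n . l)."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    return albedo * light_strength * max(0.0, float(n @ l))

# A dark surface under a strong light...
i1 = lambertian_intensity(albedo=0.3, light_strength=2.0,
                          normal=np.array([0.0, 0.0, 1.0]),
                          light_dir=np.array([0.0, 0.0, 1.0]))

# ...and a brighter surface under a weaker light produce the
# same pixel value: the image alone cannot tell them apart.
i2 = lambertian_intensity(albedo=0.6, light_strength=1.0,
                          normal=np.array([0.0, 0.0, 1.0]),
                          light_dir=np.array([0.0, 0.0, 1.0]))

print(i1, i2)  # 0.6 0.6 -- identical intensities, different causes
```

This one-pixel toy case already shows why the visual system must bring additional constraints to bear; real scenes, with shading gradients, highlights, texture and motion, both complicate the problem and supply cues that help resolve it.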
STUFF takes a radically interdisciplinary approach to these questions in five tightly interconnected work packages, bringing together state-of-the-art methods from experimental psychology and behavioural neuroscience, computer graphics and computational image analysis, machine learning and even art. We draw on real and simulated materials to uncover how we perceive, reason about, predict and interact with materials and their properties.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101098225
Start date: 01-10-2023
End date: 30-09-2028
Total budget: 2 499 711,00 Euro (public funding: 2 499 711,00 Euro)
Cordis data
Status: SIGNED
Call topic: ERC-2022-ADG
Update Date: 31-07-2023