wHiSPER | investigating Human Shared PErception with Robots

Summary
Perception is a complex process in which prior knowledge is incorporated into the current percept to help the brain cope with sensory uncertainty. A crucial question is how this mechanism changes during interaction, when the brain faces two conflicting goals: optimizing individual perception by relying on internal priors, or maximizing perceptual alignment with the partner by limiting that reliance. wHiSPER proposes to study for the first time how visual perception of space and time is modified during interaction, by moving the investigation to an interactive shared context in which two agents dynamically influence each other. To allow scrupulous and systematic control during interaction, wHiSPER will use a humanoid robot as a controllable interactive agent. The research is articulated along five main objectives: i) determine how being involved in an interactive context influences perceptual inference; ii) assess how perceptual priors generalize to the observation of others' actions; iii) understand whether and how individual perception aligns with others' priors; iv) assess whether and how shared perception with a robot can be enabled; and v) determine whether perceptual inference during interaction changes with aging, when lowered sensory acuity could increase the relevance of priors. To these aims, wHiSPER will exploit rigorous psychophysical methods, Bayesian modeling, and human-robot interaction, adapting well-established paradigms from the study of visual perception to a novel interactive context. In several experiments, the humanoid robot and the participants will be presented with simple temporal or spatial stimuli that they will have to perceive and then either reproduce or use to perform a coordinated joint action (such as passing an object). Measures of the reproduced intervals and of the action kinematics will allow us to quantify, through Bayesian modeling, how social interaction influences visual perception.
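
To make the Bayesian framework mentioned above concrete, the following Python sketch shows the standard prior-likelihood integration commonly used to model interval reproduction: the estimate is a precision-weighted average of a noisy sensory measurement and a Gaussian prior, which produces the well-known bias toward the prior mean. The Gaussian assumptions, function names, and numeric values are illustrative only and are not taken from the project.

# Minimal sketch of a Bayesian observer for interval reproduction.
# Illustrative only: the Gaussian assumptions, parameter names, and
# values below are assumptions, not the project's actual model.

def bayesian_estimate(measurement, sigma_sensory, prior_mean, sigma_prior):
    """Combine a noisy sensory measurement with a Gaussian prior.

    With a Gaussian likelihood and a Gaussian prior, the posterior mean
    is a precision-weighted average of the measurement and the prior
    mean, so estimates are pulled toward the prior (central-tendency bias).
    """
    w_measurement = sigma_prior ** 2 / (sigma_prior ** 2 + sigma_sensory ** 2)
    return w_measurement * measurement + (1 - w_measurement) * prior_mean

# Example: a 500 ms interval measured with 100 ms sensory noise, under a
# prior centred on 700 ms (e.g. the mean of previously seen intervals).
estimate = bayesian_estimate(measurement=500.0, sigma_sensory=100.0,
                             prior_mean=700.0, sigma_prior=150.0)
print(f"Reproduced interval biased toward the prior: {estimate:.0f} ms")

Fitting the prior width to reproduction data is one standard way such models quantify how much weight observers give to expectations versus current sensory evidence.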
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/804388
Start date: 01-03-2019
End date: 31-12-2024
Total budget: 1 749 375,00 Euro - Public funding: 1 749 375,00 Euro
Cordis data

Status: SIGNED
Call topic: ERC-2018-STG
Update date: 27-04-2024
Structured mapping
Horizon 2020
  H2020-EU.1. EXCELLENT SCIENCE
    H2020-EU.1.1. EXCELLENT SCIENCE - European Research Council (ERC)
      ERC-2018
        ERC-2018-STG