SELFCEPTION | Robotic self/other distinction for interaction under uncertainty

Summary
There will be around 35 million private or non-industrial robots in the world by 2018, a market worth 19 billion euros. However, autonomous robot technology in Europe is not yet ready to meet this high demand, owing to a lack of robust functionality in uncertain environments. In particular, safe interaction is an essential requirement. A basic skill, still unachieved, is for the robot to be aware of its own body and to perceive other agents. Recent evidence suggests that self/other distinction will be a major breakthrough for improving interaction and might be the link between low-level sensorimotor abilities and conceptual interpretation.

Advanced sensorimotor learning, combined with new multimodal sensing devices such as artificial skin, now makes it possible for a robot to acquire its own perceptual representation, and I hypothesize that learning the multisensory-motor spatiotemporal contingencies permits self/other distinction. Hence, the aim of the project is to provide a hierarchical probabilistic model for self/other distinction in robots, learning the sensorimotor contingencies during interaction. This model not only provides a holistic solution for building the perceptual schema and improves interaction under uncertainty, but may also give insights into how humans construct their own perceptual representation and the sense of agency. Finally, the model will be tested on a whole-body-sensing humanoid and validated on a service robot in collaboration with a robotics SME. I will use an interdisciplinary approach that combines probabilistic and information-sciences modelling with cognitive psychology, creating a highly attractive career profile.

SELFCEPTION will accelerate the arrival of the next generation of perceptive robots: multisensory machines able to build their perceptual body schema and to distinguish their own actions from those of other agents. We already have robots that navigate; it is now time to develop robots that interact.
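The core idea of the hypothesis, that sensory effects of one's own actions are predictable from learned sensorimotor contingencies while effects caused by other agents are not, can be illustrated with a toy forward-model sketch. This is my own simplification for illustration (a linear model and a fixed error threshold), not the project's actual hierarchical probabilistic model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear sensorimotor contingency: sensation = W @ action + noise
W_true = rng.normal(size=(3, 2))

# 1) Learn a forward model from self-generated motor babbling.
A = rng.normal(size=(200, 2))                        # motor commands
S = A @ W_true.T + 0.01 * rng.normal(size=(200, 3))  # sensed outcomes
W_hat, *_ = np.linalg.lstsq(A, S, rcond=None)        # least-squares fit

def is_self(action, sensation, threshold=0.1):
    """Attribute a sensation to the robot itself if the learned
    forward model predicts it well (low prediction error)."""
    error = np.linalg.norm(sensation - action @ W_hat)
    return error < threshold

a = rng.normal(size=2)
print(is_self(a, a @ W_true.T))        # self-generated: prediction matches -> True
print(is_self(a, a @ W_true.T + 1.0))  # externally perturbed by an "other" -> False
```

The design choice mirrors the sense-of-agency literature: self-attribution is not hard-coded but falls out of how well the robot's own predictive model explains incoming sensations.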
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/741941
Start date: 01-05-2017
End date: 30-04-2019
Total budget: EUR 159 460,80
Public funding: EUR 159 460,00
Cordis data


Status: CLOSED
Call topic: MSCA-IF-2016
Update date: 28-04-2024
Structured mapping
Horizon 2020
H2020-EU.1. EXCELLENT SCIENCE
H2020-EU.1.3. EXCELLENT SCIENCE - Marie Skłodowska-Curie Actions (MSCA)
H2020-EU.1.3.2. Nurturing excellence by means of cross-border and cross-sector mobility
H2020-MSCA-IF-2016
MSCA-IF-2016