Summary
Imagine navigating through your kitchen, walking around your table and chairs, reaching the counter, sliding a cup on the countertop, pouring yourself some coffee, and preparing a sandwich. Such an everyday situation requires a sequence of behaviors involving the sophisticated interplay of perception, attention, cognition, planning, and action, all unfolding in an ambiguous and uncertain world. Yet, although past research in the behavioral sciences has revealed many properties of these constituent faculties, our understanding of how humans coordinate these cognitive abilities when carrying out everyday tasks is still rudimentary.
The goal of this project is to carry out behavioral experiments and develop computational models to identify and characterize perception, cognition, and action in the wild. Our hypothesis is that sequential actions involving perceptual uncertainty, action variability, and internal and external costs and benefits can be understood and modeled in the unified framework of optimal sequential decision making under uncertainty. To achieve this goal, we integrate ideas and methods from psychophysics, statistical decision theory, active vision, intuitive physics, optimal control under uncertainty, navigation, and inverse reinforcement learning. The proposal is structured so that successive work packages consider tasks of increasing naturalness, from classical psychophysical production tasks and visuomotor control to navigation and sandwich making. We will develop methods for recovering subjects' perceptual uncertainties, beliefs about tasks, subjective costs and benefits including effort, and action variability. Based on our previous experimental work on visuomotor behavior and theoretical work modeling such behavior on an individual-by-individual and trial-by-trial basis, we seek to elucidate how perception, cognition, and action are inseparably intertwined in extended, sequential behavior.
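To give a flavor of the framework the hypothesis names, the following minimal sketch shows one step of decision making under perceptual uncertainty: an agent holds a belief over a hidden state (say, a cup's true position), updates it from a noisy observation via Bayes' rule, and then selects the action that minimizes expected cost under that belief. All states, noise levels, and the quadratic cost here are illustrative assumptions, not the project's actual model.

```python
import numpy as np

# Illustrative sketch (assumed quantities, not the project's model):
# a discrete belief over a hidden state is updated from one noisy
# observation, then an action is chosen to minimize expected cost.

states = np.array([0.0, 1.0, 2.0])   # hypothetical hidden positions
prior = np.array([1/3, 1/3, 1/3])    # uniform prior belief

def likelihood(obs, states, sigma=0.5):
    """Gaussian observation noise around each candidate state."""
    return np.exp(-0.5 * ((obs - states) / sigma) ** 2)

def update_belief(prior, obs):
    """Bayes' rule: posterior proportional to prior times likelihood."""
    post = prior * likelihood(obs, states)
    return post / post.sum()

def choose_action(belief, actions, cost):
    """Pick the action with the lowest belief-weighted expected cost."""
    expected = [np.dot(belief, cost(a, states)) for a in actions]
    return actions[int(np.argmin(expected))]

# Quadratic cost: squared distance between aimed-at and true position.
cost = lambda a, s: (a - s) ** 2

belief = update_belief(prior, obs=1.8)   # noisy reading near state 2.0
action = choose_action(belief, actions=states, cost=cost)
print(action)  # belief concentrates near 2.0, so the agent aims there
```

Extended behavior of the kind studied in the project chains many such perceive–update–act steps, with costs that also include effort and with beliefs carried forward over time.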
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101045783
Start date: 01-10-2022
End date: 30-09-2027
Total budget - Public funding: 1 964 000,00 Euro - 1 964 000,00 Euro
Cordis data
Status: SIGNED
Call topic: ERC-2021-COG
Update Date: 09-02-2023