Summary
Declarative memory is fundamentally relational. We not only remember things or events, but also whether and how they are related to each other. Cracking the mental code that supports relational memory is one of the greatest challenges of cognitive neuroscience. Here I suggest that memories are relationally encoded through an attentional repertoire, such that different types of conceptual relationships (before/after, bigger than, part of, caused by, etc.) correspond to different dynamic attentional patterns. Such an attentional code is largely decodable from eye movements and embodies the structure of our relational knowledge. I provide preliminary data showing that spontaneous eye movements encode conceptual similarity, as well as conceptual relationships, in abstract semantic spaces and cognitive maps; and that the eye movements of different people synchronize while listening to the same narrative even when their eyes are closed. On this basis I hypothesize that: (i) structural knowledge in the form of low-dimensional schemas is encoded in attentional templates that support relational thinking and cross-domain generalization; (ii) in principle, all conceptual relationships encoded in memory can be represented through an attentional code that can be decoded from eye movements; (iii) the construction of this code through sequential movements of attention is supported by hippocampal/parietal mechanisms for sequence generation and memory consolidation; (iv) this attentional code allows us to understand each other via inter-subject synchronization of attentional schemas. I will test these hypotheses using cutting-edge eye-tracking technology, multimodal neuroimaging (fMRI, MEG, iEEG) and deep neural networks. ATCOM is a high-risk/high-gain investigation of a novel, groundbreaking hypothesis that, if correct, will have a long-lasting impact on cognitive neuroscience, artificial intelligence, and related fields, reshaping the way we think about memory.
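To make the core empirical claim concrete: one standard way to test whether spontaneous eye movements encode conceptual similarity is representational similarity analysis (RSA), which rank-correlates pairwise dissimilarities of gaze patterns with pairwise dissimilarities of semantic representations across concepts. The sketch below is purely illustrative and is not the project's actual pipeline; the data shapes, feature choices, and variable names are hypothetical placeholders.

    # Illustrative RSA sketch (hypothetical data, not ATCOM's pipeline):
    # does gaze-pattern similarity mirror conceptual similarity?
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)

    # Hypothetical inputs: one flattened fixation-density map per concept,
    # and one semantic embedding per concept (e.g., from a word-vector model).
    n_concepts = 40
    gaze_maps = rng.random((n_concepts, 64 * 64))   # placeholder gaze features
    embeddings = rng.random((n_concepts, 300))      # placeholder semantic vectors

    # Pairwise dissimilarity matrices (condensed upper triangles).
    gaze_rdm = pdist(gaze_maps, metric="correlation")
    semantic_rdm = pdist(embeddings, metric="cosine")

    # If eye movements encode conceptual similarity, the two representational
    # dissimilarity matrices should be positively rank-correlated.
    rho, p = spearmanr(gaze_rdm, semantic_rdm)
    print(f"RSA: Spearman rho = {rho:.3f}, p = {p:.3g}")

With real data, gaze features could be fixation-density maps or scanpath descriptors, and the semantic dissimilarities could come from behavioral ratings or a language model; the logic of the comparison stays the same.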
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101125658
Start date: 01-01-2025
End date: 31-12-2029
Total budget - Public funding: 1 999 358,00 Euro - 1 999 358,00 Euro
Cordis data
Status: SIGNED
Call topic: ERC-2023-COG
Update Date: 12-03-2024