ATCOM | An Attentional Code for Memory

Summary
Declarative memory is fundamentally relational. We not only remember things or events, but also whether and how they are related to each other. Cracking the mental code that supports relational memory is one of the greatest challenges of cognitive neuroscience. Here I suggest that memories are relationally encoded through an attentional repertoire, such that different types of conceptual relationships (before/after, bigger than, part of, caused by, etc.) correspond to different dynamic attentional patterns. Such an attentional code is largely decodable from eye movements and embodies the structure of our relational knowledge. I provide preliminary data showing that spontaneous eye movements encode conceptual similarity, as well as conceptual relationships, in abstract semantic spaces and cognitive maps; and that the eye movements of different people synchronize while listening to the same narrative, even when their eyes are closed. On this basis I hypothesize that: (i) structural knowledge in the form of low-dimensional schemas is encoded in attentional templates that support relational thinking and cross-domain generalization; (ii) in principle, all conceptual relationships encoded in memory can be represented through an attentional code that can be decoded from eye movements; (iii) the construction of this code through sequential movements of attention is supported by hippocampal/parietal mechanisms for sequence generation and memory consolidation; (iv) this attentional code allows us to understand each other via inter-subject synchronization of attentional schemas. I will test these hypotheses using cutting-edge eye-tracking technology, multimodal neuroimaging (fMRI, MEG, iEEG), and deep neural networks. ATCOM is a high-risk/high-gain investigation of a novel, groundbreaking hypothesis that, if correct, will have a long-lasting impact on cognitive neuroscience, artificial intelligence, and related fields, reshaping the way we think about memory.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101125658
Start date: 01-01-2025
End date: 31-12-2029
Total budget: EUR 1 999 358,00 (public funding: EUR 1 999 358,00)
Cordis data

Status

SIGNED

Call topic

ERC-2023-COG

Update Date

12-03-2024
Structured mapping
Horizon Europe
HORIZON.1 Excellent Science
HORIZON.1.1 European Research Council (ERC)
HORIZON.1.1.0 Cross-cutting call topics
ERC-2023-COG ERC CONSOLIDATOR GRANTS
HORIZON.1.1.1 Frontier science
ERC-2023-COG ERC CONSOLIDATOR GRANTS