Summary
How do we learn to dance, play an instrument, or a game as complex as chess or Go? How do we make a memory? The common answer to these questions is “through synaptic plasticity”, i.e., by changing the synaptic connectivity of neural circuits so that representative brain activity can be reliably triggered. Such connectivity changes are governed by rules, i.e., synaptic mechanisms that monitor the activity of their environment and stereotypically strengthen or weaken synapses accordingly. The shape and mode of operation of these rules are still largely unknown: for the more than one hundred different connection types in cortical circuits, only a handful of rules have been described at all. Similarly, testing observed rules in simulations of cortical function has met with only limited success. Our slow progress is due to the extraordinary difficulty of measuring and observing synapses without interference.
Here, we propose a new approach. By utilizing the growing power of machine learning methods, we can deduce synaptic plasticity rules directly. Newly developed search algorithms and sheer computational power allow us to integrate published data and infer synaptic rules in silico. We aim to (1) develop a new mathematical expression of synaptic plasticity rules, experimentally appropriate and flexible enough to be implemented in a machine learning framework, dubbed SYNAPSEEK. Next, (2) we will apply SYNAPSEEK to deduce the rules for building various neural structures of increasing complexity. Finally, (3) we will incorporate additional constraints into SYNAPSEEK to develop synaptic rules that shape network function as much as its structure. Our work will establish, for the first time, canonical sets of synaptic plasticity rules, based on the circuit structure they must produce and the function they are meant to support. SYNAPSEEK will have immediate and wide-ranging applications, from a basic understanding of cortical development to better protocols for Deep Brain Stimulation.
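To make the general idea concrete, the sketch below illustrates, in miniature, what searching for plasticity rules in silico can look like: a candidate rule is written as a small parameterized local update, a toy rate network "develops" its weights under that rule, and a gradient-free search adjusts the rule's parameters until the developed connectivity matches a target structure. The specific rule form, the feed-forward target matrix, the network size, and the (1+λ) evolutionary search are illustrative assumptions only; this is not the SYNAPSEEK implementation.

```python
# Minimal sketch: search for a parameterized synaptic plasticity rule that
# builds a target connectivity. All choices below are illustrative assumptions.
import numpy as np

N = 10                                      # toy network size (assumption)
target_W = np.triu(np.ones((N, N)), 1)      # assumed target: feed-forward connectivity

def develop(theta, steps=200):
    """Grow a weight matrix from random initial weights under a local rule
    dW = eta * (a*post*pre + b*pre + c*post + d); the rule form is illustrative."""
    a, b, c, d = theta
    rng = np.random.default_rng(0)          # fixed seed: same inputs for every candidate rule
    W = np.clip(rng.normal(0.0, 0.1, (N, N)), 0.0, 1.0)
    np.fill_diagonal(W, 0.0)
    for _ in range(steps):
        x = rng.random(N)                   # presynaptic drive
        r = np.tanh(W @ x)                  # postsynaptic rates
        pre, post = x[None, :], r[:, None]
        W += 0.01 * (a * post * pre + b * pre + c * post + d)
        np.fill_diagonal(W, 0.0)
        W = np.clip(W, 0.0, 1.0)            # keep weights bounded
    return W

def loss(theta):
    """Mismatch between the developed and the target connectivity."""
    return float(np.mean((develop(theta) - target_W) ** 2))

# Simple (1+lambda) evolutionary search over the four rule parameters.
search_rng = np.random.default_rng(1)
best = search_rng.normal(0.0, 1.0, 4)
best_loss = loss(best)
for generation in range(100):
    candidates = best + search_rng.normal(0.0, 0.3, (8, 4))
    losses = np.array([loss(c) for c in candidates])
    if losses.min() < best_loss:
        best, best_loss = candidates[losses.argmin()], float(losses.min())
print("best rule parameters:", best, "final loss:", best_loss)
```

Richer rule parameterizations (for example, weight-dependent or neuromodulatory terms) and function-based objectives would slot into the same loop by swapping out the rule form and the loss.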
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/819603
Start date: 01-06-2019
End date: 31-05-2025
Total budget - Public funding: 1 798 605,00 Euro - 1 798 605,00 Euro
Cordis data
Status: SIGNED
Call topic: ERC-2018-COG
Update Date: 27-04-2024
Geographical location(s)