Summary
Why does the brain outperform AI? Artificial neural networks (ANNs) are at the core of the AI revolution. In recent years, enormous efforts have been made to unravel their mathematical properties, leading to fundamental insights and mathematical guarantees on when and why deep learning works well. ANNs are inspired by biological neural networks (BNNs) but differ in many respects: ANNs represent functions while BNNs represent stochastic processes, and the gradient-based deep learning applied to ANNs is very different from the local updating of BNNs.
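To make the function-versus-stochastic-process distinction concrete, the following minimal sketch (not from the project) evaluates the same two-layer architecture twice: once as a deterministic function, and once with the hidden activations read out through a toy rate-coded Poisson spiking model, so that repeated calls on the same input give different outputs. The spiking model, layer sizes, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# ANN view: the network is a deterministic function x -> f(x).
def ann_forward(x, W1, W2):
    h = np.maximum(0.0, W1 @ x)   # ReLU hidden layer
    return W2 @ h                 # same output on every call

# BNN view: activity is a stochastic process. As a toy stand-in, treat the
# hidden activation as a firing rate (Hz) and emit a Poisson spike train
# over T time bins of width dt; the readout sees only the spikes.
def bnn_forward(x, W1, W2, T=100, dt=1e-3):
    rate = np.maximum(0.0, W1 @ x)                        # nonnegative rates
    spikes = rng.poisson(rate * dt, size=(T, rate.size))  # spike counts per bin
    # Empirical rate estimate; its expectation matches the ANN hidden layer,
    # but each call returns a different random realization.
    return W2 @ (spikes.mean(axis=0) / dt)

x = rng.normal(size=5)
W1 = rng.normal(size=(8, 5))
W2 = rng.normal(size=(3, 8))
print(ann_forward(x, W1, W2))  # deterministic
print(bnn_forward(x, W1, W2))  # varies from call to call
```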
BNNs are superior to ANNs in the sense that the brain learns faster and generalizes better. Despite the urgency of these questions and the rich and interesting mathematical structures that BNNs create, scarcely any theoretical attempts have been made to understand learning in the brain. The stochastic-process structure of BNNs and the need to understand the statistical convergence behavior call for a mathematical statistics approach. This project proposes the development of advanced mathematical tools in nonparametric and high-dimensional statistics to analyze learning in BNNs as a statistical method. The starting point is a novel interpretation of the local updating of BNN parameters as a specific and non-standard, derivative-free optimization method. Whereas derivative-free optimization is generally thought to be slow, our conjecture is that it leads to favorable statistical properties in the setting underlying BNNs.
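As an illustration of what local, derivative-free updating can look like, the sketch below contrasts a gradient step with an SPSA-style zeroth-order step on a linear least-squares toy problem. SPSA is a standard derivative-free scheme chosen here only to show the mechanism of probing the loss locally instead of differentiating it; it is an assumption for illustration, not the project's interpretation of BNN updating.

```python
import numpy as np

rng = np.random.default_rng(1)

def loss(theta, X, y):
    # Squared loss of a linear model; stands in for any network loss.
    return np.mean((X @ theta - y) ** 2)

# Gradient-based step (as in deep learning): requires the derivative.
def gradient_step(theta, X, y, lr=0.1):
    grad = 2 * X.T @ (X @ theta - y) / len(y)
    return theta - lr * grad

# Derivative-free step (SPSA): probe the loss along a random local
# perturbation and move against the estimated slope. No gradients are
# computed; only two loss evaluations per step are needed.
def spsa_step(theta, X, y, lr=0.1, eps=1e-3):
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    g_hat = (loss(theta + eps * delta, X, y)
             - loss(theta - eps * delta, X, y)) / (2 * eps) * delta
    return theta - lr * g_hat

X = rng.normal(size=(50, 3))
theta_true = np.array([1.0, -2.0, 0.5])
y = X @ theta_true + 0.1 * rng.normal(size=50)
t1 = t2 = np.zeros(3)
for _ in range(500):
    t1, t2 = gradient_step(t1, X, y), spsa_step(t2, X, y)
print(loss(t1, X, y), loss(t2, X, y))  # both decrease; SPSA used no gradients
```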
If successful, the research has the potential to open a new research area in mathematical statistics and to provide insights into how the brain learns. It could also lead to recommendations on how to make AI more efficient with less training data and how to train neuromorphic computer chips mimicking BNNs.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/101124751
Start date: 01-05-2024
End date: 30-04-2029
Total budget - Public funding: 2 000 000,00 Euro - 2 000 000,00 Euro
Cordis data
Status: SIGNED
Call topic: ERC-2023-COG
Update Date: 12-03-2024
Geographical location(s)