Summary
One of the most significant recent developments in applied machine learning has been the resurgence of "deep learning", usually in the form of artificial neural networks. The empirical success of deep learning is stunning, and deep-learning-based systems have already led to breakthroughs in computer vision and speech recognition. In contrast, from the theoretical point of view, we by and large do not understand why deep learning is possible at all, since most state-of-the-art theoretical results show that deep learning is computationally hard.
Bridging this gap is a great challenge: it involves proficiency in several theoretical fields (algorithms, complexity, and statistics) and at the same time requires a good understanding of real-world practical problems and the ability to conduct applied research. We believe that a good theory must lead to better practical algorithms. It should also broaden the applicability of learning in general, and of deep learning in particular, to new domains. Such a practically relevant theory may also lead to a fundamental paradigm shift in the way we currently analyze the complexity of algorithms.
Previous work by the PI and his colleagues and students has provided novel ways to analyze the computational complexity of learning algorithms and to understand the tradeoffs between data and computational time. In this proposal, in order to bridge the gap between theory and practice, I suggest a departure from worst-case analyses and the development of a more optimistic, data-dependent theory with "grey" components. Success will lead to a breakthrough in our understanding of learning at large, with significant potential for impact on the field of machine learning and its applications.
More information & hyperlinks
Web resources: https://cordis.europa.eu/project/id/676774
Start date: 01-02-2016
End date: 31-01-2021
Total budget - Public funding: 1 342 500,00 Euro - 1 342 500,00 Euro
Cordis data
Status: CLOSED
Call topic: ERC-StG-2015
Update Date: 27-04-2024
Geographical location(s)