

Dagstuhl Seminar 22332

Differential Equations and Continuous-Time Deep Learning

( Aug 15 – Aug 19, 2022 )

Permalink
Please use the following short url to reference this page: https://www.dagstuhl.de/22332

Summary

Deep models have revolutionised machine learning due to their remarkable ability to construct increasingly refined representations of data layer by layer. Perhaps unsurprisingly, very deep learning architectures have recently been shown to converge to differential equation models, which are ubiquitous in the sciences but have so far been largely overlooked in machine learning. This striking connection opens new avenues for the theory and practice of continuous-time machine learning inspired by the physical sciences. Simultaneously, neural networks have started to emerge as powerful alternatives to cumbersome mechanistic dynamical systems. Finally, deep learning models in conjunction with stochastic gradient optimisation have been used to numerically solve high-dimensional partial differential equations. Thus, we have entered a new era of continuous-time modelling in machine learning.
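
The convergence of very deep architectures to differential equations can be illustrated with the classic observation that a residual block x ← x + h·f(x) is one forward-Euler step of the ODE dx/dt = f(x). The sketch below is a minimal illustration of this idea, not material from the seminar: the scalar state, the toy "layer" f(x) = tanh(w·x), and the weight w are illustrative assumptions.

```python
import math

def f(x, w=0.5):
    """Toy network layer acting as the ODE vector field f(x) = tanh(w * x)."""
    return math.tanh(w * x)

def resnet_forward(x0, depth, h):
    """Apply `depth` residual blocks x <- x + h * f(x).

    Each block is one forward-Euler step of dx/dt = f(x), so the whole
    network integrates the ODE from time 0 to time T = depth * h.
    """
    x = x0
    for _ in range(depth):
        x = x + h * f(x)
    return x

# As depth grows (and the step size h = T / depth shrinks), the network's
# output approaches the exact ODE solution x(T) -- the "infinite-depth"
# continuous-time limit.
T = 1.0
shallow = resnet_forward(1.0, depth=4, h=T / 4)
deep = resnet_forward(1.0, depth=4096, h=T / 4096)
```

Here `deep` is a much closer approximation to the continuous-time solution at T = 1 than `shallow`, which is the sense in which increasing depth recovers a differential equation model.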

This change in perspective is rapidly gaining interest across domains and provides an excellent and timely opportunity to bring together experts in dynamical systems, computational science, machine learning, and the relevant scientific domains to lay solid foundations for these efforts. At the same time, because the scientific communities, events, and outlets involved are largely disjoint, it is essential to organize an interdisciplinary event and establish new communication channels to ensure that the relevant knowledge is shared.

Over the course of this Dagstuhl Seminar, we want to establish strong contacts, communication, and collaboration among the different research communities, and to exchange each community's best practices, known pitfalls, and tricks of the trade. We will try to identify the most important open questions and the most promising avenues forward to foster interdisciplinary research. To this end, the seminar will feature not only individual contributed talks, but also general discussions and "collaboration bazaars", in which participants can pitch ideas for break-out project sessions to each other. In the break-out sessions, participants may discuss open problems, joint research obstacles, or community-building work.

Copyright David Duvenaud, Markus Heinonen, Michael Tiemann, and Max Welling


Participants
  • Hananeh Aliee (Helmholtz Zentrum München, DE)
  • Jesse Bettencourt (University of Toronto, CA) [dblp]
  • Olivier Bournez (Ecole Polytechnique - Palaiseau, FR) [dblp]
  • Joachim M. Buhmann (ETH Zürich, CH) [dblp]
  • Johanne Cohen (University Paris-Saclay - Orsay, FR)
  • Biswadip Dey (Siemens - Princeton, US)
  • Remco Duits (TU Eindhoven, NL) [dblp]
  • David Duvenaud (University of Toronto, CA) [dblp]
  • Maurizio Filippone (EURECOM - Biot, FR) [dblp]
  • James Foster (University of Oxford, GB)
  • Colby Fronk (University of California - Santa Barbara, US)
  • Jan Hasenauer (Universität Bonn, DE) [dblp]
  • Markus Heinonen (Aalto University, FI) [dblp]
  • Patrick Kidger (Google X - Bay Area, US)
  • Diederik P. Kingma (Google - Mountain View, US) [dblp]
  • Linda Petzold (University of California - Santa Barbara, US) [dblp]
  • Jack Richter-Powell (University of Toronto, CA)
  • Lars Ruthotto (Emory University - Atlanta, US) [dblp]
  • Jacob Seidman (University of Pennsylvania - Philadelphia, US)
  • Arno Solin (Aalto University, FI) [dblp]
  • Sho Sonoda (RIKEN - Tokyo, JP) [dblp]
  • Nils Thuerey (TU München, DE) [dblp]
  • Michael Tiemann (Robert Bosch GmbH - Renningen, DE)
  • Filip Tronarp (Universität Tübingen, DE) [dblp]
  • Yves van Gennip (TU Delft, NL) [dblp]
  • Max Welling (University of Amsterdam, NL) [dblp]
  • Verena Wolf (Universität des Saarlandes - Saarbrücken, DE) [dblp]
  • Daniel Worrall (DeepMind - London, GB)

Classification
  • Machine Learning
  • Numerical Analysis
  • Systems and Control

Keywords
  • differential equations
  • deep learning