Dagstuhl Seminar 21061

Differential Equations and Continuous-Time Deep Learning (Cancelled)

(Feb 07 – Feb 12, 2021)

Permalink
Please use the following short URL to reference this page: https://www.dagstuhl.de/21061

Replacement
Dagstuhl Seminar 22332: Differential Equations and Continuous-Time Deep Learning (2022-08-15 – 2022-08-19)

Organizers
  • David Duvenaud (University of Toronto, CA)
  • Markus Heinonen (Aalto University, FI)
  • Michael Schober (Bosch Center for AI - Renningen, DE)
  • Max Welling (University of Amsterdam, NL)

Motivation

Deep models have revolutionised machine learning due to their remarkable ability to iteratively construct increasingly refined representations of data across their layers. Perhaps unsurprisingly, very deep learning architectures have recently been shown to converge to differential equation models, which are ubiquitous in the sciences but have so far been overlooked in machine learning. This striking connection opens new avenues for the theory and practice of continuous-time machine learning inspired by the physical sciences. Simultaneously, neural networks have started to emerge as powerful alternatives to cumbersome mechanistic dynamical systems. Finally, deep learning models in conjunction with stochastic gradient optimization have been used to numerically solve high-dimensional partial differential equations. Thus, we have entered a new era of continuous-time modelling in machine learning.
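
As an illustrative sketch of this connection (under the standard residual-network reading, not taken from the seminar text itself): a residual block updates its hidden state as

\[ x_{k+1} = x_k + h\, f(x_k, \theta_k), \]

which is exactly one forward-Euler step of size \(h\) for the ordinary differential equation

\[ \frac{dx}{dt} = f\big(x(t), \theta(t)\big). \]

As the number of layers grows and the step size \(h\) shrinks, the depth of the network plays the role of a continuous time variable, and a forward pass amounts to numerically solving an initial value problem.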

This change in perspective is rapidly gaining interest across domains and provides an excellent and topical opportunity to bring together experts in dynamical systems, computational science, machine learning and the relevant scientific domains to lay solid foundations for these efforts. At the same time, because the scientific communities, events and outlets involved are largely disjoint, it is essential to organize an interdisciplinary event and establish new communication channels to ensure that relevant knowledge is shared.

Over the course of this Dagstuhl Seminar, we want to establish strong contacts, communication and collaboration among the different research communities, and to exchange each community’s best practices, known pitfalls and tricks of the trade. We will try to identify the most important open questions and avenues forward to foster interdisciplinary research. To this end, the seminar will feature not only individual contributed talks, but also general discussions and “collaboration bazaars”, in which participants can pitch ideas for break-out project sessions to each other. In the break-out sessions, participants may discuss open problems, joint research obstacles, or community-building work.

Copyright David Duvenaud, Markus O. Heinonen, Michael Schober, and Max Welling

Participants
  • David Duvenaud (University of Toronto, CA) [dblp]
  • Markus Heinonen (Aalto University, FI) [dblp]
  • Michael Schober (Bosch Center for AI - Renningen, DE) [dblp]
  • Max Welling (University of Amsterdam, NL) [dblp]

Classification
  • Machine Learning
  • Numerical Analysis

Keywords
  • Deep learning
  • differential equations
  • numerics
  • statistics
  • dynamical systems