Dagstuhl Seminar 21061

Differential Equations and Continuous-Time Deep Learning (Cancelled)

(Feb 07 – Feb 12, 2021)

Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/21061

Replaced by
Dagstuhl Seminar 22332: Differential Equations and Continuous-Time Deep Learning (2022-08-15 – 2022-08-19) (Details)

Organizers

Contact

Motivation

Deep models have revolutionised machine learning due to their remarkable ability to iteratively construct more and more refined representations of data over the layers. Perhaps unsurprisingly, very deep learning architectures have recently been shown to converge to differential equation models, which are ubiquitous in the sciences but have so far been overlooked in machine learning. This striking connection opens new avenues for the theory and practice of continuous-time machine learning inspired by the physical sciences. Simultaneously, neural networks have started to emerge as powerful alternatives to cumbersome mechanistic dynamical systems. Finally, deep learning models in conjunction with stochastic gradient optimization have been used to numerically solve high-dimensional partial differential equations. Thus, we have entered a new era of continuous-time modelling in machine learning.
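
To make the connection above concrete: a residual layer update x_{k+1} = x_k + h*f(x_k) is exactly one explicit Euler step of the ordinary differential equation dx/dt = f(x), so a very deep residual network can be read as a discretised ODE solve. The following minimal Python/NumPy sketch illustrates this correspondence; the vector field f, its weights, and the step size h are illustrative assumptions and are not part of the seminar material.

import numpy as np

# Illustrative setup (hypothetical weights): f(x) = tanh(W x + b)
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(4, 4))
b = np.zeros(4)

def f(x):
    """Shared residual block, viewed as the ODE vector field."""
    return np.tanh(W @ x + b)

def resnet_forward(x, depth, h=0.1):
    """Residual network: x_{k+1} = x_k + h * f(x_k), repeated over layers."""
    for _ in range(depth):
        x = x + h * f(x)
    return x

def euler_ode_forward(x, t_end, h=0.1):
    """Explicit Euler integration of dx/dt = f(x) from t = 0 to t_end."""
    for _ in range(int(round(t_end / h))):
        x = x + h * f(x)
    return x

x0 = rng.normal(size=4)
# A depth-20 ResNet with step h = 0.1 coincides with an Euler solve to time 2.0.
print(np.allclose(resnet_forward(x0, depth=20), euler_ode_forward(x0, t_end=2.0)))  # True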

This change in perspective is rapidly gaining interest across domains and provides an excellent and timely opportunity to bring together experts in dynamical systems, computational science, machine learning, and the relevant scientific domains to lay solid foundations for these efforts. At the same time, because the scientific communities, events, and outlets are largely disjoint, it is essential to organize an interdisciplinary event and to establish new communication channels that ensure the distribution of relevant knowledge.

Over the course of this Dagstuhl Seminar, we want to establish strong contacts, communication, and collaboration among the different research communities, and to exchange each community’s best practices, known pitfalls, and tricks of the trade. We will try to identify the most important open questions and the most promising avenues forward to foster interdisciplinary research. To this end, this seminar will feature not only individual contributed talks but also general discussions and “collaboration bazaars”, in which participants can pitch ideas for break-out project sessions to each other. In the break-out sessions, participants may discuss open problems, joint research obstacles, or community-building work.

Copyright David Duvenaud, Markus O. Heinonen, Michael Schober, and Max Welling

Participants
  • David Duvenaud (University of Toronto, CA) [dblp]
  • Markus Heinonen (Aalto University, FI) [dblp]
  • Michael Schober (Bosch Center for AI - Renningen, DE) [dblp]
  • Max Welling (University of Amsterdam, NL) [dblp]

Classification
  • Machine Learning
  • Numerical Analysis

Keywords
  • Deep learning
  • differential equations
  • numerics
  • statistics
  • dynamical systems