

Dagstuhl-Seminar 26312

Explainable AI in Energy and Critical Infrastructure Systems

(July 26 – July 29, 2026)

Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/26312

Organizers
  • André Artelt (Universität Bielefeld, DE)
  • Kyri Baker (University of Colorado - Boulder, US)
  • Barbara Hammer (Universität Bielefeld, DE)
  • Francesco Leofante (Imperial College London, GB)

Motivation

Energy and critical infrastructure systems, such as power grids, water networks, and transportation networks, are the backbones of modern societies. Operation and decision-making in such systems are challenging due to limited and incomplete information, uncertainties, and long time horizons. In this context, Artificial Intelligence (AI) holds significant potential to enhance and support the decision-making and operation of energy and critical infrastructure systems. However, given the high-stakes nature of such systems, legal regulations such as the recent European AI Act require transparency that “allows appropriate traceability and explainability” of AI systems deployed in this context. The field of explainable AI (XAI) aims to achieve transparency by explaining the internal reasoning of AI systems in a human-understandable way. It therefore offers promising, yet largely unexplored, solutions for satisfying those legal requirements and fostering the development of supportive AI that is not only efficient but also transparent.

This Dagstuhl Seminar aims to better understand the potential and technical requirements of XAI in such systems and to identify the research gaps and challenges that currently prevent the use of (X)AI in energy and critical infrastructure systems. In particular, it will focus on major challenges such as incorporating domain knowledge about the underlying system dynamics, dealing with different time horizons and uncertainties, and developing evaluation strategies and benchmarks.

The seminar will combine presentations introducing relevant subtopics, such as application areas and their unique requirements and needs, with an overview talk on XAI to establish a common terminology and a shared understanding of core concepts and existing XAI methods. To foster synergies, a special focus will be placed on discussions in smaller and larger groups, brainstorming on where and how XAI can provide benefits, and identifying limitations and key requirements that the community must address. These discussions are intended to help shape a manifesto paper, planned to be drafted in collaboration with all interested participants, outlining priorities and research questions across key stakeholders.

Copyright André Artelt, Kyri Baker, Barbara Hammer, and Francesco Leofante

LZI Junior Researchers

This seminar qualifies for Dagstuhl's LZI Junior Researchers program. Schloss Dagstuhl wishes to enable the participation of junior scientists whose specialisation fits this Dagstuhl Seminar, even if they are not on the radar of the organizers. Applications by outstanding junior scientists are possible until December 12, 2025.


Classification
  • Artificial Intelligence
  • Machine Learning
  • Systems and Control

Keywords
  • Energy
  • Critical Infrastructure
  • Explainable AI
  • Machine Learning