Dagstuhl Seminar 26312

Explainable AI in Energy and Critical Infrastructure Systems

(Jul 26 – Jul 29, 2026)

Permalink
Please use the following short URL to reference this page: https://www.dagstuhl.de/26312

Organizers
  • André Artelt (Universität Bielefeld, DE)
  • Kyri Baker (University of Colorado - Boulder, US)
  • Barbara Hammer (Universität Bielefeld, DE)
  • Francesco Leofante (Imperial College London, GB)

Motivation

Energy and critical infrastructure systems, such as power grids, water networks, and transportation networks, are the backbone of modern societies. Operation and decision-making in such systems are challenging due to limited and incomplete information, uncertainties, and long time horizons. In this context, Artificial Intelligence (AI) holds significant potential to enhance and support the decision-making and operation of energy and critical infrastructure systems. However, given the high-stakes nature of such systems, legal regulations such as the recent European AI Act require transparency that “allows appropriate traceability and explainability” of AI systems deployed in this context. The field of explainable AI (XAI) aims to achieve transparency by explaining the internal reasoning of AI systems in a human-understandable way. It therefore offers promising, yet largely unexplored, solutions for satisfying these legal requirements and fostering the development of supportive AI that is not only efficient but also transparent.

This Dagstuhl Seminar aims to better understand the potential and technical requirements of XAI in such systems and to identify the research gaps and challenges that currently prevent the use of (X)AI in energy and critical infrastructure systems. In particular, it will focus on major challenges such as incorporating domain knowledge about the underlying system dynamics, dealing with different time horizons and uncertainties, and developing evaluation strategies and benchmarks.

The seminar will consist of a mixture of presentations introducing relevant subtopics, such as application areas and their unique requirements and needs, together with an overview talk on XAI to ensure common terminology and a shared understanding of core concepts and existing XAI methods. To foster synergies, a special focus will be placed on discussions in smaller and larger groups: brainstorming where and how XAI can provide benefits, and identifying limitations and key requirements that the community needs to address. These discussions are intended to help shape a manifesto paper, planned to be drafted in collaboration with all interested participants, that outlines priorities and research questions across key stakeholders.

Copyright André Artelt, Kyri Baker, Barbara Hammer, and Francesco Leofante

LZI Junior Researchers

This seminar qualifies for Dagstuhl's LZI Junior Researchers program. Schloss Dagstuhl wishes to enable the participation of junior scientists whose specialization fits this Dagstuhl Seminar, even if they are not on the organizers' radar. Applications by outstanding junior scientists are possible until December 12, 2025.


Classification
  • Artificial Intelligence
  • Machine Learning
  • Systems and Control

Keywords
  • Energy
  • Critical Infrastructure
  • Explainable AI
  • Machine Learning