January 23 – 28, 2022, Dagstuhl Seminar 22042

Privacy Protection of Automated and Self-Driving Vehicles


Frank Kargl (Universität Ulm, DE)
Ioannis Krontiris (Huawei Technologies – München, DE)
André Weimerskirch (Lear Corporation – Ann Arbor, US)
Ian Williams (University of Michigan – Ann Arbor, US)

For information about this Dagstuhl Seminar, please contact the

Dagstuhl Service Team


Shared Documents
Dagstuhl Seminar Schedule [pdf]


Automated and autonomous vehicles (AVs) may be the greatest disruptive innovation in travel that we have experienced in a century. Their development coincides with the rise of connected vehicles. To achieve their goals, connected and automated vehicles require extensive data and machine-learning algorithms that process data from local sensors, as well as data received from other cars and road-side infrastructure, to support their decision-making. Specifically, we are seeing the emergence of vehicles that feature an impressive array of sensors and on-board decision-making units capable of coping with an unprecedented amount of data.

While privacy for connected vehicles has been considered for many years, AV technology is still in its infancy, and the privacy and data-protection aspects of AVs are not well addressed. The capabilities of AVs pose new challenges to privacy protection, given their large sensor arrays that collect data in public spaces. The massive introduction of sensors and AI technology into automated and autonomous vehicles opens up substantial new privacy and data-protection problems, from both the technology-research perspective and the legal and policy perspective, which still need to be stated clearly, elaborated on, and resolved.

The goal of this Dagstuhl Seminar is twofold:

First, to bring legal and technology experts (for both privacy and automated driving) together to allow an informed and open discussion between these often disjoint groups. Only such a discussion will provide a strong basis for mutual understanding and for achieving the second goal.

The second goal is to produce a scientific roadmap that evaluates where technology, as currently developed and foreseen, falls short of legal data-protection requirements and identifies directions for how these requirements could be met by adjusting the way the technology develops, for example by developing and integrating new privacy-enhancing technologies. This could, for example, focus on areas such as:

  • Establishment of Trust: A promising research direction is to investigate the integration of trusted-computing technologies and to (partially) shift trust from the back-end infrastructure to the edge (i.e., the vehicles). Another development we aim to discuss is how to leverage software-based Trusted Execution Environments (TEEs) to confine the processing of personal data within a secure enclave, verified by remote attestation to run a certified process that will not use personal data outside of the declared purpose.
  • Advanced AI Techniques: We identify two examples here that we want to analyze. First, we want to investigate advanced AI techniques such as federated learning, which lets developers train a shared model across decentralized devices or servers while each participant's dataset remains local. A second topic concerns the anonymization of video recorded by vehicle cameras. One way to respect privacy is to anonymize the recorded data immediately, e.g., by blurring faces and license plates. However, applying these techniques to the training set can degrade the environment-detection quality of vehicles to varying degrees, an effect that is not yet well understood.
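The federated-learning idea above can be illustrated with a toy sketch: each "vehicle" performs gradient steps on its private data, and only the resulting model parameters (never the raw data) are averaged by a server. The one-dimensional linear model, the function names, and the hyperparameters are illustrative assumptions, not part of the seminar material.

```python
# Hypothetical federated-averaging (FedAvg) sketch: clients train locally,
# the server aggregates parameters weighted by local dataset size.
# The toy model is y = w * x with squared-error loss.

def local_update(weights, data, lr=0.1, epochs=5):
    """One client's local gradient steps; raw (x, y) samples never leave it."""
    w = weights
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

def federated_average(global_w, client_datasets):
    """Server round: collect client models, average them by dataset size."""
    updates = [(local_update(global_w, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Toy run: three "vehicles" each hold private samples of the line y = 2x.
clients = [[(1.0, 2.0)], [(2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(20):
    w = federated_average(w, clients)
# w converges toward the true slope 2.0 without any client sharing its data.
```

In a realistic deployment the scalar `w` would be a neural-network weight vector, and techniques such as secure aggregation or differential privacy would additionally protect the transmitted updates, since model parameters themselves can leak information.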

One question is whether technical solutions can satisfy the variety of international legal requirements. The other question is whether such solutions are sufficiently mature and reliable to be integrated into a self-driving car.

We will therefore also address the question of whether these legal frameworks are sufficiently forward-looking to cover the new challenges created by AVs and intelligent transportation, or whether they would, in the worst case, even block technological progress by not allowing, for example, efficient collection of training material for machine-learning systems. We also ask whether the variety of data-protection regimes worldwide would be a hindrance and whether harmonization should be sought.

Motivation text license
  Creative Commons BY 4.0
  Frank Kargl, Ioannis Krontiris, André Weimerskirch, and Ian Williams


Classification
  • Computers And Society
  • Cryptography And Security
  • Emerging Technologies


Keywords
  • Privacy and Data Protection
  • Automotive Security and Privacy


All Dagstuhl Seminars and Dagstuhl Perspectives Workshops are documented in the series Dagstuhl Reports. Together with the seminar's collector, the organizers compile a report that summarizes the authors' contributions and adds an overall summary.


Download overview flyer (PDF).

Dagstuhl's Impact

Please let us know if a publication arises from your seminar. Such publications are listed separately in the Dagstuhl's Impact section and displayed on the ground floor of the library.


There is also the option of publishing a comprehensive collection of peer-reviewed papers in the series Dagstuhl Follow-Ups.