Technology is putting our everyday lives under continuous scrutiny, such as through monitoring and surveillance by our phones or by sensors in smart cities. At the same time, the actuation capabilities of the emerging Internet of Things add a new dimension, by allowing systems to directly affect the physical world. This, coupled with the integration of data analytics and machine learning techniques into system workflows, is moving us into an increasingly automated world.
It follows that legal and policy concerns regarding technology are growing in salience and prominence; there are real issues regarding privacy, transparency, agency and safety as they relate to the systems underpinning society, and to the data that drives them.
Accountability, however, is hindered by the nature of the technology. Systems tend to be "black boxes" that operate in a manner "invisible" to those who are affected by them. Data can easily move across both administrative and geo-political boundaries, often without a trace. The structure and composition of the systems-of-systems involved can be dynamic and complex, and the internals of data analytics techniques are often opaque. This means that even where regulations are fit for purpose, ascertaining compliance is difficult; yet accountability remains essential.
This Dagstuhl Seminar brings together experts from the computer science and law communities, from academia and industry, to explore these important and timely challenges. Specifically, the aim is to both raise awareness of and establish new research directions concerning the accountability of systems, given directions in systems technologies; developing legal and regulatory requirements; and evolving user expectations.
Law, regulation and requirements for data management, security, confidentiality, quality and provenance should be defined with technological enforcement in mind: technologists should be legally-aware and lawyers should be technology-aware.
The more specific goals of the seminar concern providing the foundations for moving forward, and are envisaged to include:
- developing a series of concrete use-cases to illustrate the issues and challenges and to shape future research directions regarding accountable systems;
- categorising the various challenges as a taxonomy or ontology, with specific use-cases provided for the sake of illustration;
- identifying the opportunities and gaps for data management and security technologies for aligning with current and emerging law and regulation (and vice-versa); and
- considering the consequences of an increasingly automated ("algorithmic") society and the physicality of systems, to identify the legal/social transparency and control requirements in light of the potential harms that can result, and their relation to technical means for inspection and intervention.
Background and Motivation
Technology is becoming increasingly pervasive, impacting all aspects of everyday life. Our use of apps and online services is tracked and extensively processed (data analytics), and the results are used for various purposes, predominantly advertising. Monitoring and surveillance by sensors in smart cities creates vast amounts of data, much of which can be identifiably linked with people. Smart home, health and lifestyle monitoring, and other sensor technologies yield sensitive personal data; mobile phones reveal people's positions, and their calls are tracked, yielding data that can be used to determine social linkages and sometimes mental wellbeing. Such collection and analysis of personal data raises serious privacy concerns. A key aspiration is to provide end-users with a means to understand their digital footprints, and to control the propagation, aggregation and retention of their data.
Concerns over data movement, location, processing and access have led to increasing regulation, both national and international. An example is the recently adopted EU General Data Protection Regulation (GDPR), which reinforces and expands individual rights, as well as restrictions and obligations regarding personal data. However, data moves easily beyond geographical boundaries, and the use of cloud computing resources can mean that stored data is replicated in multiple locations worldwide, with the potential for conflicts between applicable laws and jurisdictions. Governments may demand access to data (whether stored locally or remotely), and this may result in complex legal disputes. Regulations, codes of conduct, and best practices can incentivise the use of particular technical mechanisms for data management, such as encryption and anonymisation (for instance, when using medical data for research). However, there are often misalignments between legal/regulatory aims and the capabilities of the technologies.
Key issues concern how to demonstrate compliance with regulations, such as those regarding how data is handled and used, and, in cases of failure, how to hold the appropriate entities accountable. This is a particular challenge for wide-scale, federated, or cross-border systems. In large or complex systems, data may be handled by many different parties, falling under various management regimes and jurisdictions. Such concerns are not only horizontal (e.g., data being exchanged between parties, across geographic regions) but also vertical, where different levels of the service stack are managed by different parties (e.g., a company application running over a Heroku PaaS that runs over Amazon IaaS). Most end-users (people!) are oblivious to the potential complexity of such systems, let alone the complexity of the legal requirements that underpin such architectures. In general, the lack of transparency and the uncertainty about the means for compliance with legal obligations, along with a lack of technical means for managing such concerns, may inhibit innovative technology development (a "chilling factor"), escalate compliance costs, trigger inappropriate policy responses, and undermine public trust in technology.
These concerns will only grow in prominence, given the increasing deployment of sensors, generating ever-more data; actuators, giving systems physical effects; and the use of machine learning, facilitating automation. In response, this seminar brought together experts from the computer science and legal communities, spanning academia and industry, to explore issues of accountability as they relate to data and systems. The seminar aimed to: (i) raise awareness of and establish new research directions concerning the accountability of systems, given directions in systems technologies; (ii) explore developing legal and regulatory requirements; and (iii) investigate issues of user empowerment. A key goal was to increase awareness that law, regulation and requirements for data usage, management, security, confidentiality, quality and provenance should align with the technology, and vice versa: technologists should be legally-aware and lawyers should be technology-aware.
Due to the diverse backgrounds of the participants, the first day was focused on introductions and ensuring that everyone had a common grounding in key topics. This included a series of guided discussion sessions: Lilian Edwards provided an introduction to legal and regulatory considerations, particularly the European Union General Data Protection Regulation (GDPR); Jon Crowcroft introduced emerging technical architectures such as edge computing; Bertram Ludäscher led a session exploring data provenance; and Ben Wagner introduced broader ethical and social concerns. A motivating case study was also presented highlighting how an apparently enthusiastic view of emerging Internet of Things technologies might obscure a plethora of questionable social and policy implications.
The structure of the week included multiple breakout sessions in which working groups examined particular topics (below) and reported summaries of their discussions back at plenary sessions. The working group sessions were interspersed with an interactive case study session that focused on the technological compliance concerns of a hypothetical global hotel chain seeking to introduce a series of IoT and cloud technologies in the current regulatory environment, and a session in which participants were able to present their recent research, abstracts for (most of) which are included in this report.
The topics explored by the working groups at the seminar spanned policy, legal and technical considerations. The topics were seeded by the organisers but were ultimately gathered from the participants through a preference allocation process. The chosen topics included:
- Trust in systems.
- Who is, could or should be accountable in complex systems?
- Engineering accountable systems.
- Is there a place for data provenance in accountable systems?
- Anonymity, identity and accountability.
- Thinking beyond consent.
- Automating the exercising of rights for collective oversight.
Each group was asked to produce an abstract summarising the key issues, challenges and ways forward from the discussion. These abstracts are included in this report, and indicate many potential opportunities for research.
Generally, it was felt that this seminar represented only the start of this important discussion. It is clear that there is a substantial and urgent need for closer interaction between the technical and legal domains, such that (i) the computer science communities better understand the legal requirements and constraints that impact the design, implementation and deployment of technology; and (ii) the legal communities gain more of a grounding in the nature, capabilities, and potential of the technology itself. It was also recognised that there is potential for better collaboration amongst different computer science communities; for example, greater interaction between those working in systems, provenance and machine learning.
In light of this, the key next steps are to form collaborative research proposals and to organise relevant meetings, in order to drive progress on the topics, challenges and research opportunities identified during this seminar. As issues of accountability increase in importance and urgency, it is vital that researchers across academia, industry and civil society work together to proactively confront these challenges.
- Virgilio Almeida (Federal University of Minas Gerais-Belo Horizonte, BR) [dblp]
- Jean Bacon (University of Cambridge, GB) [dblp]
- Jennifer Cobbe (University of Cambridge, GB) [dblp]
- Jon Crowcroft (University of Cambridge, GB) [dblp]
- Lilian Edwards (The University of Strathclyde - Glasgow, GB) [dblp]
- David Eyers (University of Otago, NZ) [dblp]
- Krishna P. Gummadi (MPI-SWS - Saarbrücken, DE) [dblp]
- Tristan Henderson (University of St Andrews, GB) [dblp]
- Martin Henze (RWTH Aachen, DE) [dblp]
- Melanie Herschel (Universität Stuttgart, DE) [dblp]
- Heleen Louise Janssen (Ministry of the Interior and Kingdom Relations, NL)
- Joshua A. Kroll (University of California - Berkeley, US) [dblp]
- Bertram Ludäscher (University of Illinois at Urbana-Champaign, US) [dblp]
- Derek McAuley (University of Nottingham, GB) [dblp]
- Christopher Millard (Queen Mary University of London, GB) [dblp]
- Ken Moody (University of Cambridge, GB) [dblp]
- Maximilian Ott (CSIRO - Alexandria, AU) [dblp]
- Frank Pallas (TU Berlin, DE) [dblp]
- Thomas Pasquier (University of Cambridge, GB) [dblp]
- Silvia Puglisi (The Tor Project - Seattle, US) [dblp]
- Margo Seltzer (Harvard University - Cambridge, US) [dblp]
- Jatinder Singh (University of Cambridge, GB) [dblp]
- Barbara Staudt Lerner (Mount Holyoke College - South Hadley, US) [dblp]
- Michael Veale (University College London, GB) [dblp]
- Ben Wagner (Wirtschaftsuniversität Wien, AT)
- Michael Winikoff (University of Otago, NZ) [dblp]
- Martina Zitterbart (KIT - Karlsruher Institut für Technologie, DE) [dblp]
- security / cryptology
- society / human-computer interaction
- world wide web / internet
- security and privacy
- law and regulation
- cloud computing
- Internet of Things
- compliance and audit