Dagstuhl Seminar 13312
"My Life, Shared" – Trust and Privacy in the Age of Ubiquitous Experience Sharing
(Jul 28 – Aug 02, 2013)
- Alessandro Acquisti (Carnegie Mellon University, US)
- Ioannis Krontiris (Goethe-Universität Frankfurt am Main, DE)
- Marc Langheinrich (University of Lugano, CH)
- Martina Angela Sasse (University College London, GB)
- Annette Beyer (for administrative matters)
- 'My Life, Shared' - Trust and Privacy in the Age of Ubiquitous Experience Sharing (Dagstuhl Seminar 13312). Alessandro Acquisti, Ioannis Krontiris, Marc Langheinrich, and Martina Angela Sasse. In Dagstuhl Reports, Volume 3, Issue 7, pp. 74-107, Schloss Dagstuhl - Leibniz-Zentrum für Informatik (2013)
Advancements in smartphones and sensing technology have bolstered the creation and exchange of user-generated content, resulting in new information flows and data-sharing applications. Through such applications, personal mobile devices are used to uncover and share previously private elements of people’s own everyday experiences. Examples include using smartphones or wearable sensors to share context information (activities, social context, sports performance, dietary or health concerns, etc.). These flows of personal information have two distinct characteristics: they happen seamlessly (in real time, without necessarily the conscious participation of the user), and they are shared with other people in the same social circle, or even beyond.
This new paradigm repositions individuals as producers, consumers, and remixers of a vast set of data with many potential economic and societal benefits. On the other hand, as sharing practices become more fluid than in desktop-based online environments, control over personal information flows becomes harder to maintain.
The goal of this seminar is to advance a research agenda in privacy that addresses not only the evolution of the pervasive technologies underlying these trends (e.g., smartphones, wearable sensors), but also the surrounding societal and economic context, and to identify the resulting qualitative changes to the privacy landscape. In such environments - where it is hard to define static, concrete groups of trusted recipients - enabling individuals to control to whom they disclose their personal data requires investigating the risk/benefit trade-off associated with any kind of information disclosure, as perceived by the mobile device owners. The risk is associated with the trust placed in the various information receivers, while the benefits depend on the perceived value of the corresponding disclosure incentives.
Towards this goal, the seminar will bring together scientists from different disciplines to tackle concrete research questions, including the following:
- Which new collection and sharing practices of personal data emerge from the widespread adoption of mobile devices and their capabilities?
- Which new privacy threats result from this, and what are the limitations of applying existing privacy enhancing technologies to mitigate them?
- Who are the stakeholders involved in this process, and how can their costs and benefits be balanced, given their conflicting interests?
- What new types of feedback and control tools are needed, and what privacy user interfaces do we need to design?
- Is it possible that new technological approaches can help to reconcile individual privacy interests and sustainable business models?
The Dagstuhl Seminar will create an interdisciplinary discussion forum with the additional long-term goal of stimulating research into the various changes that these new ubiquitous services induce, both in people’s perceived privacy concerns and in their actual behavior. The results of these discussions may inform legislation and regulation of privacy on the one hand, and technology for enforcing users’ privacy preferences on the other, given the need of both domains to keep up with data collection practices in pervasive environments.
Advancements in smartphones and sensing technology have bolstered the creation and exchange of user-generated content, resulting in new information flows and data-sharing applications. Through such applications, personal mobile devices are used to uncover and share previously private elements of people's own everyday experiences. Examples include using smartphones or wearable sensors to collect and share context information (e.g., activities, social context, sports performance, dietary or health concerns). These flows of personal information have two distinct characteristics: they happen seamlessly (in real time, without necessarily the conscious participation of the user), and they are shared with a user's family, social circles, or even publicly.
This new paradigm repositions individuals as producers, consumers, and remixers of a vast set of data with many potential economic and societal benefits. However, as sharing practices become more fluid than in desktop-based online environments, control over personal information flows becomes harder to maintain.
The goal of Dagstuhl Seminar 13312 "My Life, Shared" -- Trust and Privacy in the Age of Ubiquitous Experience Sharing was to advance a research agenda in trust and privacy that addresses not only the evolution of the pervasive technologies underlying these trends (e.g., smartphones, wearable sensors), but also the surrounding societal and economic context, and to identify the resulting qualitative changes to the privacy landscape.
With that in mind, the seminar created an interdisciplinary discussion forum and a set of organized presentations around four broad areas: 1) tools and protocols, 2) usability and control tools, 3) behavioral decisions, and 4) social implications. In each area, a selected set of participants presented their work and views in a short presentation, followed by an in-depth discussion session. From these discussions the organizers collected the main challenges and opportunities and grouped them into four major themes: "Personal Data Services", "Social Justice", "Tool Clinics", and "Consequence-based Privacy Decision-making". Each theme was subsequently discussed over one and a half days in four individual working groups, which presented their findings at the end of the seminar.
This report contains not only the abstracts of the initial presentations (section 3) but also the findings of the four thematic working groups. Below we summarize the main findings of these working groups; a more detailed description can be found in section 5.
Theme 1: Personal Data Service (PDS)
A "Personal Data Service" (PDS) represents a trusted container for aggregating, storing, processing, and exporting personal data. In principle, all data regarding the user (either user-generated or obtained from other sources, e.g., service providers) should be accessible to this container, including data about the user collected and published by others. Users are in control of all data stored in the PDS, which includes the option to share or sell parts of it. In addition to storing data, the PDS can execute code to process this data locally.
By considering both a household- and a health-related scenario, the working group identified some of the properties and functionalities of a PDS and sketched a possible system architecture around such a container. In a detailed discussion of benefits and risks, the working group concluded that several issues still need to be investigated and real challenges addressed before a PDS framework can be implemented and deployed, such as:
- Creating incentives for initial data providers to engage and to open up the personal-data APIs needed to fuel the PDS and associated applications.
- Creating utility from stored data: data fusion, sense-making, and visualization that lead to meaningful, actionable, and sustainable engagement of end users with their data.
- Addressing privacy: even though a PDS can increase transparency, awareness, and users' engagement with their data, it is neither obvious nor guaranteed that a PDS will resolve users' privacy problems, and several of these remain open.
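To make the PDS concept concrete, the core operations described above (user-controlled storage, explicit sharing grants, and local processing so raw data never leaves the container) could be sketched as follows. This is an illustrative sketch only, not part of the report; all class and method names are invented.

```python
# Hypothetical sketch of a Personal Data Service (PDS): a user-controlled
# container that stores personal data, enforces explicit sharing grants,
# and runs processing code locally instead of exporting raw data.
from typing import Any, Callable, Dict, Set


class PersonalDataService:
    def __init__(self) -> None:
        self._store: Dict[str, Any] = {}        # all data stays local
        self._grants: Dict[str, Set[str]] = {}  # key -> recipients allowed

    def ingest(self, key: str, value: Any) -> None:
        """Store user-generated data or data imported from a provider."""
        self._store[key] = value

    def grant(self, key: str, recipient: str) -> None:
        """The user explicitly allows 'recipient' to read 'key'."""
        self._grants.setdefault(key, set()).add(recipient)

    def export(self, key: str, recipient: str) -> Any:
        """Share data only if the user has granted access to this recipient."""
        if recipient not in self._grants.get(key, set()):
            raise PermissionError(f"{recipient} may not read {key}")
        return self._store[key]

    def process_locally(self, key: str, fn: Callable[[Any], Any]) -> Any:
        """Run code over the data inside the container; raw data never leaves."""
        return fn(self._store[key])


# Example: step counts stay in the container; only an aggregate is computed,
# and the raw series is exported only to an explicitly granted recipient.
pds = PersonalDataService()
pds.ingest("steps", [4200, 5100, 6100])
pds.grant("steps", "my_doctor")
avg_steps = pds.process_locally("steps", lambda xs: sum(xs) / len(xs))
```

The sketch highlights the two control points the working group's discussion hinges on: every export is gated by a user-issued grant, and computation can be brought to the data rather than the reverse.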
Theme 2: Social Justice
Privacy issues in participatory sensing are symptoms of broader concerns about the impact of sensing on social justice. Framing a social justice research agenda for participatory sensing requires the operationalization of concepts like fairness, human flourishing, structural change, and balances of power for system design, use, and regulation. The working group discussed how one might begin to operationalize these concepts for the design of data collection features, processing, sharing, and user interfaces. The group developed an analysis tool -- a social justice impact assessment -- to help system designers consider the social justice implications of their work during the design phase. The participants identified and presented several open questions that could spark future research, such as:
- If one assumes that participatory sensing will lead to greater transparency, will such transparency equally impact individuals, powerful people, and institutions?
- Do the powerful always end up subverting transparency schemes? Or can sensing change that tendency, for example by making facts visible to consumers and citizens, enabling organized responses (unionization)?
- What are the forums for encouraging collective action in participatory sensing? Can one encourage system designers to consider social justice during design by framing design as a collective action problem? Can participatory sensing open new avenues for consumers and citizens to organize collective action?
Theme 3: Tool Clinics
Privacy researchers and practitioners work largely in isolation, concentrating on people's use of different user interfaces for privacy control while ignoring existing cross-disciplinary collaboration techniques. A "tool clinic" could encourage a collaborative (re)consideration of a technological solution, research technique, or other artefact, in order to critically assess its design, development, and deployment from multiple perspectives. A tool clinic can provide a setting for those who develop such solutions to rethink their framing and presentation. The objective is to reflect from different perspectives on practices around the development, encoding, use, domestication, decoding, and sustainability of a tool in order to gain quasi-ecological validation. The working group recommended developing the tool clinic as a new event format for a scientific conference, ideally a renowned computer-science conference. This would combine the tool-centric nature of a demo session, the protected work-in-progress space of a workshop, and the mentoring spirit of a doctoral workshop. The format of a tool clinic session could typically consist of three steps:
- Identifying particular affordances of the technological solution, research technique or other artefact and possible (unintended) consequences for people and society;
- Gathering perspectives and practices of different experts, disciplines, and/or stakeholders (e.g., users, policy makers, industry) linked with the development, deployment, and sustainable evolution of a particular tool, solution, technique, or artefact;
- Informing and advising on the technological design of the tool or solution, in order to avoid negative consequences and to further positive outcomes.
Theme 4: Consequence-based Privacy Decisions
Recent research shows that people not only want to control their privacy but actually try to do so. An appropriate privacy-respectful user interface should thus show users the consequences of different privacy choices, rather than framing the choices only in technical terms of system parameters, which users often neither understand nor care about. Providing tools that increase user comprehension of potential consequences is one of the next big challenges in the field of privacy-respectful user interfaces. In addition to helping users make choices that better protect their privacy, such tools would also allow them to make better-informed decisions and hence implement the notion of informed consent. Developing user interaction in this direction requires research on a number of issues that have so far received relatively little attention, such as:
- Expression of potential consequences: The consequences should be expressed in a way that is comprehensible to different user categories, from novices to experts.
- Decision support: Users could be further helped in their privacy decisions by external information sources. Studies will be necessary to determine how different groups of users respond to different kinds of information sources and formats.
- Minimal effort: Introducing additional tools to help users make informed decisions may add significant overhead to the interaction. While this overhead may be the price to pay for better privacy protection, it should be kept to a minimum.
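The core idea of consequence-based framing, replacing a technical parameter with a description of what could actually happen, can be sketched minimally as a mapping from settings to consequence texts. This sketch is illustrative only; the setting names and wording are invented examples, not from the report.

```python
# Hypothetical sketch of consequence-based privacy framing: rather than
# exposing a technical parameter ("location granularity = 3"), each choice
# is translated into a plain-language description of its consequences.
CONSEQUENCES = {
    "share_location_exact": "Apps can see which building you are in.",
    "share_location_city": "Apps only learn which city you are in.",
    "share_location_off": "Location-based features will not work.",
}


def describe_choice(setting: str) -> str:
    """Return the consequence text a user would see for a given setting."""
    return CONSEQUENCES.get(setting, "Consequences of this setting are unknown.")


# A settings screen would render these descriptions next to each option,
# so the user chooses between outcomes, not parameter values.
text = describe_choice("share_location_city")
```

Even this trivial mapping makes the three research issues above concrete: the wording must suit novices and experts alike, the texts could draw on external information sources, and looking them up must add negligible interaction overhead.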
- Alessandro Acquisti (Carnegie Mellon University, US) [dblp]
- Mads Schaarup Andersen (Aarhus University, DK) [dblp]
- Zinaida Benenson (Universität Erlangen-Nürnberg, DE) [dblp]
- Bettina Berendt (KU Leuven, BE) [dblp]
- Claudio Bettini (University of Milan, IT) [dblp]
- Rainer Böhme (Universität Münster, DE) [dblp]
- Ian Brown (University of Oxford, GB) [dblp]
- Claude Castelluccia (INRIA - Grenoble, FR) [dblp]
- Delphine Christin (TU Darmstadt, DE) [dblp]
- Alexander De Luca (LMU München, DE) [dblp]
- Tassos Dimitriou (Athens Information Technology, GR) [dblp]
- Frank Dürr (Universität Stuttgart, DE) [dblp]
- Deborah Estrin (Cornell Tech NYC, US) [dblp]
- Simone Fischer-Hübner (Karlstad University, SE) [dblp]
- Michael Friedewald (Fraunhofer ISI - Karlsruhe, DE) [dblp]
- Raghu K. Ganti (IBM TJ Watson Research Center - Yorktown Heights, US) [dblp]
- Jens Grossklags (Pennsylvania State University, US) [dblp]
- Seda F. Gürses (KU Leuven, BE) [dblp]
- Thomas Heimann (Google - München, DE)
- Ioannis Krontiris (Goethe-Universität Frankfurt am Main, DE) [dblp]
- Marc Langheinrich (University of Lugano, CH) [dblp]
- René Mayrhofer (University of Applied Sciences Upper Austria, AT) [dblp]
- Joachim Meyer (Tel Aviv University, IL) [dblp]
- Anthony Morton (University College London, GB) [dblp]
- David Phillips (University of Toronto, CA) [dblp]
- Jo Pierson (Free University of Brussels, BE) [dblp]
- Sören Preibusch (Microsoft Research UK - Cambridge, GB) [dblp]
- Kai Rannenberg (Goethe-Universität Frankfurt am Main, DE) [dblp]
- Norman Sadeh (Carnegie Mellon University - Pittsburgh, US) [dblp]
- Martina Angela Sasse (University College London, GB) [dblp]
- Marcello Paolo Scipioni (University of Lugano, CH) [dblp]
- Katie Shilton (University of Maryland - College Park, US) [dblp]
- Sarah Spiekermann-Hoff (Universität Wien, AT) [dblp]
- security / cryptology
- society / human-computer interaction
- world wide web / internet
- data loss prevention
- informational self-determination
- Web 2.0
- mobile Internet