July 28 – August 2, 2013, Dagstuhl Seminar 13312
"My Life, Shared" - Trust and Privacy in the Age of Ubiquitous Experience Sharing
Alessandro Acquisti (Carnegie Mellon University, US)
Ioannis Krontiris (Goethe-Universität Frankfurt am Main, DE)
Marc Langheinrich (University of Lugano, CH)
Martina Angela Sasse (University College London, GB)
Advancements in smartphones and sensing technology have bolstered the creation and exchange of user-generated content, resulting in new information flows and data-sharing applications. Through such applications, personal mobile devices are used to uncover and share previously private elements of people's everyday experiences. Examples include using smartphones or wearable sensors to collect and share context information (e.g., activities, social context, sports performance, dietary or health concerns). These flows of personal information have two distinct characteristics: they happen seamlessly (in real time, not necessarily with the conscious participation of the user), and they are shared with a user's family, social circles, or even publicly.
This new paradigm repositions individuals as producers, consumers, and remixers of a vast set of data, with many potential economic and societal benefits. However, as sharing practices become more fluid than in desktop-based online environments, control over personal information flows becomes harder to maintain.
The goal of Dagstuhl Seminar 13312 "My Life, Shared" -- Trust and Privacy in the Age of Ubiquitous Experience Sharing was to advance a research agenda in trust and privacy that addresses not only the evolution of the pervasive technologies underlying these trends (e.g., smartphones, wearable sensors), but also the surrounding societal and economic context, and to identify the resulting qualitative changes to the privacy landscape.
With that in mind, the seminar created an interdisciplinary discussion forum and a set of organized presentations around four broad areas: 1) tools and protocols, 2) usability and control tools, 3) behavioural decisions, and 4) social implications. Each area saw a selected set of participants present their work and views in a short presentation, followed by an in-depth discussion session. From these discussions the organizers collected the main challenges and opportunities and grouped them around four major themes: "Personal Data Services", "Social Justice", "Tool Clinics", and "Consequence-based Privacy Decision-making". Each theme was subsequently discussed over one and a half days in four individual working groups, which presented their findings at the end of the seminar.
This report not only contains the abstracts of the initial presentations (section 3) but also the findings of the four thematic working groups. Below we summarize the main findings from these working groups -- a more analytical description can be found in section 5.
Theme 1: Personal Data Service (PDS)
A "Personal Data Service (PDS)" represents a trusted container for aggregating, storing, processing and exporting personal data. In principle, all data regarding the user (either user-generated or obtained from other sources, e.g. service providers) should be accessible to this container, including data about the user collected and published by others. Users are in control of all data stored in the PDS, which includes the option to share or sell parts of this data. In addition to storing data, the PDS can execute code to process this data locally.
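As an illustration, the PDS concept could be sketched as a minimal in-memory container. This is a hypothetical sketch only: the class and method names (`PersonalDataService`, `ingest`, `grant`, `export`, `run_local`) are our own and do not come from the seminar discussions, but they mirror the three properties described above — aggregation of data from any source, user-controlled sharing, and local processing so that only derived results leave the container.

```python
class PersonalDataService:
    """Hypothetical sketch of a PDS: a trusted container that aggregates
    personal data, enforces user-controlled sharing, and runs code locally."""

    def __init__(self, owner):
        self.owner = owner
        self._store = {}   # key -> data record
        self._grants = {}  # key -> set of parties the owner allows to read

    def ingest(self, key, record):
        """Aggregate data from any source (user-generated or third-party)."""
        self._store[key] = record

    def grant(self, key, party):
        """The owner decides who may access which data item."""
        self._grants.setdefault(key, set()).add(party)

    def export(self, key, party):
        """Share a data item only if the owner granted access to this party."""
        if party not in self._grants.get(key, set()):
            raise PermissionError(f"{party} has no grant for {key}")
        return self._store[key]

    def run_local(self, func, keys):
        """Process data inside the container: only the result leaves it,
        never the raw records."""
        return func([self._store[k] for k in keys])


# Usage: step counts stay in the container; family sees one granted record,
# and a third party could be given only the locally computed aggregate.
pds = PersonalDataService(owner="alice")
pds.ingest("steps/2013-07-28", 8500)
pds.ingest("steps/2013-07-29", 10200)
pds.grant("steps/2013-07-28", "family")
print(pds.export("steps/2013-07-28", "family"))                      # 8500
print(pds.run_local(sum, ["steps/2013-07-28", "steps/2013-07-29"]))  # 18700
```

The design choice worth noting is `run_local`: it captures the report's point that the PDS is more than storage, since processing happens where the data lives rather than at an external service provider.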
By considering both a household- and a health-related scenario, the working group identified some of the properties and functionalities of such a PDS and sketched a possible system architecture that would include such a container. In a detailed discussion of benefits and risks, the working group concluded that several issues still needed to be investigated, and real challenges addressed, before a PDS framework could be implemented and deployed, such as:
- Creating incentives for initial data providers to engage and open up the personal data APIs needed to fuel the PDS and associated applications.
- Creating utility from stored data: data fusion, sense making, and visualization that lead to meaningful, actionable, and sustainable engagement of end users with their data.
- Addressing privacy: even though a PDS can increase transparency, awareness, and users' engagement with their data, it is neither obvious nor guaranteed that a PDS will resolve users' privacy problems, and several of these remain open.
Theme 2: Social Justice
Privacy issues in participatory sensing are symptoms of broader concerns about the impact of sensing on social justice. Framing a social justice research agenda for participatory sensing requires the operationalization of concepts like fairness, human flourishing, structural change, and balances of power for system design, use, and regulation. The working group discussed how one might begin to operationalize these concepts for the design of data collection features, processing, sharing, and user interfaces. The group developed an analysis tool -- a social justice impact assessment -- to help system designers consider the social justice implications of their work during the design phase. The participants identified and presented several open questions that could spark future research, such as:
- If one assumes that participatory sensing will lead to greater transparency, will such transparency equally impact individuals, powerful people, and institutions?
- Do the powerful always end up subverting transparency schemes? Or can sensing change that tendency, for example by making facts visible to consumers and citizens, enabling organized responses (unionization)?
- What are the forums for encouraging collective action in participatory sensing? Can one encourage system designers to consider social justice during design by framing design as a collective action problem? Can participatory sensing open new avenues for consumers and citizens to organize collective action?
Theme 3: Tool Clinics
Privacy researchers and practitioners work largely in isolation, concentrating on people's use of different user interfaces for privacy control while ignoring existing cross-disciplinary collaboration techniques. A "tool clinic" could encourage a collaborative (re)consideration of a technological solution, research technique, or other artefact, in order to critically assess its design, development, and deployment from multiple perspectives. A tool clinic provides a setting for those developing such solutions to rethink their framing and presentation. The objective is to reflect from different perspectives on practices around the development, encoding, use, domestication, decoding, and sustainability of a tool, in order to gain quasi-ecological validation. The working group recommended developing the tool clinic as a new event format for a scientific conference, ideally a renowned computer-science conference. This would combine the tool-centric nature of a demo session, the protected work-in-progress space afforded by a workshop, and the mentoring spirit of a doctoral workshop. A tool clinic session could typically consist of three steps:
- Identifying particular affordances of the technological solution, research technique or other artefact and possible (unintended) consequences for people and society;
- Gathering perspectives and practices of different experts, disciplines and/or stakeholders (e.g. users, policy makers, industry, etc.) linked with the development, deployment and sustainable evolution of a particular tool, solution, technique or artefact;
- Informing and advising on the technological design of the tool or solution, in order to avoid negative consequences and to further positive outcomes.
Theme 4: Consequence-based Privacy Decisions
Recent research shows that people not only want to control their privacy but are actually trying to do so. An appropriate privacy-respectful user interface should thus show users the consequences of different privacy choices, rather than framing the choices only in technical terms of system parameters, which users often neither understand nor care about. Providing tools that increase user comprehension of potential consequences is one of the next big challenges in the field of privacy-respectful user interfaces. Beyond helping users make choices that better protect their privacy, such tools would allow users to make better-informed decisions and hence implement the notion of informed consent. Developing user interaction in this direction requires research on a number of issues that have so far received relatively little attention, such as:
- Expression of potential consequences: The consequences should be expressed in a way that is comprehensible to different user categories, from novices to experts.
- Decision support: Users could be further helped in their privacy decisions by external information sources. Studies will be necessary to determine how different groups of users respond to different kinds of information sources and different formats.
- Minimal effort: Introducing additional tools to help users make informed decisions may add significant overhead to the interaction. While this overhead may be the price to pay for better privacy protection, it should be kept to a minimum.
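The first point above — expressing consequences rather than system parameters — could be sketched as a simple lookup from technical settings to consequence statements. Everything in this sketch is illustrative: the `explain_choice` function, the setting names, and the wording of the consequences are our own assumptions, not outcomes of the seminar.

```python
# Hypothetical mapping from a technical privacy parameter (data type,
# visibility setting) to a consequence statement a user can act on.
CONSEQUENCES = {
    ("location", "public"): "Anyone on the web can see where you are right now.",
    ("location", "friends"): "Only your approved contacts can see your location.",
    ("location", "off"): "Your location is not shared with anyone.",
}

def explain_choice(data_type, setting):
    """Return a consequence-based description of a privacy setting,
    instead of exposing the raw technical parameter to the user."""
    return CONSEQUENCES.get(
        (data_type, setting),
        f"No consequence description available for {data_type}/{setting}.",
    )

# A settings dialog would show this text next to each option.
print(explain_choice("location", "public"))
```

A real system would of course need the consequence texts to be validated with the different user categories mentioned above, from novices to experts.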
Creative Commons BY 3.0 Unported license
© Alessandro Acquisti, Ioannis Krontiris, Marc Langheinrich, and Martina Angela Sasse
Classification:
- Security / Cryptology
- Society / Human-computer Interaction
- World Wide Web / Internet

Keywords:
- Data loss prevention
- Informational self-determination
- Web 2.0
- Mobile Internet