- On-Body Interaction: Embodied Cognition Meets Sensor/Actuator Engineering to Design New Interfaces (Dagstuhl Seminar 18212). Kasper Hornbaek, David Kirsh, Joseph A. Paradiso, and Jürgen Steimle. In Dagstuhl Reports, Volume 8, Issue 5, pp. 80-101, Schloss Dagstuhl - Leibniz-Zentrum für Informatik (2018)
On-body technologies are emerging as a new paradigm in human-computer interaction. Instead of moving a mouse or tapping a touch surface, people can use whole-body movements to navigate in games, gesture in mid-air to interact with large displays, or touch their forearm to control a mobile phone. Promising first applications have been investigated or demonstrated in mobile computing, healthcare, and sports.
Two areas of research have been contributing to this paradigm. Research on embodied cognition suggests that the body should no longer be treated as a passive operator of input devices but as something that needs to be carefully designed for, and as something that offers unique new possibilities in interaction. Embodied cognition has become a prominent candidate for outlining what we can and cannot do in on-body interaction. Research on interactive technologies for the body is opening up new avenues for human-computer interaction by contributing body-based input sensing and output modalities with more body-compatible form factors. Together, these areas allow the design and implementation of new user interfaces; however, they are rarely in direct contact with each other.
The intended outcome of the seminar is a research agenda for on-body technologies based on synergies between these two views. We therefore aim to bring together a group of researchers from embodied cognition (including psychology, robotics, human-computer interaction, and sociology) as well as sensor/actuator engineering (including computer science, materials science, electrical engineering). We aim to have these groups outline a research agenda for on-body technologies, in part using a bottom-up process at the seminar, in part using structured answers to questions in advance of the seminar. Key topics for discussion include, but are not limited to, (1) advances in on-body sensors and actuators, in particular how to drive the technical development from work on embodied cognition and the body, (2) cognitive consequences of on-body technologies, (3) how to take the peculiarities and possibilities of the body into consideration, (4) how to evaluate on-body technology, and (5) application areas of on-body technologies. To share the results with the larger research community, we aim to outline a special issue of a journal or an edited book on body-based user interfaces, spanning the disciplines and topic areas mentioned above. We find that the time is right to bring together a multi-disciplinary group of researchers to map out a research agenda and to understand how experts from different fields will need to work in tandem to develop these novel on-body interfaces.
For the past 40 years, input to computers has been given mainly via mouse and keyboard. Over the last decade, multi-touch has become popular for small devices (e.g., phones and tablets) as well as for large displays (e.g., interactive tabletops and wall-sized screens). All these forms of input require the user to hold or touch a device. Output, meanwhile, has happened on large screens external to the body (e.g., a desktop monitor) or small ones on the body (e.g., smartwatches). The field of human-computer interaction (HCI) has worked to understand these user interfaces (UIs) and how people use them, in addition to establishing principles of design and models of performance that help make them useful and usable.
Recently, however, HCI researchers have been interested in enabling new forms of on-body technologies. One vision is to integrate technology with the body so as to use and supplement its capabilities. In particular, researchers have focused on sensing users' movements and gestures, aiming to allow users to interact using their body rather than a device. Early work included Bolt's Put-That-There system developed in the late 1970s, and recent advances in computer vision have allowed the tracking of users' hands, arms, and bodies, leading to a flurry of motion-based gaming controls and inventive, body-based games. The number and variety of research prototypes of non-device UIs have also exploded over the past few years, showing how movements in front of a large display can control navigation, how users can gesture in mid-air, how scratching or poking the skin of one's forearm can be a means of input, and how electrical muscle stimulation can be used to move users' limbs as output. Further, HCI researchers have been exploring the theoretical opportunities in using the body for interaction, describing principles for whole-body interaction, embodied interaction, and body-centric interaction, as well as highlighting some of the philosophical and psychological challenges associated with using the body as an interface. Promising first applications have been investigated or demonstrated in mobile computing, healthcare, and sports. A new UI paradigm seems to be emerging.
The main objective of the seminar was to explore on-body interaction through two research areas: embodied cognition and sensor/actuator engineering. The former has driven much of the thinking and many of the models around on-body technologies and the potential of body-based interaction. The latter has been behind many of the sensors and actuators that have enabled prototypes to be built and have demonstrated the potential of on-body technology. We pursued this objective in two steps. First, we brought together a group of researchers from embodied cognition (including psychology, robotics, human-computer interaction, art/design, and sociology) as well as sensor/actuator engineering (including computer science, materials science, and electrical engineering). Second, we had this diverse group of researchers outline a research agenda for on-body technologies, in part using a bottom-up process at the seminar, in part using structured answers to questions in advance of the seminar.
In line with the objectives above, the seminar focused on three areas of investigation:
- Embodied Cognition: Embodied cognition is a term covering research in linguistics, robotics, artificial intelligence, philosophy, and psychology (e.g., Anderson 2003, Wilson 2002). The core idea in embodied cognition is that our bodies shape thinking broadly understood (including reasoning, memory, and emotion). In contrast to most psychological foundations of HCI, embodied cognition argues that one cannot study the human as a system comprising input (senses), processing (thinking), and output (motor activity), because sensorimotor activity affects thinking fundamentally and, conversely but less radically, because our body reflects more about our thinking than is commonly expected. Thus, bodies and thinking are intertwined, as reflected in embodied cognition book titles like "How the Body Shapes the Way We Think" and "How the Body Shapes the Mind". Embodied cognition has become a prominent candidate for outlining what we can and cannot do in on-body interaction.
- Sensor/Actuator Engineering: The engineering of technologies that transform the human body into an interface is a very active research area. A widely used approach applies techniques from visual computing to capture body gestures and touch input on the body using RGB or depth cameras, while projecting visual output with a body-worn projector. Other approaches build on the transdermal propagation of ultrasound or electromagnetic waves to identify the location of touch contact on human skin. Electromyography (EMG) can be used to capture human muscle activity, while electrical muscle stimulation (EMS) can generate muscle output. Radar is another technology that has very recently been successfully demonstrated for capturing gestural input. A further recent strand of research uses slim skin electronics for sensing and output on the body. These technologies are opening up new avenues for human-computer interaction by contributing body-based input sensing and output modalities with increasing resolution and more body-compatible form factors.
- New On-Body Technologies: This area concerns how we can combine embodied cognition and sensor/actuator engineering to design on-body technologies. The design of on-body technologies was a key discussion topic, in particular, how to drive the technical development from work on embodied cognition and the body, how to evaluate on-body technology, and how to take the peculiarities and possibilities of the body into consideration. The application areas of on-body technologies were another consideration.
The first day of the seminar was reserved for presentations, to establish common ground for discussions. All participants introduced themselves, their background, and their vision in short position talks.
Four long talks reviewed the state of the art and presented recent work in key areas. In his talk "Embodied Cognition: What does having a body give us?", David Kirsh focused on four topics: Effectivity, Enactive Perception, Interactive Cognition, and Experience. They all explore what having a body gives us that goes beyond just having a sensor in space. Katia Vega's talk, entitled "Beauty Technologies", focused on the possibilities of embedding technology on and inside the skin. Nadia Bianchi-Berthouze gave a talk entitled "The Affective Body in Interaction", discussing the high-level principles of affective computing and of creating affect-aware computing technology for the body, which involves sensing users' affect and emotion and using them for interaction. In his talk "Cosmetic Computing: Actions and Urgencies towards an Inclusive, Equitable Landscape of On-Body Technologies", Eric Paulos stressed the need for transdisciplinary and interdisciplinary approaches and proposed a framing around "Cosmetic Computing".
The evening featured a demo session. An impressive total of eight interactive demos and exhibits were presented in the historical ambiance of the Rococo-style music hall. The demos comprised, among others, e-textiles, interactive tattoos and make-up, new bio-inspired materials, and tactile actuation technologies.
The second day consisted of work in breakout groups. First, groups identified challenges for future work in the field of on-body interaction, grouped into four main areas: Integration of the body and the device; Cognition and Affect; Interaction; and Applications. Next, the participants worked together to identify positive visions of a future with body-based interfaces. Promising aspects that were identified include sensory augmentation of the human body for graceful ageing, personalized medication, and the idea of a legal/democratic framework for controlling wearable technology. To identify potential risks associated with body-based technologies and interaction, the group also developed negative visions. Key problems and risks that were identified include a loss of physical embodiment and the substantial security risks of our bodies (and potentially even emotions) being externally controlled.
In a session entitled "academic speed dating", we randomly paired participants with each other. Their goal was to develop, within 7 minutes, an idea and a title for a paper they would write together. The format turned out to be very well received and to stimulate research ideas at unforeseen intersections of the participants' interests and expertise.
The seminar set out to bring together diverse researchers to discuss the overlap between embodied cognition and sensor/actuator engineering. The group managed to cover advances in on-body sensors and actuators, some of the cognitive consequences of on-body technologies, and open issues in applications of on-body technologies. Further, a range of open problems and exciting research questions were discussed, which will likely foster future collaboration and serve as a generator of future research on on-body technologies.
- Shaun Gallagher. How the body shapes the mind. Clarendon Press, 2006.
- Rolf Pfeifer and Josh Bongard. How the body shapes the way we think: a new view of intelligence. MIT Press, 2006.
- Eduard Arzt (INM - Saarbrücken, DE) [dblp]
- Nadia Bianchi-Berthouze (University College London, GB) [dblp]
- Liwei Chan (National Chiao-Tung University - Hsinchu, TW) [dblp]
- Clément Duhart (MIT - Cambridge, US) [dblp]
- Michael Haller (University of Applied Sciences Upper Austria, AT) [dblp]
- Nur Hamdan (RWTH Aachen, DE)
- Kayla J. Heffernan (The University of Melbourne, AU) [dblp]
- Christian Holz (Microsoft Research - Redmond, US) [dblp]
- Kasper Hornbaek (University of Copenhagen, DK) [dblp]
- David Kirsh (University of California - San Diego, US) [dblp]
- Antonio Krüger (DFKI - Saarbrücken, DE) [dblp]
- Pedro Lopes (Hasso-Plattner-Institut - Potsdam, DE) [dblp]
- Paul Lukowicz (DFKI - Kaiserslautern, DE) [dblp]
- Karon MacLean (University of British Columbia - Vancouver, CA) [dblp]
- Aditya Shekhar Nittala (Universität des Saarlandes, DE) [dblp]
- Joseph A. Paradiso (MIT - Cambridge, US) [dblp]
- Eric Paulos (University of California - Berkeley, US) [dblp]
- Anastasia Pavlidou (MPI für biologische Kybernetik - Tübingen, DE)
- Henning Pohl (University of Copenhagen, DK) [dblp]
- Ivan Poupyrev (Google Inc. - Mountain View, US) [dblp]
- Joan Sol Roo (Universität des Saarlandes, DE) [dblp]
- Chris Schmandt (MIT - Cambridge, US) [dblp]
- Albrecht Schmidt (LMU München, DE) [dblp]
- Jürgen Steimle (Universität des Saarlandes, DE) [dblp]
- Paul Strohmeier (University of Copenhagen, DK) [dblp]
- Katia Vega (University of California - Davis, US) [dblp]
- society / human-computer interaction
- human-computer interaction
- embodied cognition
- user interface software and technology