Most approaches in Human-Computer Interaction follow the ideal of embodied interaction. However, a growing number of technologies, such as chatbots, smart voice interfaces, and domestic or social robots, imply a fundamentally different relationship between human and technology. This “otherware” presents itself, either incidentally or by design, as a computational counterpart rather than as an embodied extension of the Self. The predominant strategy for designing the form of and interaction with otherware is to mimic humans or animals (i.e., naïve anthropomorphism or zoomorphism). While this strategy has some advantages, we call for exploring an alternative: cultivating the otherness of computational counterparts rather than mimicking existing lifeforms.
The workshop will bring together computer scientists, psychologists, designers, and artists to speculate on alternative models of interacting with otherware and on appropriate forms of otherness. It lays the foundation for a more nuanced perspective on how to design interaction with computational counterparts beyond embodied interaction.
The aim of the present workshop is to initiate a research and design network in which the HCI community can actively participate in order to contribute to and deepen research on otherware, especially from an interaction perspective.
What are alternatives to naïve anthropomorphism and zoomorphism? How should computational counterparts look, behave, and communicate? What are beneficial application areas of otherware?
As an interdisciplinary research community, HCI offers several perspectives on this topic. Together, we want to speculate on appropriate forms of otherness for otherware. We hope to lay the foundation for a more nuanced perspective and to debate how to design interactions with computational counterparts beyond the ideal of embodied interaction.
Contribution deadline: 01.09.2020, extended to 10.09.2020 (AoE)
Acceptance notification: 15.09.2020
Workshop date: 26.10.2020 from 10:00 to 15:00 (CET)
Most approaches to designing interaction in Human-Computer Interaction (HCI), such as “Direct Manipulation”, “Embodied Interaction”, “Tangible Computing”, “Soma-Based Design”, or “Human-Computer Integration”, follow similar ideals. They focus on people, understand technology as a form of extension of minds and bodies, and tend to design technology to literally “disappear” in use. In Don Ihde’s terms: HCI aims for people to have an “embodiment relation” with technology.
At the same time, self-learning, self-reliant, and proactive computational artifacts are on the rise. Technology such as AI-powered conversational interfaces, smart voice interfaces (e.g., Alexa, Siri), robotic vacuum cleaners, or even social robots will continue to evolve and will inevitably shape individual experiences and society. In contrast to the ideal of embodiment relations in HCI, these artifacts are in a dialog with their users and do not necessarily extend them. They are, either incidentally or by design, perceived as counterparts and imply an “alterity relation”. It seems simply impossible to experience an anthropomorphic robot, such as Softbank’s Pepper, as an extension of one’s Self rather than as a self-reliant counterpart.
Obviously, counterpart technologies, or what we call otherware, require a different approach to interaction design than embodied technologies. So far, the prevailing approach is to mimic humans or animals, both in form and in interaction (i.e., anthropomorphism, zoomorphism). This fixation on a rather naïve anthropomorphism comes with advantages (e.g., an intuitive interaction borrowed from human-human or human-animal interaction) but also many disadvantages, such as reinforcing inappropriate gender stereotypes or influencing the manners of children in yet unknown ways.
Technology in the form of quasi-lifeforms that pretend to have motives and emotions can be deeply disturbing and “uncanny”. In addition, naïve anthropomorphism might even be a barrier to unlocking the exciting potentials of otherware [2,15]. Welge and colleagues, for example, argued that robots possess social superpowers, such as endless patience, precisely because of their mechanistic nature. In this case, a quality most people find important in social interaction is hard to attain for humans but easy for computational counterparts.
One might argue that otherware in general is a bad idea and should simply be replaced by embodied technologies. This is certainly true for voice interfaces, such as Alexa, where many interactions (e.g., switching on the light) could just be handled by more traditional interactions. However, there are potential uses for technology that appears as a counterpart yet shows qualities different from a human’s. Examples are motivational and persuasive application areas, such as virtual coaches, social robots to ease loneliness, music machines to stimulate creativity, therapeutic settings involving self-disclosure, or even the spiritual.
We argue that unlocking the powers of otherware requires a design approach different from naïve anthropomorphism or zoomorphism: an approach that keeps the alterity relation intact yet clearly communicates that the counterpart is different from humans or animals. In other words, otherware needs to cultivate the otherness of machines in its design and interaction. Unique capabilities, such as endless willpower when pursuing a specific goal, or endless patience and interest when leading otherwise tedious conversations, should be at the heart of future otherware designs.
References
[1] Sheryl Brahnam and Antonella De Angeli. 2012. Gender affordances of conversational agents. Interacting with Computers 24, 3: 139–153.
[2] Judith Dörrenbächer, Diana Löffler, and Marc Hassenzahl. 2020. Becoming a Robot – Overcoming Anthropomorphism with Techno-Mimesis. In CHI 2020, April 25–30, 2020, Honolulu, HI, USA, 1–12.
[3] Paul Dourish. 2001. Where the Action Is: The Foundations of Embodied Interaction. The MIT Press, Cambridge, MA, USA.
[4] Horst-Michael Gross et al. 2019. Living with a Mobile Companion Robot in Your Own Apartment – Final Implementation and Results of a 20-Weeks Field Study with 20 Seniors. In IEEE International Conference on Robotics and Automation (ICRA), 2253–2259.
[5] Kristina Höök, Martin P. Jonsson, Anna Ståhl, and Johanna Mercurio. 2016. Somaesthetic Appreciation Design. In SIGCHI Conference on Human Factors in Computing Systems (CHI ’16), 3131–3142.
[6] Don Ihde. 1990. Technology and the Lifeworld: From Garden to Earth. Indiana University Press, Indianapolis, IN, USA.
[7] Hiroshi Ishii. 2008. The tangible user interface and its evolution. Communications of the ACM.
[8] Matthias Laschke, Robin Neuhaus, Marc Hassenzahl, and Claudius Lazzeroni. 2020. Improvising with Machines – Designing Artistic Non-Human Actors. In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (CHI EA ’20), 1–7.
[9] Diana Löffler et al. 2020. Hybridity as Design Strategy for Service Robots to Become Domestic Products. In CHI 2020, April 25–30, 2020, Honolulu, HI, USA, 1–8.
[10] Diana Löffler et al. 2019. Blessing Robot BlessU2: A Discursive Design Study to Understand the Implications of Social Robots in Religious Contexts. International Journal of Social Robotics.
[11] Florian Floyd Mueller et al. 2020. Next Steps in Human-Computer Integration. In CHI 2020, April 25–30, 2020, Honolulu, HI, USA, 1–15.
[12] Benedikt Schmidt, Rüdiger Eichin, Sebastian Benchea, and Christian Meurisch. 2015. Fitness tracker or digital personal coach: How to personalize training. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and the 2015 ACM International Symposium on Wearable Computers (UbiComp/ISWC 2015), 1063–1068.
[13] Alex Sciuto, Arnita Saini, Jodi Forlizzi, and Jason I. Hong. 2018. “Hey Alexa, What’s Up?” 857–868.
[14] Ben Shneiderman. 1982. The future of interactive systems and the emergence of direct manipulation. Behaviour and Information Technology 1, 3: 237–256.
[15] Julika Welge and Marc Hassenzahl. 2016. Better than human: About the psychological superpowers of robots. In Lecture Notes in Computer Science, 993–1002.
If you have questions or other enquiries, just get in touch!