Otherware needs Otherness: Understanding and Designing Artificial Counterparts
Workshop at NordiCHI 2020 | 26.10.2020 from 10:00 to 15:00 (CEST)
Most approaches in Human-Computer Interaction follow the ideal of embodied interaction. However, more and more technologies are emerging, such as chatbots, smart voice interfaces, and domestic or social robots, that imply a fundamentally different relationship between human and technology. This “otherware” presents itself, either incidentally or by design, as a computational counterpart rather than as an embodied extension of the Self. The predominant strategy for designing the form of and interaction with otherware is to mimic humans or animals (i.e., naïve anthropomorphism or zoomorphism). While this strategy has some advantages, we call for exploring an alternative, namely to cultivate the otherness of computational counterparts rather than to mimic existing lifeforms.
The workshop will bring together computer scientists, psychologists, designers and artists to speculate on alternative models of interacting with otherware and appropriate forms of otherness. It lays the foundation for a more nuanced perspective on how to design the interaction with computational counterparts besides embodied interaction.
The aim of the present workshop is to initiate a research and design network in which the HCI community can actively participate, contribute to, and deepen research on otherware, especially from an interaction perspective.
What are alternatives to naïve anthropomorphism and zoomorphism? How should computational counterparts look, behave, and communicate? What are beneficial application areas of otherware?
As an interdisciplinary research community, HCI provides several perspectives on such a topic. Together, we want to speculate on appropriate forms of otherness for otherware. We hope to lay the foundation for a more nuanced perspective, and to debate on how to design interactions with computational counterparts besides the ideal of embodied interaction.
Important dates
Contribution deadline: 01.09.2020, extended to 10.09.2020 (AoE)
Acceptance notification: 15.09.2020
Workshop date: 26.10.2020 from 10:00 to 15:00 (CET)
Background
Most approaches to designing interaction in Human-Computer Interaction (HCI), such as “Direct Manipulation” [14], “Embodied Interaction” [3], “Tangible Computing” [7], “Soma-Based Design” [5] or “Human-Computer Integration” [11] follow similar ideals. They focus on people, understand technology as a form of extension of minds and bodies, and tend to design technology to literally “disappear” in use. In Don Ihde’s [6] terms: HCI aims for people to have an “embodiment relationship” with technology.
At the same time, self-learning, self-reliant and proactive computational artifacts are on the rise. Technologies such as AI-powered conversational interfaces, smart voice interfaces (e.g., Alexa, Siri), robotic vacuum cleaners or even social robots will continue to evolve and will inevitably shape individual experiences and society. In contrast to the ideal of embodiment relations in HCI, these artifacts are in a dialog with their users and do not necessarily extend them. They are, either incidentally or by design, perceived as counterparts and imply an “alterity relation” [6]. It seems simply impossible to experience an anthropomorphic robot, such as SoftBank’s Pepper, as an extension of one’s Self rather than as a self-reliant counterpart.
Obviously, counterpart technologies, or what we call otherware, require a different approach to interaction design than embodied technologies. So far, the prevailing approach is to mimic humans or animals, both in form and in interaction (i.e., anthropomorphism, zoomorphism). This fixation on a rather naïve anthropomorphism comes with advantages (i.e., an intuitive interaction borrowed from human-human or human-animal interaction) but also many disadvantages, such as reinforcing inappropriate gender stereotypes [1], or influencing the manners of children in yet unknown ways [13].
Technology in the form of quasi-lifeforms that pretend to have motives and emotions can be deeply disturbing and “uncanny”. In addition, naïve anthropomorphism might even be a barrier to unlocking exciting potentials of otherware [2,15]. Welge and colleagues [15], for example, argued that robots have social superpowers, such as endless patience, precisely because of their mechanistic nature. In this case, a quality most people find important in social interaction is hard for humans to attain but easy for computational counterparts.
One might argue that otherware in general is a bad idea and should simply be replaced by embodied technologies. This is certainly true for voice interfaces, such as Alexa, where many interactions (e.g., switching on the light) could just be replaced by more traditional interactions. However, there are potential uses for technology that appears as a counterpart, yet shows different qualities compared to a human. Examples are motivational and persuasive application areas, such as virtual coaches [12], social robots to ease loneliness [4], music machines to stimulate creativity [8], therapeutic settings involving self-disclosure, or even the spiritual [10].
We argue that unlocking the powers of otherware requires a design approach different from naïve anthropomorphism or zoomorphism—an approach that keeps the alterity relation intact, yet clearly communicates the counterpart as different from humans or animals [9]. In other words, otherware needs to cultivate the otherness of machines in its design and interaction. Unique capabilities, such as endless willpower when pursuing a specific goal, or endless patience and interest when leading otherwise boring conversations, should be at the heart of future otherware designs.
Contact
If you have questions or other enquiries, please get in touch!