Mats Doornbos, Timothy Schuur, Payman Sipass
In some animals, self-harm occurs under prolonged stress or deprivation, but full-blown self-destructive acts are rare. Suicide in humans often stems from overwhelming hopelessness or the belief that there is no way out. It is a final, self-inflicted act that signals deep emotional pain. Unlike humans, our robotic creature has no inner life or feelings. Yet by scripting a “suicidal” gesture, we can explore how we project meaning onto machines.
Our installation begins with a small, constantly driving, sad-looking robot held back by a thin rope. Visitors are invited to sever its restraint, “freeing” the creature. Immediately, the robot raises its arms toward a power supply, then bolts forward. Upon contact, a short circuit causes a small capacitor inside the robot to explode. This is our robotic equivalent of suicide. The sudden bang and the silence that follows leave spectators unsettled and wondering: why do we feel guilt or sorrow for what is plainly a programmed piece of technology?
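The gesture itself is entirely scripted. The sketch below is a minimal, self-contained Python simulation of that script, not the installation's actual firmware: the state names, timings, and distances are illustrative assumptions, and the rope-cut and contact events are faked with timers.

```python
"""Illustrative simulation of the robot's scripted "suicidal" gesture.

Assumption: in the real installation, the rope-cut and contact events
would come from simple sensors (e.g. a tension switch and a bump
contact); here both are simulated so the sketch runs on its own.
"""
import time
from enum import Enum, auto


class State(Enum):
    TETHERED = auto()  # straining against the rope, driving in place
    FREED = auto()     # rope cut: raise arms toward the power supply
    RUSHING = auto()   # drive straight at the power supply
    SHORTED = auto()   # contact: short circuit pops the capacitor, robot falls silent


def run(rope_cut_after: float = 3.0, distance_to_supply: float = 1.0,
        rush_speed: float = 0.5) -> None:
    state = State.TETHERED
    start = time.monotonic()
    last = start
    travelled = 0.0  # metres, simulated

    while state is not State.SHORTED:
        now = time.monotonic()
        dt, last = now - last, now

        if state is State.TETHERED:
            # Tug against the rope until a visitor "cuts" it (simulated timer).
            if now - start >= rope_cut_after:
                print("rope cut -> raising arms toward the power supply")
                state = State.FREED
        elif state is State.FREED:
            # The arm gesture is a fixed animation; then bolt forward.
            state = State.RUSHING
        elif state is State.RUSHING:
            travelled += rush_speed * dt
            if travelled >= distance_to_supply:
                print("contact: short circuit, capacitor pops, silence")
                state = State.SHORTED

        time.sleep(0.05)


if __name__ == "__main__":
    run()
```

The point of the state machine is that nothing adaptive or "felt" is happening: every phase, from the sad tugging to the final short circuit, is a fixed sequence triggered by simple events.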
The moment the bang faded and the robot was left motionless, many people fell quiet; some even felt sorry for it. Cutting the rope can feel like a kind act, even though it leads to the robot’s "death." At the same time, watching the small explosion can make people feel sad or guilty, even though they know the robot cannot feel pain. This paradox shows how easily we imagine that machines have thoughts and feelings. In contrast, many people applauded the robot’s death, which was an unforeseen and intriguing extra aspect of the project.
This installation explores how humans project emotions onto machines. By creating a scenario where a robot "takes its own life," it plays on our tendency to attribute human-like qualities to technology.
This raises ethical questions about AI and robotics: if machines start to mimic emotions convincingly, how will we treat them? It also touches on the emotional impact of technology in our lives and on concerns about the future of AI. With this project we highlighted how we project care and moral responsibility onto artificial creatures, even when they cannot feel.