A loving hand on someone’s shoulder, a little frown, that one phrase: “I am so sorry for you!” We all long for a little empathy. When a little creature falls over, we can feel sad for it, but can it give that same compassion back? If people who lack empathy are called robotic, is that an outdated metaphor, or will bots never be able to display empathy?
When something is the matter, we want to be understood and to share our emotions. A good general practitioner or doctor shows this by being empathetic. Unfortunately, healthcare is under enormous strain, leaving little room for emotional support. It has been argued that AI could help the healthcare system by making it more efficient and effective, giving practitioners more time to be empathetic, compassionate and trustworthy (Kerasidou, 2020). In other words: by reducing the workload with AI and robots, humans can focus on the emotional work. We see empathy as a deeply human trait, even though we find it in many other animals, even mice (Miller, 2006). So why would an artificial creature not be able to display empathy?
There are two layers to empathy: being aware of and understanding the feelings and thoughts of another, and being sensitive to and experiencing the same feelings and thoughts as another (Empathy, 2025). This can be summarised as cognitive empathy – understanding rationally what someone is feeling – and emotional empathy – actually feeling something in response to their emotion. Without a doubt, we show our cognitive and emotional empathy by saying so, but we can also give non-verbal cues that we are feeling with someone. Such indications include mirroring, nodding, eye contact, facial expressions and an open posture. Can an artificial creature do the same?
Artificial intelligence can certainly be perceived as empathetic. Quite frankly, according to some healthcare professionals, ChatGPT is even more empathic than doctors when it comes to ‘bedside manners’ (Ayers et al., 2023). It uses language that suggests understanding and validation of the user’s feelings without ever running out of emotional energy – which cannot be said of doctors or any other human being – showing a great deal of cognitive empathy and sometimes even suggesting emotional empathy. However, it is not yet a creature.
Humanoid robot Pepper is. Created in 2014 to serve as a companion in the home, Pepper can read human emotions and respond accordingly. It does so by analysing facial expressions and voice tone through advanced cameras and sensors. Pepper has expressive eyes and can reassure a person with arm and hand gestures, speaking in multiple languages through its voice and tablet screen.
Yet, something curious happens with Pepper. Despite all its sophistication, people talk to it like a dog or a small child. Its very complexity seems to work against it – the more human-like it tries to be, the more we notice when it falls short. When Pepper makes mistakes or acts clumsy, elderly users feel protective of it, experiencing empathy for the robot rather than from it. The technology is impressive, but does more capability equal more empathy?
A creature like Keepon – a small yellow robot with no face, no language, just four movements: turning, nodding, rocking, and bouncing – suggests otherwise. Its minimal gestures intuitively convey attention and emotion. When Keepon turns toward you, you feel seen. When it nods, you feel heard. When it bounces, you feel its joy. What makes Keepon remarkable is its work with children who have autism. These children, who often struggle with the complexity of human facial expressions and social cues, connect with Keepon in ways they don't with people (Kozima et al., 2008). Some showed facial expressions their own parents had never seen before.
I’m blue
Interestingly though, none of these creatures actually feels anything. So, is experiencing real emotions a requirement for empathy? Or is showing it – comforting, being there for someone – enough? Maybe, in the end, the only thing that matters is being seen and understood. And thus a creature that shows precisely that – perhaps by nodding, mimicking your movements or simply being there – is sufficient. Therefore, I imagine my empathetic creature as a tissue box that softly glows blue when you pull out a tissue. The glow starts and gradually fades, as if the box itself is ‘feeling blue’ with you for a moment. A subtle signal that you are not alone in your feeling. It might not be able to replace a doctor or a friend, but it may be able to create a connection through its presence and synchrony.
When we call someone "robotic" for lacking empathy, maybe we are using the wrong metaphor. Because these robots – these simple, responsive, present creatures – are teaching us what empathy really requires. Not complexity, not consciousness. Just the willingness to turn toward someone, to acknowledge their presence, to say through gesture or glow or gentle movement: "I'm here. I see you. You're not alone."
Ayers, J. W., Poliak, A., Dredze, M., Leas, E. C., Zhu, Z., Kelley, J. B., Faix, D. J., Goodman, A. M., Longhurst, C. A., Hogarth, M., & Smith, D. M. (2023). Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum. JAMA Internal Medicine, 183(6), 589. https://doi.org/10.1001/jamainternmed.2023.1838
Empathy. (2025). In Merriam-Webster Dictionary. https://www.merriam-webster.com/dictionary/empathy
Kerasidou, A. (2020). Artificial intelligence and the ongoing need for empathy, compassion and trust in healthcare. Bulletin of the World Health Organization, 98(4), 245–250. https://doi.org/10.2471/blt.19.237198
Kozima, H., Michalowski, M. P., & Nakagawa, C. (2008). Keepon: A playful robot for research, therapy, and entertainment. International Journal of Social Robotics, 1(1), 3–18. https://doi.org/10.1007/s12369-008-0009-8
Miller, G. (2006). Signs of Empathy Seen in Mice. Science, 312(5782), 1860–1861. https://doi.org/10.1126/science.312.5782.1860b