Anything related to Empathic Computing with AR/VR
Remote Collaboration (Sharing Experience and Information between Users)
3D Scene/Avatar Reconstruction
Interaction with virtual objects
Enhancing user experience (with virtual objects)
Contactless ("untact") virtual systems
Capturing and sharing human emotions and physiological status
Education in AR/VR
Gesture Generation for Virtual Agents in Persuasive Communication
Publication: Gayun Suh, Gun A. Lee, Hyung-Jeong Yang, Soo-Hyung Kim, Ji-eun Shin, Jaejoon Jeong, Sei Kang, and Seungwon Kim, "Effects of Co-speech Gesture Size of Virtual Agents on Persuasive Communication", In Proceedings of the 31st ACM Symposium on Virtual Reality Software and Technology (VRST), Montreal, Canada, 2025. DOI: 10.1145/3756884.3766037
Co-speech gestures are crucial for enriching both human-human and human-agent communication. Yet the specific impact of gesture size, especially when gestures are generated by advanced data-driven techniques, remains underexplored. This study investigates how varying gesture sizes affect human-agent interactions across two distinct persuasive contexts (informational and emotional), with a focus on social outcomes such as persuasion and empathy. We conducted two controlled experiments, each involving 36 participants, comparing three gesture-size conditions: Minimal, Small, and Large.
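To make the size manipulation concrete, here is a minimal sketch (our illustration, not the study's actual animation pipeline) of how a generated co-speech gesture could be scaled by interpolating each joint between a neutral rest pose and the full generated motion; the scale values for the three conditions are hypothetical:

    import numpy as np

    def scale_gesture(rest_pose, gesture_frames, scale):
        """Scale gesture amplitude by linearly interpolating joint rotations
        between a rest pose and the generated gesture animation.
        rest_pose:      (J, 3) joint rotations as Euler angles in radians
        gesture_frames: (T, J, 3) generated co-speech gesture animation
        scale:          amplitude factor, e.g. 0.1 (Minimal) to 1.0 (Large)
        """
        return rest_pose + scale * (gesture_frames - rest_pose)

    # Render the same generated gesture at three hypothetical sizes.
    rest = np.zeros((15, 3))                              # 15-joint upper body
    gesture = np.random.uniform(-0.5, 0.5, (120, 15, 3))  # 120 animation frames
    minimal, small, large = (scale_gesture(rest, gesture, s) for s in (0.1, 0.5, 1.0))

In a production pipeline, quaternion slerp would be preferable to Euler-angle interpolation to avoid rotation artifacts.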
Virtual Agent for Interpersonal Emotion Regulation in VR
Publication: Sei Kang, Gun A. Lee, Hyung-Jeong Yang, Soo-Hyung Kim, Ji-eun Shin, Jaejoon Jeong, Myungho Lee, and Seungwon Kim, "Design and Evaluation of a Virtual Agent for Interpersonal Emotion Regulation in VR", In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Daejeon, South Korea, 2025, pp. 1554-1564. DOI: 10.1109/ISMAR67309.2025.00159
Managing negative emotions through emotion regulation (ER) is key to mental well-being. While virtual reality (VR) shows promise for supporting ER, prior work has primarily focused on self-regulation. This paper introduces a virtual agent that helps users manage emotions through conversation-based ER strategies. We compared three conditions: no agent, an agent with non-supportive responses, and an agent with ER-supportive responses. Results showed that the ER-supportive agent significantly improved users' emotional states and overall experience. Building on this, we conducted a second experiment to examine how the agent's appearance (realistic vs. cartoon) and voice tone (emotional vs. neutral) affect ER.
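As a toy sketch of how the two agent conditions could differ (the templates below are hypothetical illustrations, not the study's actual dialogue design), the agent can branch on its assigned condition when responding to a detected negative emotion:

    def agent_response(detected_emotion, condition):
        """Select a reply for a detected negative emotion.
        Templates are illustrative only; "ER-supportive" loosely follows
        interpersonal emotion regulation strategies such as reappraisal.
        """
        supportive = {
            "anger":   "That sounds really frustrating. What part of it upset you most?",
            "sadness": "I'm sorry you're going through that. Would talking it through help?",
        }
        if condition == "ER-supportive":
            return supportive.get(detected_emotion, "Tell me more about how you are feeling.")
        return "I see. Let's move on to the next topic."  # non-supportive condition

    print(agent_response("sadness", "ER-supportive"))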
Enhancing Emotional Expression on Embodied Avatar Face in VR
Publication: Jaejoon Jeong, Gun A. Lee, Hyung-Jeong Yang, Soo-Hyung Kim, Ji-eun Shin, Gayun Suh, Sei Kang, and Seungwon Kim, "Three Techniques for Enhancing Emotional Expression on Embodied Avatar Face in VR", In Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Daejeon, South Korea, 2025, pp. 23-33. DOI: 10.1109/ISMAR67309.2025.00016
This study introduces three techniques in VR that enable users to manually adjust emotional facial expressions while still reflecting real-time facial tracking results. The Ekman (Ek) technique allows users to select one of six discrete emotions via button interaction, while the Scrollable-Ekman (SEk) technique extends this by allowing users to scale the intensity of the selected emotion. The Arousal-Valence (AV) technique offers nuanced control within a two-dimensional arousal-valence space. We evaluated these techniques against a baseline condition that simply synchronizes the avatar's face with users' natural facial expressions, focusing on the expression of happiness, sadness, and anger.
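One plausible realization of the AV technique (a sketch under assumed anchor coordinates; the paper's exact mapping may differ) places the six basic emotions at fixed points in the valence-arousal plane, weights them by inverse distance to the user-selected point, and blends the result with live facial-tracking weights:

    import numpy as np

    # Hypothetical anchor positions (valence, arousal) for the six Ekman emotions.
    ANCHORS = {
        "happiness": ( 0.8,  0.5), "surprise": ( 0.2,  0.9), "anger":   (-0.7,  0.7),
        "fear":      (-0.6,  0.8), "disgust":  (-0.7,  0.2), "sadness": (-0.7, -0.5),
    }

    def av_to_emotion_weights(valence, arousal, eps=1e-6):
        """Convert a point in arousal-valence space to normalized emotion weights."""
        p = np.array([valence, arousal])
        raw = {e: 1.0 / (np.linalg.norm(p - np.array(a)) + eps) for e, a in ANCHORS.items()}
        total = sum(raw.values())
        return {e: w / total for e, w in raw.items()}

    def blend_with_tracking(tracked, manual, alpha=0.5):
        """Mix live-tracked expression weights with the manually selected ones."""
        keys = set(tracked) | set(manual)
        return {e: (1 - alpha) * tracked.get(e, 0.0) + alpha * manual.get(e, 0.0) for e in keys}

    manual = av_to_emotion_weights(valence=0.7, arousal=0.4)  # near the "happiness" anchor
    face = blend_with_tracking({"happiness": 0.2}, manual)

Under the same scheme, the Ek technique corresponds to placing a weight of 1.0 on a single anchor, and SEk scales that single weight by the chosen intensity.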
Mixed Reality (MR) Remote Collaboration System
Publication: Kim, S., Lee, G., Huang, W., Kim, H., Woo, W., & Billinghurst, M. (2019). Evaluating the combination of visual communication cues for HMD-based mixed reality remote collaboration. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI '19), pp. 1-13. ACM, New York, NY, USA. DOI: 10.1145/3290605.3300403
Many researchers have studied three common visual communication cues (pointer, sketching, and hand gesture) in remote collaboration systems for real-world tasks. However, the effect of combining them has not been well explored. We studied the effect of four cue combinations: hand only, hand + pointer, hand + sketch, and hand + pointer + sketch, with three problem-solving tasks: Lego, Tangram, and Origami. The results showed that participants completed the tasks significantly faster and reported significantly higher usability with the additional sketch cue alongside the hand gesture cue, but not with the additional pointer cue. Participants also preferred the combinations with the sketch cue over the others. However, the additional pointer and sketch cues increased perceived mental effort and did not improve the feeling of co-presence. We discuss the implications of these results and future research directions.
Publication: Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). SharedSphere: MR collaboration through shared live panorama. In SIGGRAPH Asia 2017 Emerging Technologies, pp. 1-2. DOI: 10.1145/3132818.3132827;
Lee, G. A., Teo, T., Kim, S., & Billinghurst, M. (2017). Mixed reality collaboration through sharing a live panorama. In SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications, pp. 1-4. DOI: 10.1145/3132787.3139203
SharedSphere is a Mixed Reality based remote collaboration system that not only shares a live-captured immersive 360 panorama, but also supports enriched two-way communication and collaboration by sharing non-verbal communication cues, such as view awareness cues, drawn annotations, and hand gestures.
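A core piece of such view awareness cues is mapping a collaborator's view direction onto the shared panorama. Below is a minimal sketch, assuming an equirectangular panorama and a +y-up, +z-forward convention (the abstract does not specify SharedSphere's internals):

    import math

    def view_direction_to_pixel(direction, width, height):
        """Project a unit view-direction vector (x, y, z) to pixel coordinates
        on an equirectangular 360 panorama, e.g. to draw a view frustum cue."""
        x, y, z = direction
        lon = math.atan2(x, z)                   # yaw in [-pi, pi]
        lat = math.asin(max(-1.0, min(1.0, y)))  # pitch in [-pi/2, pi/2]
        u = (lon / (2.0 * math.pi) + 0.5) * width
        v = (0.5 - lat / math.pi) * height
        return u, v

    # Remote collaborator looking slightly up and to the right on a 4096x2048 panorama.
    print(view_direction_to_pixel((0.30, 0.20, 0.93), 4096, 2048))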
Publication: Lee, G. A., Kim, S., Lee, Y., Dey, A., Piumsomboon, T., Norman, M., & Billinghurst, M. (2017). Improving Collaboration in Augmented Video Conference using Mutually Shared Gaze. In Proceedings of ICAT-EGVE 2017, pp. 197-204.
Mirrors are physical displays that show our real world in reflection. While physical mirrors simply show the real-world scene, with the help of digital technology we can also alter the reality reflected in the mirror. The Augmented Mirrors project explores visualisation and interaction techniques for exploiting mirrors as Augmented Reality (AR) displays, focusing in particular on user interface agents that guide user interaction with Augmented Mirrors.
We have been developing a remote collaboration system with Empathy Glasses, a head-worn display designed to create a stronger feeling of empathy between remote collaborators. To do this, we combined a head-mounted see-through display with a facial expression recognition system, a heart rate sensor, and an eye tracker. The goal is to enable a remote person to see and hear from another person's perspective and to understand how they are feeling. In this way, the system shares non-verbal cues that could help increase empathy between remote collaborators.
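As an illustration of how such non-verbal cues might be shared (a hypothetical message schema and transport, not the Empathy Glasses implementation), each sample of sensor readings can be packaged and streamed to the remote collaborator:

    import json
    import socket
    import time

    def send_empathy_cues(sock, remote_addr, expression, heart_rate_bpm, gaze_xy):
        """Package one sample of non-verbal cues as JSON and send it over UDP.
        Field names and transport are illustrative assumptions."""
        packet = {
            "timestamp": time.time(),
            "expression": expression,          # label from the expression recognizer
            "heart_rate_bpm": heart_rate_bpm,  # from the heart rate sensor
            "gaze": gaze_xy,                   # normalized (x, y) from the eye tracker
        }
        sock.sendto(json.dumps(packet).encode("utf-8"), remote_addr)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_empathy_cues(sock, ("127.0.0.1", 9000), "smile", 72, (0.43, 0.55))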