Remote Collaboration (Sharing Experience and Information between Users)
3D Scene/Avatar Reconstruction
Interaction with virtual objects
Enhancing user experience (with virtual objects)
Contactless virtual system
Capturing and sharing human emotion and physiological status
Education in AR/VR
Participation in the National Research Foundation of Korea (NRF) Group Research Program (Basic Research Laboratory Program)
2023 ~ 2025
Participation in the Artificial Intelligence Convergence Innovation Human Resources Development Program
2023 ~ 2026
Mixed Reality (MR) Remote Collaboration System
Many researchers have studied three common visual communication cues (pointer, sketching, and hand gesture) in remote collaboration systems for real-world tasks, but the effect of combining them has not been well explored. We studied the effect of four cue combinations: hand only, hand + pointer, hand + sketch, and hand + pointer + sketch, across three problem-solving tasks: Lego, Tangram, and Origami. The results showed that participants completed the tasks significantly faster and reported a significantly higher level of usability when the sketch cue was added to the hand gesture cue, but not when the pointer cue was added. Participants also preferred the combinations that included the sketch cue over the other combinations. However, the additional pointer and sketch cues increased perceived mental effort and did not improve the feeling of co-presence. We discuss the implications of these results and future research directions.
SharedSphere is a Mixed Reality-based remote collaboration system that not only shares a live-captured immersive 360 panorama, but also supports enriched two-way communication and collaboration by sharing non-verbal communication cues, such as view awareness cues, drawn annotations, and hand gestures.
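The description above is high level, so the following is only an illustrative sketch in Python rather than SharedSphere's actual code. Assuming each user's viewing direction in the shared 360 panorama is available as a yaw angle (an assumption, as are the function names and the yaw-only heuristic), it shows one way a simple view awareness cue could be derived.

def yaw_offset(local_yaw: float, remote_yaw: float) -> float:
    """Signed shortest angular difference in degrees, wrapped to [-180, 180)."""
    return (remote_yaw - local_yaw + 180.0) % 360.0 - 180.0

def awareness_hint(local_yaw: float, remote_yaw: float, fov: float = 90.0) -> str:
    """Tell the local user where their partner is looking in the shared 360 panorama."""
    offset = yaw_offset(local_yaw, remote_yaw)
    if abs(offset) <= fov / 2.0:
        return "partner's view is inside your field of view"
    return "partner is looking to your right" if offset > 0 else "partner is looking to your left"

if __name__ == "__main__":
    # Local user faces 10 degrees; remote collaborator faces 150 degrees.
    print(awareness_hint(local_yaw=10.0, remote_yaw=150.0))  # -> partner is looking to your right

In a full system, such a hint would typically be rendered as an on-screen arrow or frustum overlay rather than text; the text output here simply keeps the sketch self-contained.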
Mirrors are physical displays that show our real world in reflection. While physical mirrors simply show what is in the real-world scene, with the help of digital technology we can also alter the reality reflected in the mirror. The Augmented Mirrors project explores visualisation and interaction techniques for exploiting mirrors as Augmented Reality (AR) displays, with a particular focus on using user interface agents to guide user interaction with Augmented Mirrors.
We have been developing a remote collaboration system with Empathy Glasses, a head-worn display designed to create a stronger feeling of empathy between remote collaborators. To do this, we combined a head-mounted see-through display with a facial expression recognition system, a heart rate sensor, and an eye tracker. The goal is to enable a remote person to see and hear from another person's perspective and to understand how they are feeling. In this way, the system shares non-verbal cues that could help increase empathy between remote collaborators.
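As with the sketch above, the following is illustrative only: the class name, its fields, and the JSON encoding are assumptions, not the Empathy Glasses implementation. It shows one way the sensed cues (facial expression label, heart rate, and gaze point) could be bundled into a single state update for streaming to the remote collaborator.

from dataclasses import dataclass, asdict
import json
import time

@dataclass
class EmpathyState:
    """One snapshot of the wearer's sensed cues, streamed to the remote collaborator."""
    timestamp: float      # seconds since epoch
    expression: str       # label from the facial expression recognizer, e.g. "neutral"
    heart_rate_bpm: int   # latest reading from the heart rate sensor
    gaze_xy: tuple        # normalised gaze point (0..1, 0..1) on the wearer's camera view

def encode_state(state: EmpathyState) -> bytes:
    # Pack one snapshot as newline-delimited UTF-8 JSON, ready to send over any byte stream.
    return (json.dumps(asdict(state)) + "\n").encode("utf-8")

if __name__ == "__main__":
    sample = EmpathyState(time.time(), "surprised", 92, (0.62, 0.41))
    print(encode_state(sample).decode("utf-8"))

On the receiving side, the remote collaborator's interface could decode each record and overlay the expression label, heart rate, and gaze point on the live first-person video to convey how the wearer is feeling.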