Research Fields

R&D PROJECTS

Giant Finger: Visuo-proprioceptive congruent virtual legs for flying actions in virtual reality


Seamless walking in VR – Redirected Walking (RDW)

Electrical, Vibrational, and Cooling Stimuli-Based Redirected Walking: Comparison of Various Vestibular Stimulation-Based Redirected Walking Systems 


NUI modalities 

Survey of users' perception of foot gestures and exploration of their applicability


VR Education

The Thousand Character Classic VR: Chinese Characters-adaptive VR Game with Treadmill for an Embodied Learning of Their Korean Rendering


Interacting with Autonomous Vehicles

Exploring the effectiveness of public displays on interactive transparent displays inside autonomous public transportation (2022).


VR Haptic Device

Virtual Reality Interface Design to Improve Grasping Experience in Block Stacking Activities in Virtual Environments (2022)


Engagement Assessment for Children with Developmental Disabilities

Development of content creation and entertainment technologies based on intelligent authoring tool to enhance accessibility of social communication disabilities (2019-2021, KOCCA)



Multisensory VR Forest

VR Forest-Walking Simulator to Support Forest Bathing in Urban Areas and its Physiological Effects (2021, KOFPI)



Natural User Interfaces in Automated Vehicles

Multi-modal & Multi-user NUI Platforms through Transparent Displays in Public Transportation (2020-2022, KOCCA/KETI)



Culture Technology

Explainable Assessment of Contextual Visibility of Public Signage (2019-2021, KOCCA)



Visual Analytic Tools

Making Machine Learning Applications for Time-Series Sensor Data Graphical and Interactive (2017, Expedition in Computing)

The goal of this project is to create a simple, usable tool for handling time-series sensor data streams in ML-incorporated applications.



Game A.I.

Human Players vs. A.I. Players in a Real-Time Strategy Game (2018)

Annual StarCraft artificial intelligence (AI) competitions have promoted the development of successful AI players for complex real-time strategy games. 

In these competitions, AI players are ranked based on their win ratio over thousands of head-to-head matches.



Smart Workplace

Smart Workplace with a Hands-Free VR/AR HMD support (IITP, UNIST/KAIST)

A factory whose facilities and machines are equipped with Internet of Things (IoT) sensors



Wearable UI/UX

Iteratively Assessing and Enhancing Wearable User Interface Prototypes (2017, KETI) 

Design and development of variants of advanced wearable user interface prototypes, including joystick-embedded, potentiometer-embedded, motion-gesture, and contactless infrared user interfaces, for rapidly assessing the hands-on user experience of potential futuristic user interfaces.


Forecasting Future Turn-Based Strokes in Badminton Rallies


Portal Display: screen-based 3D stereoscopic conferencing system for immersive social telepresence


Human (Low vision) - Mobile device Interaction 

A Study on User Experience of Mobile Wayfinding Application for Low Vision Pedestrians


Deep-Learning based Physiological Data Analysis

Deep Learning-Based Engagement Classification by Behavioral and Physiological Data of Children with Developmental Disability (2022)


VR Locomotion - Redirected Walking 

REVES: Redirection Enhancement Using Four-Pole Vestibular Electrode Stimulation. CHI 2022 LBW


Human-Vehicle Interaction

Take-Over Requests after Waking in Autonomous Vehicles (2022)


In-Car eXtended Reality 

GIST CarXR Lab: In-car XR platforms and contents to augment passenger UX in futuristic vehicles (2021, RAPA)


VR Locomotion - Redirected Walking 

Auditory and Olfactory Stimuli-Based Attractors to Induce Reorientation in Virtual Reality Forward Redirected Walking. CHI 2022 LBW


HCI+AI for Human-Centered Physical System Design

Topic 1.  Digital twin with mixed reality for AI-infused automotive systems

Topic 2.  Gamified crowdsourcing platforms for aging adults and disabled persons

Topic 3.  Visual analytic design tools for participatory and iterative urban planning processes



Autonomous Driving Simulators

High-Fidelity Simulation of Autonomous Driving with a VR/MR/Motion support (2019-2020, GIST GRI-GIAI / KOCCA) 

Building a first-person-view virtual driving system as a basis for further Human-Vehicle Interaction research



Human-Vehicle Interaction

Driver-Aware Disengagement in Semi-Autonomous Driving Situations (2018, A.I. Basic Research) 

Real-time discrimination of the driver's task load -> classification of the driver's attentiveness level and condition -> suggested design guidelines for situation-based hand-over notification interfaces for the control of highly reliable autonomous vehicles




Human-Vehicle Interaction

User-Centric Intelligibility of Autonomous Vehicles (2018, A.I. Basic Research)

Exploring which modalities positively influence the driver's attitude when a 'Why' message (e.g., construction site) is combined with a 'How' message (e.g., decrease driving speed) according to the vehicle's behavior in autonomous driving situations



Human-Vehicle Interaction 

Multi-modal NUI (Natural User Interfaces) with a Large-scale AR HUD in Vehicles (2018, KETI)

Development of a multi-modal NUI (Natural User Interface) based on voice commands and touch gestures to improve the user experience in vehicles





Gamified Annotation Tools

Expertise-Free Annotation System for Korean Cultural Heritage Big Data (2018, KOCCA) 

Proposal of a new annotation system for meaningful data extraction and accurate data convergence of cultural heritage formative elements -> accumulation of annotation data and big data analysis for diverse applications


Project Archive (2009 - 2016, CMU)

Project 1  

Sensor-based Assessment of In-Situ Driver Interruptibility (CHI 2015, T-SET UTC)

The goal of this project is to create an in-car information system that adapts the delivery timing of HCI demands to drivers based on in-situ driving and cognitive load models for safe navigation.



Project 2 - HVI for the Elderly

Cognitive Mapping Aid for Elderly Navigation (CHI 2009, MTAP 2016, QoLT Center)

In this project, we explored a novel navigation display system that uses augmented reality (AR) projection to minimize cognitive distance by overlaying driving directions on the windshield and road.



Project 3 - HVI for the Elderly

Multimodal Route Guidance and Its Reversal Effects, Elder vs. Younger (Pervasive 2012, MTAP 2016, QoLT Center)

While in-car navigation systems enhance situational awareness, they also increase drivers’ visual distraction and cognitive load. This project explores the efficacy of multi-modal route guidance cues for ‘safer’ driving. 



Project 4 - HVI for the Elderly

Aesthetics and Usability of Automotive User Interfaces for Elder Drivers (CHI 2010, QoLT Center)

The purpose of this project was to design features for car dashboard displays that are both functional and aesthetically pleasing.  


Project 7

Adaptive Cyber-learning with a Sensor Support (CHI WIP 2014, QoLT Center) 

This project aims to better support student learning by adapting computer-based tutoring to individual learning phases and real-time capabilities. In this manner, computer-based tutors may be more effective in supporting robust learning.   


Project 8 

Augmented Reality User Interfaces for Seamless Interaction with Virtual Spaces (JCAD 2010, GIST)

This project explored the effects of AR technology, when combined with a range of 3D prototype applications. 



Project 9 

Modality Fusion during Touch-based Interaction (Samsung Electronics)

The goal of this project is to improve perception and performance during touch-based interaction with personal electronic devices. Specifically, we have identified the appropriate fusion of visual, audio, and haptic cues during fingertip interaction with touch-screen images.



Project 10 

The Quality of HCI in Connected Environments  (KETI, DRAPER) 

This project aims to understand users-on-the-go in connected environments and to improve the quality of their ubiquitous HCI experience by enhancing machine intelligence to be more human-centered.   


Project 5 

What, When, How: A Sensor-based Driver Awareness System to Improve Human-Computer Interaction (CHI LBW 2016, T-SET UTC)

The goal of this project is to help drivers safely interact with ubiquitous HCI demands and benefit from proactive information services in cars. Our prior and ongoing projects primarily explore the ‘interruptive’ feature of ubiquitous HCI demands in cars. We have been rigorously addressing the issues of when to intervene by using our sensor-based assessment technologies that estimate drivers’ cognitive load in near real-time.



Project 6 

Tracking Real-Time Mental Workload during Elementary Cognitive Process (UbiComp 2010, SSCI 2014, QoLT Center)

This project seeks to develop a sensor-based method for tracking variation in cognitive processing loads.


Project 11 

Driver-Aware Interruptions of Dialog-based HCI Demands in Cars (Hyundai, DRAPER) 

This project investigates how dialog-based HCI demands interact with driver interruptibility. We are refining our key technology, obtained from Project 1 in this document, to predict the duration of driver interruptibility. 



Project 12 

Driver-Centered Interaction in Intelligent Automotive Physical Systems (Cyber-Physical Systems, Smart and Autonomous Systems)

This proposed project aims to make machine intelligence-driven physical and non-physical interventions in computer-assisted driving more acceptable and dependable. 
