| Research Fields

R&D PROJECTS

Categories:  HCI+Mobility    Accessibility    Locomotion    NUI    VR/AR    Education/Training    Haptics    Gamification    HCI+AI  

 Education/Training   HCI+AI 

Counterfactual Explanation-Based Badminton Motion Guidance Generation Using Wearable Sensors

 Haptics   VR/AR 

Dual-sided Peltier Elements for Rapid Thermal Feedback in Wearables


 Accessibility   Haptics   VR/AR 

WatchCap: Improving Scanning Efficiency in People with Low Vision through Compensatory Head Movement Stimulation

 Locomotion   HCI+AI   VR/AR 

GaitWay: Gait Data-Based VR Locomotion Prediction System Robust to Visual Distraction 


 HCI+Mobility   Locomotion   VR/AR 

Curving the Virtual Route: Applying Redirected Steering Gains for Active Locomotion in In-Car VR

 Haptics   Locomotion   VR/AR 

ErgoPulse: Electrifying Your Lower Body With Biomechanical Simulation-based Electrical Muscle Stimulation Haptic System in VR

 HCI+Mobility   Haptics   VR/AR 

SYNC-VR: Synchronizing Your Senses to Conquer Motion Sickness for Enriching In-Vehicle Virtual Reality

 HCI+AI 

LumiMood: A Creativity Support Tool for Designing the Mood of a 3D Scene


 HCI+Mobility   VR/AR 

The Way of Water: Exploring the Role of Interaction Elements in Usability Challenges with In-Car VR Experience

 Education/Training 

MultiSenseBadminton: Wearable Sensor–Based Biomechanical Dataset for Evaluation of Badminton Performance

 Accessibility   Education/Training 

Engagnition: A Multi-Dimensional Dataset for Engagement Recognition of Children with Autism Spectrum Disorder

 Locomotion   VR/AR 

Effect of Optical Flow and User VR Familiarity on Curvature Gain Thresholds for Redirected Walking

 Accessibility   NUI   Locomotion   VR/AR 

Enhancing Wayfinding Experience in Low-Vision Individuals through a Tailored Mobile Guidance Interface

 NUI 

A Study with Portal Display, a Head Pose-Responsive Video Teleconferencing System

 Locomotion   VR/AR 

Enhancing Seamless Walking in Virtual Reality: Application of Bone-Conduction Vibration in Redirected Walking

 HCI+Mobility   HCI+AI   VR/AR 

What and When to Explain? On-Road Evaluation of Explanations in Highly Automated Vehicles


 HCI+Mobility   VR/AR 

Designing Virtual Agent Human-Machine Interfaces Depending on the Communication and Anthropomorphism Levels in Augmented Reality

 Education/Training   HCI+AI 

Multi-Layer Multi-Input Transformer Network (MuLMINet) with Weighted Loss

 HCI+Mobility   VR/AR 

Assessing the Impact of AR HUDs and Risk Level on User Experience in Self-Driving Cars: Results from a Realistic Driving Simulation

 Locomotion   NUI   Haptics   VR/AR 

Giant Finger: A Novel Visuo-Somatosensory Approach to Simulating Lower Body Movements in Virtual Reality

 HCI+AI 

Simulating Urban Element Design with Pedestrian Attention: Visual Saliency as Aid for More Visible Wayfinding Design

 Education/Training   VR/AR 

Logogram VR: Treadmill-Coupled VR with Word Reflective Content for Embodied Logogram Learning 

 Locomotion   Haptics   VR/AR 

Electrical, Vibrational, and Cooling Stimuli-Based RDW: Comparison of Various Vestibular Stimulation-Based Redirected Walking Systems 

 Locomotion   VR/AR 

Evaluation of Visual, Auditory, and Olfactory Stimulus-Based Attractors for Intermittent Reorientation in Virtual Reality Locomotion

 HCI+Mobility   VR/AR 

Take-Over Requests after Waking in Autonomous Vehicles


 HCI+Mobility   NUI 

Naturalistic Ways for Drivers to Intervene in the Vehicle System while Performing Non-Driving Related Tasks

 HCI+Mobility   NUI 

Gaze-Head Input Examining Potential Interaction with Immediate Experience Sampling in an Autonomous Vehicle

 Gamification 

Cultural Heritage Design Element Labeling System With Gamification


 HCI+Mobility   NUI 

A Cascaded Multimodal Natural User Interface to Reduce Driver Distraction


 Gamification 

Designing a Crowdsourcing System for the Elderly: A Gamified Approach to Speech Collection 

 HCI+Mobility   VR/AR 

Toward Immersive Self-Driving Simulations: Reports from a User Study across Six Platforms 


 HCI+Mobility   VR/AR 

A New Approach to Studying Sleep in Autonomous Vehicles: Simulating the Waking Situation

 HCI+Mobility   VR/AR 

MAXIM: Mixed-reality Automotive driving XIMulation


| Project Archive (2009 - 2016, CMU)

Project 1  

Sensor-based Assessment of In-Situ Driver Interruptibility (CHI 2015, T-SET UTC)

The goal of this project is to create an in-car information system that adapts the timing of HCI demands to drivers based on in-situ models of driving context and cognitive load, supporting safe navigation.



Project 2 - HVI for the Elderly

Cognitive Mapping Aid for Elderly Navigation (CHI 2009, MTAP 2016, QoLT Center)

In this project, we explored a novel navigation display system that uses an augmented reality (AR) projection to minimize cognitive distance by overlaying driving directions on the windshield and road.



Project 3 - HVI for the Elderly

Multimodal Route Guidance and Its Reversal Effects, Elder vs. Younger (Pervasive 2012, MTAP 2016, QoLT Center)

While in-car navigation systems enhance situational awareness, they also increase drivers’ visual distraction and cognitive load. This project explores the efficacy of multi-modal route guidance cues for ‘safer’ driving. 



Project 4 - HVI for the Elderly

Aesthetics and Usability of Automotive User Interfaces for Elder Drivers (CHI 2010, QoLT Center)

The purpose of this project was to design features for car dashboard displays that are both functional and aesthetically pleasing.  


Project 5

What, When, How: A Sensor-based Driver Awareness System to Improve Human-Computer Interaction (CHI LBW 2016, T-SET UTC)

The goal of this project is to help drivers safely interact with ubiquitous HCI demands and benefit from proactive information services in cars. Our prior and ongoing projects primarily explore the ‘interruptive’ nature of ubiquitous HCI demands in cars; we have been rigorously addressing the issue of when to intervene by using our sensor-based assessment technologies, which estimate drivers’ cognitive load in near real time.


Project 6

Tracking Real-Time Mental Workload during Elementary Cognitive Process (UbiComp 2010, SSCI 2014, QoLT Center)

This project seeks to develop a sensor-based method for tracking variation in cognitive processing loads.


Project 7

Adaptive Cyber-learning with a Sensor Support (CHI WIP 2014, QoLT Center)

This project aims to better support student learning by adapting computer-based tutoring to individual learning phases and real-time capabilities. In this manner, computer-based tutors may be more effective in supporting robust learning.


Project 8

Augmented Reality User Interfaces for Seamless Interaction with Virtual Spaces (JCAD 2010, GIST)

This project explored the effects of AR technology combined with a range of 3D prototype applications.


Project 9

Modality Fusion during Touch-based Interaction (Samsung Electronics)

The goal of this project is to improve perception and performance during touch-based interaction with personal electronic devices. Specifically, we identified the appropriate fusion of visual, audio, and haptic cues during fingertip interaction with touch-screen images.


Project 10

The Quality of HCI in Connected Environments (KETI, DRAPER)

This project aims to understand users on the go in connected environments and to improve the quality of their ubiquitous HCI experience by making machine intelligence more human-centered.


Project 11

Driver-Aware Interruptions of Dialog-based HCI Demands in Cars (Hyundai, DRAPER)

This project investigates how dialog-based HCI demands interact with driver interruptibility. We are refining our key technology, obtained from Project 1 above, to predict the duration of driver interruptibility.


Project 12 

Driver-Centered Interaction in Intelligent Automotive Physical Systems (Cyber-Physical Systems, Smart and Autonomous Systems)

This project aims to make machine intelligence-driven physical and non-physical interventions in computer-assisted driving more acceptable and dependable.
