Welcome to HCIS Lab! 

We conduct interdisciplinary research that combines human-computer interaction (HCI) and artificial intelligence (AI) technologies to design and demonstrate 'human-centered' interactions between humans and physical systems.

We collaborate with experts in computer science, electrical and mechanical engineering, as well as AI, Robotics, and Extended Reality (XR). Our research focuses on understanding how AI-driven contextualization and decision-making impact the user experience. We aim to provide proactive interactions by predicting user state, intent, and context.

Our goal is to address the technical and societal challenges that arise when interacting with soft-robotic systems, automated vehicles, actuated XR platforms, generative AI tools, and intelligent systems for vulnerable populations in our everyday lives. We strive to nurture interdisciplinary talents capable of defining these challenges and providing human-centered computing solutions.

Join Us

We are currently accepting applications for graduate student and undergraduate internship positions at the intersection of HCI, AI & Robotics, XR, Haptics, and Automotive UIs.

To apply, please email me at seungjun@gist.ac.kr with your CV and a document that provides an overview of your work experience, research activities, technical skills, and publication list.

Human-AI Interaction (HAI) & HCI+AI Laboratory? 

The HCIS Lab conducts interdisciplinary research that combines human-computer interaction (HCI) and artificial intelligence (AI) technologies to design and validate 'human-centered' interactions between humans and physical systems.

Researchers from computer science, electrical and electronic engineering, and mechanical engineering, as well as extended reality (XR), artificial intelligence, and soft robotics, collaborate to analyze how AI-driven context awareness and decision-making affect the user's sensory-cognitive mechanisms (sensing > perception > cognition). Building on this analysis, we focus on predicting the user's state, behavior, intent, and context to provide highly fluid interactions.

Our goal is to define the technical and societal problems that arise when interacting with autonomous vehicles, actuated XR platforms, generative AI content, and explainable AI (XAI) tools in our everyday living spaces, and to nurture interdisciplinary talents who can propose 'human-centered' solutions.

ErgoPulse: Electrifying Your Lower Body With Biomechanical Simulation-based Electrical Muscle Stimulation (CHI '24) 🏆

SYNC-VR: Synchronizing Your Senses to Conquer Motion Sickness for Enriching In-Vehicle Virtual Reality (CHI '24) 🏆

LumiMood: A Creativity Support Tool for Designing the Mood of a 3D Scene (CHI '24)

GaitWay: Gait Data-Based VR Locomotion Prediction System Robust to Visual Distraction (CHI LBW '24)

Curving the Virtual Route: Applying Redirected Steering Gains for Active Locomotion in In-Car VR (CHI LBW '24)

The Way of Water: Exploring Interaction Elements in Usability Challenge with In-Car VR Experience (Virtual Reality '24)

Enhancing Seamless Walking in VR: Application of Bone-Conduction Vibration in Redirected Walking (ISMAR '23) 🏆

Giant Finger: A Novel Visuo-Somatosensory Approach to Simulating Lower Body Movements in Virtual Reality (ISMAR '23)

Designing Virtual Agent HMI Depending on the Communication and Anthropomorphism Levels in Augmented Reality (AutoUI '23) 🏆

Multi-Layer Multi-Input Transformer Network (MuLMINet) with Weighted Loss (IJCAI CoachAI Badminton Challenge '23) 🏆

Electrical, Vibrational, and Cooling Stimuli-Based Redirected Walking: Comparison of Various Vestibular Stimulation (CHI '23)

REVES: Redirection Enhancement Using Four-Pole Vestibular Electrode Stimulation (CHI EA '22)

Auditory and Olfactory Stimuli-Based Attractors to Induce Reorientation in VR Forward Redirected Walking (CHI EA '22)

Toward Immersive Self-Driving Simulations: Reports from a User Study across Six Platforms (CHI '20)

A Crowdsourcing System for the Elderly: A Gamified Approach to Speech Collection (CHI EA '20) 

Awards / Grants

Media / News