Our lab develops novel technologies that make better use of human-generated signals, including brain activity, eye movements, hand motion, gait, posture, and cognitive load. We use this information to build natural interfaces between humans and machines and to support health evaluation and monitoring.
Application areas include:
intelligent interfaces for prosthetic control,
human-robot interaction,
rehabilitation,
exercise monitoring,
surgical simulation and training.
We envision a world where multiple signals from muscular, eye-tracking, and brain-activity sensors improve the performance of human-machine interfaces. Our researchers are currently developing affordable, next-generation interfaces for prosthetic hand control, which we envision will enable amputees to use a prosthetic hand as naturally and effortlessly as an intact one.
Our research in this area focuses on integrating computer vision, muscular sensing, haptics, and machine learning to provide a better quality of life for people living with disabilities.
An exemplar work: Autonomous prosthetic hand control
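To make the idea of muscular-signal-driven control concrete, here is a minimal sketch of intent recognition from surface EMG: windowed root-mean-square amplitude features fed to a nearest-centroid classifier. This is an illustrative toy, not our lab's actual pipeline; all names and the synthetic data are hypothetical.

```python
import numpy as np

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per sEMG channel (rows = samples, cols = channels)."""
    return np.sqrt(np.mean(window ** 2, axis=0))

class NearestCentroidGestureClassifier:
    """Labels an sEMG window with the gesture whose feature centroid is closest."""

    def fit(self, windows, labels):
        feats = np.array([rms_features(w) for w in windows])
        self.labels_ = sorted(set(labels))
        self.centroids_ = np.array(
            [feats[np.array(labels) == g].mean(axis=0) for g in self.labels_]
        )
        return self

    def predict(self, window) -> str:
        dists = np.linalg.norm(self.centroids_ - rms_features(window), axis=1)
        return self.labels_[int(np.argmin(dists))]

# Synthetic demo: "rest" windows have low amplitude, "grasp" windows high amplitude.
rng = np.random.default_rng(0)
rest = [rng.normal(0, 0.1, (200, 4)) for _ in range(10)]
grasp = [rng.normal(0, 1.0, (200, 4)) for _ in range(10)]
clf = NearestCentroidGestureClassifier().fit(rest + grasp, ["rest"] * 10 + ["grasp"] * 10)
print(clf.predict(rng.normal(0, 1.0, (200, 4))))  # high-amplitude window -> "grasp"
```

Real systems replace the toy classifier with richer features and models, and fuse the EMG decision with computer-vision cues about the object being grasped.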
Precise estimation of human movement, such as gait parameters, postures, hand gestures, and kinetic forces, has important applications in rehabilitation and recovery, sports performance evaluation and injury prevention, prosthetics and orthotics, and human-machine interaction.
Advanced wearable sensors, signal processing, and machine learning algorithms offer great potential to make such measurements more accessible and affordable: they allow data collection in outdoor and daily-living settings without heavy, specialized equipment. Our research targets robust, accurate real-time estimation using fewer sensors and less cumbersome systems.
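As a simple illustration of wearable-based gait estimation, the sketch below counts steps as thresholded peaks in vertical acceleration from a single body-worn accelerometer, with a refractory interval to avoid double counting. The threshold, interval, and synthetic signal are assumptions for the demo, not parameters from our studies.

```python
import numpy as np

def count_steps(accel: np.ndarray, fs: float, threshold: float = 1.5,
                min_interval_s: float = 0.3) -> int:
    """Count steps as local maxima above `threshold` in vertical acceleration
    (m/s^2, gravity removed), with at least `min_interval_s` between steps."""
    min_gap = int(min_interval_s * fs)
    steps, last = 0, -min_gap
    for i in range(1, len(accel) - 1):
        is_peak = (accel[i] > threshold
                   and accel[i] >= accel[i - 1]
                   and accel[i] >= accel[i + 1])
        if is_peak and i - last >= min_gap:
            steps += 1
            last = i
    return steps

# Synthetic gait: 2 steps/s for 5 s sampled at 100 Hz, as positive half-sine bursts.
fs = 100.0
t = np.arange(0, 5, 1 / fs)
accel = np.maximum(0.0, 3.0 * np.sin(2 * np.pi * 2.0 * t))
print(count_steps(accel, fs))  # 10 bursts -> 10 steps
```

From detected step times, stride time, cadence, and their variability follow directly, which is how lightweight wearables can stand in for lab-grade motion capture in daily-living settings.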
Humans are often involved in situations that require a great deal of acute awareness and mental capacity. Objectively measuring task workload in highly demanding occupations, such as surgery or equipment operation, is important for performance and safety. Electroencephalography (EEG) is a traditional tool for mental workload assessment. We study a novel, easier-to-use eye-tracking technique to reveal mental workload during these tasks and to predict when an operator may be experiencing task overload. To gain a broader understanding of this type of stress, we employ both EEG and eye-tracking (including pupil dilation) to measure mental workload in both individual and team-based work settings.
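One common way pupil dilation is turned into a workload measure is task-evoked pupillary response: mean pupil diameter during the task minus a pre-task baseline. The sketch below shows that computation on a synthetic trace; the window lengths and values are hypothetical, chosen only to illustrate the idea.

```python
import numpy as np

def pupil_workload_index(pupil_mm: np.ndarray, fs: float,
                         baseline_s: float = 2.0) -> float:
    """Task-evoked pupil dilation: mean task-period diameter minus the mean of a
    pre-task baseline window, in mm. Larger values suggest higher mental workload."""
    n_base = int(baseline_s * fs)
    baseline = pupil_mm[:n_base].mean()
    return float(pupil_mm[n_base:].mean() - baseline)

# Synthetic trace: 2 s baseline near 3.0 mm, then an 8 s task period dilated to ~3.4 mm.
fs = 60.0
rng = np.random.default_rng(1)
trace = np.concatenate([3.0 + rng.normal(0, 0.01, int(2 * fs)),
                        3.4 + rng.normal(0, 0.01, int(8 * fs))])
idx = pupil_workload_index(trace, fs)
print(round(idx, 2))  # ~0.4 mm of task-evoked dilation
```

In practice such an index is computed per task segment and compared across conditions or operators, alongside EEG-derived workload markers.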