• Hybrid human-machine interface toward touch-aware bionic (robotic) arm control using EEG and/or EMG, in collaboration with Dr. Ranaweera, University of Peradeniya, Sri Lanka
The goal of this project is to develop a touch-aware bionic (robotic) arm control interface driven by physiological signals (EEG, EMG, etc.). Wearable haptic sensing and feedback technologies are combined with an EEG/EMG-based control system, and their impact is investigated across feedback conditions (visual, haptic, and combined) for tasks such as grasping, exploring, and moving objects.
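As a minimal sketch of the EMG side of such a control loop (all thresholds, the window length, and the single-channel assumption are illustrative, not the project's actual pipeline), a rectified moving-average envelope can be thresholded with hysteresis to turn muscle activity into grasp commands:

```python
# Minimal sketch (assumptions: single-channel EMG samples in arbitrary units;
# thresholds and window size are hypothetical). A rectified moving-average
# envelope is thresholded with hysteresis so noise near the threshold does
# not rapidly toggle the gripper between open and closed.
from collections import deque

class EmgGraspController:
    def __init__(self, window=50, close_thresh=0.6, open_thresh=0.4):
        self.buf = deque(maxlen=window)   # sliding window of rectified samples
        self.close_thresh = close_thresh  # envelope level that closes the hand
        self.open_thresh = open_thresh    # lower level that reopens it
        self.closed = False

    def update(self, sample):
        """Feed one raw EMG sample; return 'close', 'open', or None."""
        self.buf.append(abs(sample))               # full-wave rectification
        envelope = sum(self.buf) / len(self.buf)   # moving-average smoothing
        if not self.closed and envelope > self.close_thresh:
            self.closed = True
            return "close"
        if self.closed and envelope < self.open_thresh:
            self.closed = False
            return "open"
        return None
```

Feeding the controller a burst of high activity followed by rest would yield one "close" and one "open" event; a real system would add band-pass filtering and per-user calibration before any thresholding.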
• Wearable haptic orthosis for hemiparetic patients in collaboration with University of Paris-Est Creteil (UPEC), Paris, France
The goal of this project is to design and develop a new haptic ankle-foot orthosis (AFO) that improves on existing orthosis technologies for rehabilitation. Since 2017, two laboratories, LISSI (Laboratory of Images, Signals and Intelligent Systems) at UPEC and ICTL (Immersive Computing for Touch Laboratory) at Kent State University, OH, have worked closely as one research team to integrate AFO control into a perceptual haptic interface. The project was initiated with a STAR 2017 Korea-France International Joint Research grant. This research focuses on haptic (kinesthetic and tactile) feedback design, adaptive AFO control, and gait cycle analysis with healthy participants and hemiparetic patients.
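A basic building block of gait-cycle analysis is segmenting a plantar force trace into strides. The sketch below (the contact threshold and sampling setup are assumptions for illustration, not the project's method) detects heel strikes as rising threshold crossings and derives stride times:

```python
# Minimal sketch (assumptions: plantar force samples in newtons at a fixed
# sample rate; 'contact_thresh' is a hypothetical contact threshold).
# Heel strikes are rising crossings of the threshold; the interval between
# consecutive heel strikes of the same foot is the stride time.
def heel_strike_times(force, rate_hz, contact_thresh=20.0):
    """Return heel-strike timestamps (in seconds) from a force trace."""
    strikes = []
    in_contact = False
    for i, f in enumerate(force):
        if not in_contact and f >= contact_thresh:
            in_contact = True
            strikes.append(i / rate_hz)    # rising crossing -> heel strike
        elif in_contact and f < contact_thresh:
            in_contact = False             # falling crossing -> toe off
    return strikes

def stride_times(strikes):
    """Intervals between consecutive heel strikes (seconds)."""
    return [b - a for a, b in zip(strikes, strikes[1:])]
```

Stride-time variability computed this way is one common summary statistic when comparing healthy and hemiparetic gait.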
• Haptic assistive technology for the rehabilitation of Parkinson's Disease
Prior studies by Dr. Angela Ridgel (KSU) demonstrated that rehabilitation with a button-pressing task significantly reduces tremor in Parkinson's Disease (PD) patients. The goal of this research, in collaboration with the Physiology department, is to develop a variable, programmable training system that uses VR and haptic devices to reduce tremor. We developed a prototype unimanual haptic interface for upper limb rehabilitation and are now extending it with Mixed Reality and haptic gloves toward a bimanual, immersive haptic interface.
Wearable hand exoskeleton systems have been developed with virtual reality (VR) and force feedback for upper limb rehabilitation. However, force feedback alone does not provide intuitive guidance or realistic feedback for reaching and grasping tasks. To address this, we are developing a vibrotactile feedback interface that is both intuitive and portable to existing systems such as VR setups and exoskeleton haptic gloves. Piezoelectric or LRA (linear resonant) actuators deliver various patterns of tactile feedback to the fingertips or hand while the user interacts with virtual objects.
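To illustrate what one such tactile pattern might look like (the 175 Hz resonance, sample rate, and burst shape below are generic assumptions about LRA-style actuators, not the project's actual drive parameters), a short "tap" cue can be built as a resonant carrier under a smooth envelope:

```python
# Minimal sketch (assumption: an LRA driven near a nominal 175 Hz resonance;
# the sample rate and pattern shape are illustrative, not from the project).
# Builds one amplitude-modulated burst -- a brief "tap" cue for a fingertip.
import math

def tap_burst(duration_s=0.05, carrier_hz=175.0, rate_hz=8000):
    """Return drive samples in [-1, 1]: a resonant carrier under a
    raised-cosine (Hann) envelope, so the burst starts and ends smoothly."""
    n = int(duration_s * rate_hz)
    samples = []
    for i in range(n):
        t = i / rate_hz
        env = 0.5 * (1 - math.cos(2 * math.pi * i / (n - 1)))  # Hann window
        samples.append(env * math.sin(2 * math.pi * carrier_hz * t))
    return samples
```

Varying the burst duration, carrier amplitude, and inter-burst spacing per fingertip is one simple way to encode distinct contact events (touch, slip, release) during interaction with virtual objects.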
• Improving Engagement and Retention in Upper Limb Parkinson’s Rehabilitation and Education with Game Design Integrated into Haptic Feedback and Extended Reality (XR)
The goals of this ongoing research project are (1) to build a prototype of an immersive game interface for upper limb rehabilitation in individuals whose upper limb motor function is impaired by Parkinson’s disease, using haptic feedback combined with virtual reality as a more engaging way to encourage patients to perform required exercises and train hand motor skills; (2) to validate the usability and efficacy of the game system for upper limb rehabilitation; (3) to investigate how gamification enhances training or education; and (4) to conduct transformative research by extending the idea to (STEM) education and training. A haptic stylus pen (Phantom Premium 1.5 or Geomagic Touch) serves as the main method of interacting with the game interface and the main means of providing resisting force during the exercises, while an XR (VR or MR) headset makes the game experience more engaging and immersive. To collect data with the prototype system, participants complete predefined task sessions and provide usability ratings afterward. Participants’ skills and efficacy are assessed using task-specific data (task completion time, hand motion tracking, score within the time limit, difficulty level, etc.) automatically recorded by the system for each session. Following our iterative development cycle, the system is improved and refined with the collected user data.
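The per-session metrics listed above could be recorded with a structure along these lines (every field and class name here is illustrative, not the project's actual schema):

```python
# Minimal sketch (all names are hypothetical, not the project's schema).
# Records the per-session task metrics the text lists -- completion time,
# score within the time limit, and difficulty level -- for later analysis.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class TaskSession:
    participant: str        # anonymized participant ID
    difficulty: int         # difficulty level of the session
    completion_time_s: float
    score: int              # task score within the time limit

@dataclass
class SessionLog:
    sessions: list = field(default_factory=list)

    def record(self, session):
        self.sessions.append(session)

    def mean_time(self, participant):
        """Average completion time for one participant across sessions."""
        times = [s.completion_time_s for s in self.sessions
                 if s.participant == participant]
        return mean(times) if times else None
```

Tracking such summaries per participant across sessions is what makes the iterative refinement loop described above measurable rather than anecdotal.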