Most mobile devices equipped with touchscreens provide an on-screen soft keyboard as an input method. However, many users experience discomfort due to the lack of physical feedback, which leads to slower and more error-prone typing than on a physical keyboard. To address this problem, platform-independent haptic soft keyboard technologies suitable for tablet-sized touchscreens have been proposed and developed. The evaluation results confirm platform independence, fast tactile key-click feedback, and uniform tactile force distribution across the touchscreen using only piezoelectric actuators. This research includes perception-based tactile signal design, piezo actuator design, microcontroller-based haptic driver development, real-time key-click feedback, and typing performance measurement. Our ongoing focus is improving the haptic soft keyboard using multiple piezoelectric actuators powered by a mobile battery.
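A perception-based key-click pulse is typically a short, rapidly decaying vibration burst. The sketch below shows one way such a pulse could be synthesized before being handed to a piezo haptic driver; the frequency, duration, and decay values are illustrative assumptions, not the parameters used in this work.

```python
import numpy as np

def key_click_waveform(freq_hz=200.0, duration_ms=10.0, sample_rate=8000, decay=400.0):
    """Generate a short exponentially decaying sinusoid as a key-click pulse.

    All parameter values here are placeholders; perception-based designs
    derive them from user studies rather than fixed constants.
    """
    t = np.arange(0.0, duration_ms / 1000.0, 1.0 / sample_rate)
    return np.exp(-decay * t) * np.sin(2.0 * np.pi * freq_hz * t)

# Example: scale one pulse to a signed 16-bit range for a hypothetical DAC stage
pulse = key_click_waveform()
dac_samples = np.int16(pulse / np.max(np.abs(pulse)) * 32767)
```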
Vision-based hand gesture interaction is natural and intuitive when interacting with computers, since we naturally use gestures to communicate with other people. However, users commonly suffer discomfort and fatigue when using gesture-controlled interfaces due to the lack of physical feedback. To address this problem, we propose a complete hand gesture control system that delivers immersive tactile feedback to the user's hand. To this end, we first developed a fast and accurate hand-tracking algorithm with a Kinect sensor using the proposed modified local binary pattern (MLBP), which can efficiently analyze 3D shapes from depth images. The superiority of our tracking method was verified in terms of tracking accuracy and speed by comparison with existing methods: Natural Interaction Technology for End-user (NITE), 3D Hand Tracker, and CamShift. As the second step, a new tactile feedback technology based on a piezoelectric actuator was developed and integrated with the hand-tracking algorithm and a dynamic time warping (DTW) gesture recognition algorithm to form a complete immersive gesture control system. Quantitative and qualitative evaluations of the integrated system were conducted with human subjects, and the results demonstrate that our gesture control with tactile feedback is a promising technology compared to conventional vision-based gesture control systems, which typically provide no feedback on the user's gestures.
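To illustrate the template-matching idea behind DTW-based gesture recognition, the following is a minimal sketch of the standard DTW distance between two trajectories and a nearest-template classifier. The feature representation (per-frame hand positions from the tracker) and the template dictionary are assumptions made for the example, not necessarily the features used in the actual recognizer.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Classic dynamic time warping distance between two trajectories.

    Each sequence is an (N, D) array of feature vectors, e.g. 2D or 3D
    hand positions sampled over time.
    """
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

def recognize(gesture, templates):
    """Return the label of the template with the smallest DTW distance."""
    return min(templates, key=lambda label: dtw_distance(gesture, templates[label]))

# Example usage with hypothetical 2D trajectories
templates = {"swipe_right": np.array([[0, 0], [1, 0], [2, 0]], dtype=float),
             "swipe_up":    np.array([[0, 0], [0, 1], [0, 2]], dtype=float)}
observed = np.array([[0.0, 0.0], [0.9, 0.1], [2.1, -0.1]])
print(recognize(observed, templates))  # -> "swipe_right"
```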
We propose a new haptic-assisted virtual cane system operated by a simple finger-pointing gesture. The system is developed in two stages: development of a visual information delivery assistant (VIDA) with a stereo camera, and addition of a tactile feedback interface with dual actuators for guidance and distance feedback. In the first stage, the user's pointing finger is automatically detected using color and disparity data from stereo images, and the 3D pointing direction of the finger is then estimated from its geometric and textural features. Finally, any object within the estimated pointing trajectory in 3D space is detected and its distance is estimated in real time. In the second stage, identifiable tactile signals are designed through a series of identification experiments, and an identifiable tactile feedback interface is developed and integrated into the VIDA system. Our approach differs in that navigation guidance is triggered by a simple finger-pointing gesture and the tactile distance feedback is fully identifiable to blind users.
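The core geometric steps of such a pipeline can be sketched as follows: recovering depth from stereo disparity, marching along the estimated pointing ray to find the first object, and mapping the object distance to a tactile pulse rate. The function names, thresholds, and the distance-to-pulse mapping below are illustrative assumptions; the actual VIDA pipeline and its tactile encoding are as described above.

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Standard stereo relation: depth Z = f * B / d, with disparity d in pixels."""
    d = np.where(disparity > 0, disparity, np.nan)
    return focal_px * baseline_m / d

def first_hit_along_ray(point_cloud, origin, direction, radius=0.05,
                        max_range=5.0, step=0.02):
    """March along the pointing ray and return the distance to the first scene
    point that lies within `radius` of the ray, or None if nothing is hit.

    point_cloud: (N, 3) array of scene points in metres.
    origin, direction: fingertip position and pointing vector.
    Thresholds are placeholders, not values from this work.
    """
    direction = direction / np.linalg.norm(direction)
    for t in np.arange(0.1, max_range, step):
        probe = origin + t * direction
        if np.any(np.linalg.norm(point_cloud - probe, axis=1) < radius):
            return t
    return None

def distance_to_pulse_interval(distance_m, near=0.5, far=3.0):
    """Map obstacle distance to a tactile pulse interval: closer means faster pulses."""
    if distance_m is None:
        return None  # nothing detected along the pointing trajectory
    x = np.clip((distance_m - near) / (far - near), 0.0, 1.0)
    return 0.1 + 0.9 * x  # seconds between pulses; purely illustrative mapping
```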