Postdoctoral research fellow
Education and Academic Experience
Postdoctoral researcher 10/2024 - Present
Department of Biomedical Engineering
City University of Hong Kong
Advisor: Prof. Xinge Yu
Doctor of Philosophy (Ph.D.) 09/2020 - 09/2024
Department of Biomedical Engineering
City University of Hong Kong
Advisor: Prof. Xinge Yu
Bachelor of Engineering (B.Eng.) 08/2016 - 09/2020
Department of Biomedical Engineering
City University of Hong Kong
Research Interests
Human-machine interfaces
Flexible electronics
Haptic feedback
Electrical stimulation
Research Vision
Integration of cross-disciplinary technologies and knowledge
Novel interaction paradigms
Innovative materials and advanced fabrication strategies
Human–Computer Interaction-inspired design principles
Built upon soft, wearable electronics
Toward Advanced Human–Machine Interaction systems enabling:
Next-generation biomedical applications
Human-in-the-loop control of robotic and AI systems
Immersive interaction in Virtual Reality, Metaverse, and humanoid robotics
About my research
My previous research has focused on developing advanced human–machine interaction systems enabled by flexible electronic platforms. These works can be categorized into two main directions:
Epidermal Augmented Human–Machine Interface
Advanced Sensory Feedback Interfaces
The following section introduces several of my research papers, in which I served as first or co-first author.
1) Epidermal Augmented Human–Machine Interface
Combines control and sensory feedback functionalities within epidermal and wearable electronic systems, enabling intuitive operation and rich information transfer during interactions with machines or computers.
The approach aims to augment control performance and reduce the cognitive demands on human operators through human-centered interaction design.
1. Yiu, C.†, Liu, Y.†,... & Yu, X. (2025). Skin-interfaced multimodal sensing and tactile feedback system as enhanced human-machine interface for closed-loop drone control. Science Advances LINK
Skin-interfaced systems enable gesture control and multimodal sensory feedback, enhancing operator performance with low cognitive demand through tangible interaction.
Drone control is achieved via simple yet intuitive wrist-angle rotation by the user.
A 3×3 haptic feedback array on the finger, combined with spatial downsampling across the haptic actuators, translates continuous drone posture information into tactile cues. This improves flight stability under varying aerodynamic conditions while maintaining a low cognitive load.
Obstacle detection in blind areas triggers electrical stimulation when an obstacle is within a dangerous distance. The stimulation induces muscle contraction in the user’s wrist, providing both a warning signal and an assisting corrective force. This achieves a human-in-the-loop decision-making design for potential collision avoidance.
NMES-based force feedback system for flight path correction
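The posture-to-tactile mapping described above can be illustrated with a minimal sketch. The ±10° dead zone, linear binning, and grid indexing here are hypothetical choices for illustration, not the published control scheme:

```python
# Illustrative sketch: spatially downsample continuous drone roll/pitch
# angles onto a 3x3 haptic actuator grid. The +/-10 degree threshold
# and (row, col) indexing are assumptions made for this example.

def tilt_to_bin(angle_deg, threshold=10.0):
    """Quantize a continuous tilt angle into one of three bins: -1, 0, +1."""
    if angle_deg < -threshold:
        return -1
    if angle_deg > threshold:
        return 1
    return 0

def posture_to_actuator(roll_deg, pitch_deg):
    """Return (row, col) of the actuator to drive in the 3x3 array.
    Row encodes pitch, column encodes roll; (1, 1) is the centre
    actuator, signalling level flight."""
    row = tilt_to_bin(pitch_deg) + 1
    col = tilt_to_bin(roll_deg) + 1
    return row, col

# Level flight activates the centre actuator.
print(posture_to_actuator(0.0, 0.0))     # (1, 1)
# A strong right roll with nose-down pitch activates a corner actuator.
print(posture_to_actuator(25.0, -15.0))  # (0, 2)
```

The key idea is that a continuous two-axis posture signal is reduced to one of nine discrete tactile cues, keeping the information rate low enough for the operator to process without added cognitive load.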
2. Liu, Y.†, Yiu, C.†... & Yu, X. (2022). Electronic skin as wireless human-machine interfaces for robotic VR. Science Advances LINK
An embedded electronic platform constructed using flexible circuit techniques.
Integration with strain sensors placed on different parts of the operator's body to capture motion and wirelessly transmit it to a robotic system or prosthetic arm.
Pressure sensors on the prosthetic fingers capture haptic signals, which are then regenerated at the corresponding locations on the human operator via wearable electromagnetic haptic actuators integrated into the flexible embedded platform.
Facilitating haptic sensation during robotic/prosthetic control through flexible and wearable electronics, enabling the operator to perform precise tasks with delicate targets (e.g., wireless saliva sampling on a human subject using a prosthetic arm).
An early attempt at integrating a flexible electronic platform with a prosthetic arm, enabling closed-loop, human-in-the-loop operation that prioritizes human-centered interaction.
The prosthetic arm performs saliva sampling, while haptic feedback ensures safe force application without harming the patient.
Wearable electromagnetic haptic actuators
2) Advanced Sensory Feedback Interfaces
Apart from system-level integration of control and feedback devices for human–machine interaction, the development of advanced sensory feedback technologies is also crucial for achieving highly efficient interaction and immersive experiences.
The goal of this research is to develop systems capable of delivering more realistic and dynamic multimodal feedback, thereby regenerating sensory perception during interactions with computers, robotic platforms, or virtual reality environments.
Haptic feedback
1. Yao, K.†, Zhou, J.†, Huang, Q.†, Wu, M.†, Yiu, C. K.†... & Yu, X. (2022). Encoding of tactile information in hand via skin-integrated wireless haptic interface. Nature Machine Intelligence LINK
A wearable hand patch fabricated using flexible electronic technology, enabling electrotactile feedback on the user’s skin to generate tactile sensations.
Compared to electromagnetic mechanical haptic actuators commonly used in previous human–machine interaction systems, the patch achieves a significantly smaller form factor.
Comprehensive user studies conducted to investigate individual differences under varying stimulation parameters.
Integrated control circuit to regulate current pulses and facilitate wireless communication with external systems.
Interaction with a self-developed virtual reality environment, enabling regeneration of haptic feedback and even the perception of “pain” in a virtual setting.
Generation of a “pain” sensation by the flexible electrotactile hand patch when the user touches a cactus in the VR environment.
Olfactory and gustation feedback
2. Liu, Y.†, Yiu, C. K.†,... & Yu, X. (2023). Soft, miniaturized, wireless olfactory interface for virtual reality. Nature Communications LINK
3. Liu, Y.†, Jia, S.†, Yiu, C. K.†, ... & Yu, X. (2024). Intelligent wearable olfactory interface for latency-free mixed reality and fast olfactory enhancement. Nature Communications LINK
4. Liu, Y.†, Park, W.†, Yiu, C. K.†,... & Yu, X. (2024). Miniaturized, portable gustation interfaces for VR/AR/MR. Proceedings of the National Academy of Sciences LINK
Wearable olfactory feedback system that generates smells during VR interaction
Portable lollipop-shaped gustation interfaces
Controllable release of chemical compounds into the air or onto the user’s tongue to generate olfactory or gustatory feedback.
Concentration control of released compounds at specific moments in time to produce olfactory or gustatory sensations with varying intensity.
Integration with a VR environment, where the release of smell or taste is triggered by virtual context parameters (e.g., wind speed, distance).
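The context-triggered release scheme above can be sketched as follows. The linear distance falloff, wind-speed scaling, and parameter names are illustrative assumptions, not the published control law:

```python
# Illustrative sketch: derive a normalized release concentration for an
# odor compound from VR context parameters. The linear falloff with
# distance and the capped wind-speed boost are hypothetical choices.

def release_concentration(distance_m, wind_speed_ms,
                          max_conc=1.0, cutoff_m=10.0):
    """Return a normalized release concentration in [0, max_conc].
    Closer virtual sources and stronger virtual wind yield higher
    concentrations; sources beyond the cutoff release nothing."""
    if distance_m >= cutoff_m:
        return 0.0
    falloff = 1.0 - distance_m / cutoff_m            # linear distance falloff
    wind_gain = min(1.0 + 0.1 * wind_speed_ms, 2.0)  # mild wind boost, capped
    return min(max_conc, falloff * wind_gain * max_conc)

print(release_concentration(10.0, 0.0))  # 0.0 (beyond cutoff)
print(release_concentration(0.0, 0.0))   # 1.0 (at the virtual source)
```

In a real system this scalar would drive the actuation hardware (e.g., the heating or release rate of the scent generator) each frame as the virtual context updates.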
Future Research Interests
Building on my research vision and previous work, I am particularly interested in the following directions:
Advanced Haptic Feedback Interfaces
Inspired by the development history of visual displays, haptic feedback interfaces with higher resolution, larger effective area, improved wearability, and more realistic stimulation are essential for advancing human–machine interaction in robotics, virtual reality, biomedical applications, and beyond.
Electrical stimulation-based electrotactile and neural interfaces represent a promising route for achieving ultra-high-resolution haptic feedback and for quantifying user perception. However, further studies on human body electrical properties, electrode design, circuit and system-level implementation, as well as overcoming key engineering challenges, are required.
Advanced Human–Machine Interaction Systems
With the development of new control and feedback technologies, advanced human–machine interaction systems can be realized.
Beyond improving operator control or enhancing immersive VR experiences, interaction should be bidirectional—benefiting both the human operator and the machine being controlled.
The wearable nature of epidermal electronics makes them ideal as general interaction platforms with AI or technological devices in daily life.
As humans play a central role in interaction design, the user’s experience, reactions, and perceptions—often underestimated in traditional engineering—should be fully considered. Concepts from human–computer interaction should be integrated into the design of advanced epidermal human–machine interfaces.
Electronic engineering:
Circuit design
Microcontroller (STM32, ATMEGA)
Electronic circuit simulation (LTSpice)
PCB Layout and assembly (EasyEDA, Altium Designer)
Development instrument operation (Oscilloscope, Function generator…)
Digital and Analog signal processing
Programming:
Python
C++
C#
MATLAB
Virtual Reality development:
Oculus development
Unity3D
Design and fabrication:
AutoCAD
Stretchable electronic fabrication
Photolithography
Laser cutting