Smart prosthetics are no longer passive tools—they respond to intent, adapt to the environment, and offer customization through apps or controllers. But:
Many users struggle to understand or trust what their devices are doing.
Designers often focus on function, not the experience.
Clinical and engineering teams lack a shared framework for evaluating interaction.
This leads to frustration, underuse, or even abandonment of otherwise advanced technologies.
I approached this as a human-centered systems challenge and asked:
What kinds of interactions exist between wearers and their prostheses?
How can we define usability goals across physical and digital touchpoints?
What framework can guide teams in designing and evaluating these experiences?
I introduced a multi-layered model to evaluate and improve how users interact with intelligent lower-limb prostheses. Here’s how it breaks down:
1. Two Types of Interaction
Direct interaction: Happens during movement (e.g., walking, balancing). It’s continuous and often unconscious.
Indirect interaction: Happens through external interfaces like mobile apps or switches—when users adjust modes or settings.
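To make this split concrete for evaluation or logging tooling, here is a minimal Python sketch of how interaction events could be tagged by type. The names (InteractionType, InteractionEvent) and fields are my own illustration, not a schema from the published framework:

```python
# Illustrative only: these names are invented for the example.
from dataclasses import dataclass
from enum import Enum, auto


class InteractionType(Enum):
    DIRECT = auto()    # continuous, embodied (walking, balancing)
    INDIRECT = auto()  # discrete, interface-mediated (app, switch)


@dataclass
class InteractionEvent:
    """A single wearer-device interaction, tagged for later analysis."""
    kind: InteractionType
    description: str
    conscious: bool  # direct interactions are often unconscious


events = [
    InteractionEvent(InteractionType.DIRECT,
                     "stance-phase support on a slope", conscious=False),
    InteractionEvent(InteractionType.INDIRECT,
                     "switched to stair-descent mode in the app", conscious=True),
]

# Evaluation tooling can then apply different usability criteria per type.
direct_events = [e for e in events if e.kind is InteractionType.DIRECT]
```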
2. Three Human Factors Dimensions
I organized influencing factors into a simplified systems model:
Wearer (their goals, knowledge, and physical/cognitive state)
Device (its automation, transparency, and responsiveness)
Task (context-specific demands like walking on slopes or navigating stairs)
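One way to operationalize this systems model is as a shared data structure that clinical and engineering teams can both point at. The sketch below is a minimal assumption-laden version; every field name is illustrative, not taken from the paper:

```python
# A minimal data-model sketch of the Wearer-Device-Task systems model.
# All field names are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class Wearer:
    goals: list[str]        # e.g., ["commute safely", "hike short trails"]
    device_knowledge: str   # e.g., "novice" or "experienced"
    fatigue_level: float    # 0.0 (rested) .. 1.0 (exhausted)


@dataclass
class Device:
    automation_level: str   # e.g., "mode switching is fully automatic"
    transparent: bool       # does it surface what it is doing?
    response_latency_ms: int


@dataclass
class Task:
    context: str            # e.g., "walking on a slope", "navigating stairs"
    demand: str             # "low" | "moderate" | "high"


@dataclass
class InteractionContext:
    """One evaluation unit: a wearer using a device on a task."""
    wearer: Wearer
    device: Device
    task: Task
```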
3. Usability Principles Mapped to Each Interaction
Borrowing from UX best practice (Nielsen's usability heuristics), I mapped each interaction type to the usability qualities it needs most:
1. Direct interaction (e.g., walking):
Needs to feel intuitive and seamless.
Prior knowledge (e.g., experience with earlier devices) helps, but design can't rely on it alone.
Transparency (e.g., real-time feedback) improves user confidence.
2. Indirect interaction (e.g., configuring settings):
Should prioritize learnability and error prevention.
Interfaces must guide users through complex settings in simple ways.
Personalization features can increase satisfaction, but only if users understand them.
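This mapping can be captured directly as an evaluation aid. In the sketch below, the heuristic names are Nielsen's; the pairing reflects the priorities above, and the helper function is purely illustrative:

```python
# Sketch of the interaction-type -> usability-priority mapping.
# Heuristic names are Nielsen's; the pairing is my framing of the
# priorities described above.
USABILITY_PRIORITIES = {
    "direct": [
        "visibility of system status",              # real-time feedback while moving
        "match between system and the real world",  # movement should feel natural
    ],
    "indirect": [
        "error prevention",                    # settings mistakes are costly
        "recognition rather than recall",      # guide users through complex menus
        "flexibility and efficiency of use",   # personalization, once understood
    ],
}


def review_checklist(interaction_type: str) -> list[str]:
    """Return the heuristics an evaluator should weight most heavily."""
    return USABILITY_PRIORITIES[interaction_type]


print(review_checklist("indirect"))
```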
From this mapping, a few design takeaways stand out:
Don't assume users "get it." Smart systems need to explain themselves.
Match the interface to the context. Real-time walking vs. menu navigation are different tasks—design accordingly.
Build for trust and adaptability. Transparency and feedback loops matter; the sketch after this list shows one form that explanation can take.
Involve users early. Design research must include the people who rely on these tools daily.
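As a hypothetical illustration of the transparency principle: a device that switches modes autonomously could surface the change and its reason instead of acting silently. The function notify_wearer and the mode names below are invented for the example:

```python
# Hypothetical sketch: a device "explaining itself" when it acts autonomously.
def notify_wearer(message: str) -> None:
    print(f"[companion app] {message}")  # stand-in for a push notification


def on_mode_change(old_mode: str, new_mode: str, reason: str) -> None:
    """Surface autonomous behavior instead of switching modes silently."""
    notify_wearer(
        f"Switched from {old_mode} to {new_mode} because {reason}. "
        "Tap to undo or adjust sensitivity."
    )


on_mode_change("level-ground walking", "stair descent",
               "a downward step was detected")
```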
The framework I developed has since been used to:
Guide design discussions with developers of next-gen prosthetic interfaces.
Shape evaluation criteria for lab-based and in-the-field testing of smart prostheses.
Support onboarding and training for new users by aligning system design with mental models.
This work also laid the foundation for creating design guidelines that extend beyond prosthetics—into any assistive tech where users interact with autonomous systems.
Bai, X., Yuan, J., Liu, M., Huang, H., & Feng, J. (2024). Human factors considerations of interaction between wearers and intelligent lower-limb prostheses: a prospective discussion. Journal of NeuroEngineering and Rehabilitation.