The next generation of neuroprosthetic devices demands control systems that are accurate, adaptive, and seamlessly integrated with the user. This workshop brings together leading researchers at the intersection of neuroscience, robotics, and artificial intelligence to explore how diverse sensing modalities and advanced machine learning can be fused into robust, human-centric control architectures.
Multi-Modal User Intent Detection:
Combining EMG, EEG, mechanomyography, ultrasound, IMUs, and other biosignals for accurate, robust, context-aware decoding.
Continuous and Intuitive Control:
Moving beyond discrete gestures to smooth, adaptive motion via regression, hybrid, and reinforcement learning approaches.
Physics-Informed and Model-Driven Machine Learning:
Applying bio-mechanical models, musculoskeletal dynamics, and energy constraints to improve generalization, reduce data needs, and ensure safe closed-loop control.
Compliance as a Control Asset:
Leveraging the inherent compliance of soft robotic hardware, and models of that compliance, to simplify control, improve stability, and enable accurate state estimation.
Ethics, Safety, and Responsible Deployment:
Addressing data privacy, bias, and safety requirements for continuous, AI-driven neuroprosthetic control.
Imperial College London, UK
Technical University of Munich, Germany
Imperial College London, UK
University of Pisa, Italy
University of North Carolina at Chapel Hill, NC, USA
New York University (NYU), USA
New York University Abu Dhabi (NYUAD), UAE
Shanghai Jiao Tong University, China
New York University Abu Dhabi (NYUAD), UAE