Research

I am currently employed as a Research Scientist in the Department of Haptic Intelligence at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany. Prior to that I conducted research at Yale University's GRAB Lab, the Bristol Robotics Laboratory, The University of Bristol, The Pervasive Media Studio (Bristol, UK) and The University of Reading.

My specialisms revolve around movement and sensing in natural and artificial hand/arm systems, whether that involves designing interfaces to communicate with the fingers and hands via touch, or observing and characterising the manipulation strategies of upper limb amputees.

The individual project pages are currently incomplete, but I am working on them. Academic papers related to these projects may be found on my Publications page.

Latest Projects

Variable Friction Robotic Fingers for Within-Hand Manipulation

Humans perform highly dexterous manipulation actions frequently and with little effort. A task such as picking up a pen for writing involves not just securely grasping the pen, but also moving it within the hand into a good position for writing. Such within-hand manipulation is greatly aided by the unique frictional characteristics of our fingertips, which allow us to grip objects when high forces are applied but slide over them when forces are low.

In this research project I developed a new design of robotic fingers, which provide a mechanical analogy to the complex mechanics behind the frictional profiles of human fingertips. These Variable Friction (VF) fingers, which are cheap and easy to build, allow a very simple, low-dexterity hand to slide and rotate geometric objects within its grasp. Watch a video here.
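The core principle can be summarised as a mapping from a desired manipulation primitive to the friction state of each finger. The sketch below is only an illustration of that idea; the primitive names and the two-state friction model are my shorthand here, not the interface of the actual controller described in the paper.

```python
# Illustrative sketch only: primitive names and the two-state friction model
# are assumptions for explanation, not the published controller.

HIGH, LOW = "high-friction", "low-friction"

# Mapping from a desired within-hand manipulation primitive to the friction
# state of the (left, right) fingers while they squeeze and move.
PRIMITIVES = {
    "secure_grasp": (HIGH, HIGH),  # both surfaces grip -> object held rigidly
    "slide":        (LOW, LOW),    # both surfaces slip -> object translates in hand
    "pivot_left":   (HIGH, LOW),   # object rotates about the gripping (left) contact
    "pivot_right":  (LOW, HIGH),   # object rotates about the gripping (right) contact
}

def set_finger_friction(primitive: str) -> tuple[str, str]:
    """Return the (left, right) finger friction states for a primitive."""
    if primitive not in PRIMITIVES:
        raise ValueError(f"Unknown manipulation primitive: {primitive}")
    return PRIMITIVES[primitive]

if __name__ == "__main__":
    for p in PRIMITIVES:
        print(p, "->", set_finger_friction(p))
```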

IEEE Spectrum wrote a nice article about our 2018 RA-L / IROS paper. Click here to check it out.

Shape Changing Haptic Navigation Interfaces

These devices provide an alternative form of indoor and outdoor guidance, exploiting the natural human ability to perceive shape through touch as a non-distracting alternative to visual, audio or vibrotactile navigation cues.

I am responsible for the mechanical, electronic and software design of several devices, which have been shown to work effectively for both sighted and vision-impaired persons in a variety of scenarios, including outdoor navigation of a crowded college campus and indoor navigation within a pitch-black immersive theatre installation.

Studying Upper Limb Prosthesis Use with Head Mounted Cameras to Inform Device Design

The capabilities of upper-limb prosthetic devices fall far behind those of an intact limb. Limited mobility, proprioception and haptic feedback all severely limit the functionality of even the most sophisticated terminal devices. This leads amputees to improvise manipulation strategies.

In this ongoing work I use head-mounted GoPro cameras to record many hours of footage of amputees using their prosthetic devices to complete domestic tasks in their own homes. I have also developed a framework for numerically quantifying how they use their prosthesis and upper limb.
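As a rough illustration of what "numerically quantifying" means here, the sketch below shows the kind of per-event annotation record and summary statistic such a framework might produce. The field names and categories are hypothetical and chosen only to convey the idea; the actual coding scheme is described in the associated papers.

```python
# Hypothetical sketch of an annotated grasp event from the video footage;
# field names and categories are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class GraspEvent:
    start_s: float   # time into the recording when the grasp begins
    end_s: float     # time when the object is released
    limb: str        # "prosthetic" or "anatomical"
    grasp_type: str  # e.g. "power", "precision", "non-prehensile"
    task: str        # the domestic task being performed

def prosthesis_usage_fraction(events: list[GraspEvent]) -> float:
    """Fraction of total grasping time performed with the prosthetic limb."""
    total = sum(e.end_s - e.start_s for e in events)
    prosthetic = sum(e.end_s - e.start_s for e in events if e.limb == "prosthetic")
    return prosthetic / total if total else 0.0
```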

Single Grasp Tactile Object Identification for Simple Robotic Hands

Tactile sensing combined with proprioception enables humans to effortlessly identify various characteristics of objects by holding and manipulating them. Various groups have developed complex robotic hand sensor suites and exploratory algorithms that attempt to replicate such functionality, but these have impractical exploration time requirements (several minutes) or hardware costs (>$10K). My colleagues and I developed a $500 robotic hand with simple haptic sensors and a combination of machine learning and parametric algorithms. This system can identify an object and report its shape and stiffness properties while completing a single functional grasp. Click the image for a video.
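To give a flavour of the general approach, the sketch below combines a learned classifier for object identity with a simple parametric estimate of stiffness from the force-displacement relationship. It is a minimal illustration under my own assumptions (feature choices, classifier type and placeholder data), not the published pipeline.

```python
# Illustrative sketch only: feature names, classifier choice and data here
# are assumptions; the published system differs in detail.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def grasp_features(joint_angles, motor_current, contact_signals):
    """Summarise one grasp as a fixed-length feature vector."""
    return np.concatenate([
        np.ravel(joint_angles)[-3:],      # final finger configuration
        [np.max(motor_current)],          # peak squeezing effort
        [np.mean(contact_signals)],       # average tactile response
    ])

def estimate_stiffness(displacement, force):
    """Parametric stiffness estimate: slope of a linear force-displacement fit."""
    slope, _ = np.polyfit(displacement, force, 1)
    return slope

# Train an object classifier on labelled example grasps (placeholder data).
X_train = np.random.rand(20, 5)
y_train = np.random.randint(0, 4, 20)
clf = RandomForestClassifier(n_estimators=50).fit(X_train, y_train)

# Classify a new grasp from its feature vector.
new_grasp = grasp_features(np.random.rand(3), np.random.rand(100), np.random.rand(100))
print("predicted object:", clf.predict(new_grasp.reshape(1, -1))[0])
```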

Robotic Gripper Design

I have either directly designed or contributed to the design of several robotic end effectors (grippers). The open-source M2 (multi-modal) gripper and the VF gripper (whose first publication is currently under review) were completed at Yale and are based on the OpenHand architecture. A 'palpating gripper' design was based on my observations of the hand motions performed by surgeons as they palpate (manipulate) simulated tissue to glean structural information.


Multi-DOF Tactile Interface for Suture Tension Perception in Tele-operated Surgical Robots

Surgical robots such as the da Vinci Surgical System provide numerous benefits to surgeons; however, the lack of haptic (touch) feedback is known to cause a variety of issues during surgical procedures. Applying an incorrect amount of tension to sutures, for example, can lead to serious complications for the patient.

In this work I investigated various methods of providing intuitive suture tension information to users without destabilizing the high-gain, closed-loop robotic motion control system (which would lead to oscillations). Psychophysical methods were used to compare interface performance.

Remote Haptic Sensing of Endowrist Surgical Tools

The da Vinci Surgical System is a tele-operated robot that makes use of interchangeable dexterous tools with articulating wrist mechanisms. The disposable nature of these tools, combined with the need for frequent high-heat sterilization, makes it difficult to equip them with haptic sensing. This lack of haptic sensing and feedback is a major drawback of current surgical robots.

I developed a compact and affordable haptic sensor interface that attaches to the actuators of the tool and therefore remains outside the patient's body. This avoids the need for intense sterilization while permitting haptic sensing and feedback from the tool.

Virtual Prototyping of Experimental Laparoscopic Surgical Tools for Proof-of-Concept User Testing

Minimally invasive (keyhole) surgery has many benefits for the patient, such as reduced scarring and recovery time, but the long, thin tools are difficult for a surgeon to use and take years to master. Actively inverting surgical tools could potentially negate a major source of motion difficulty (the fulcrum effect, whereby the tool tip moves opposite to the surgeon's hand); however, such tools are mechanically challenging to fabricate.

To investigate the potential of inverting tools without dealing with these mechanical overheads, I created a user interface that coupled inverting-tool graphics with realistic haptic sensations. A study compared novice user performance with standard and inverting tools over a number of weeks.
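The motion problem being addressed can be sketched with a simple planar model of a tool pivoting about the incision (trocar) point, as below. This is only an illustration under my own simplifying assumptions, not the simulation or interface used in the study.

```python
# Simplified planar fulcrum model; lengths and the inversion behaviour are
# illustrative assumptions, not the actual virtual prototype.
import numpy as np

def tip_motion_standard(hand_motion_xy, inside_len=0.25, outside_len=0.15):
    """Standard laparoscopic tool: the fulcrum mirrors and scales hand motion."""
    scale = inside_len / outside_len
    return -scale * np.asarray(hand_motion_xy)   # tip moves opposite to the hand

def tip_motion_inverting(hand_motion_xy, inside_len=0.25, outside_len=0.15):
    """An actively inverting tool re-reverses the motion so the tip follows the hand."""
    return -tip_motion_standard(hand_motion_xy, inside_len, outside_len)

print(tip_motion_standard([0.01, 0.0]))    # hand moves right -> tip moves left (counter-intuitive)
print(tip_motion_inverting([0.01, 0.0]))   # hand moves right -> tip moves right (intuitive)
```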

Haptic Object Avoidance Tools for the Study of Perception

This line of devices began with my undergraduate major project, 'the Haptic Torch' (2004), a device designed to aid obstacle detection for vision-impaired persons. A major feature of that device was the novel haptic interface, which I later discovered fits the classification of 'skin-stretch' feedback.

Since then a variety of 'torch' devices have been created and adopted for perception research across the world. The latest iteration, the MinET, is shown in the image.

Biologically Inspired Control Strategies for Synthesizing Human-like Movement in Robots

Humanoid robots are a growing area of robotics, with the potential to work alongside people in the unstructured environments of the 'real world'. Though much effort has been put into developing humanoid robot hardware, the control strategies of these systems are often based on techniques used in industrial assembly robots, which are inappropriate due to their unnatural and unpredictable trajectories combined with rigid, high-gain joint control.

My PhD involved developing non-linear control-based models of human motion that could be implemented on humanoid robots. My thesis (with some additions) was recently published as a book by Springer.
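For context on what "human-like" movement means here, the classic minimum-jerk profile (Flash & Hogan, 1985) reproduces the smooth, bell-shaped velocity curves seen in human reaching. The sketch below shows that standard model purely as an illustration; it is not the specific non-linear controller developed in the thesis.

```python
# Minimum-jerk reach profile: a standard model of human-like point-to-point
# motion from the literature, shown only for illustration.
import numpy as np

def minimum_jerk(x0, xf, duration, t):
    """Position along a minimum-jerk reach from x0 to xf at time t."""
    tau = np.clip(t / duration, 0.0, 1.0)
    return x0 + (xf - x0) * (10 * tau**3 - 15 * tau**4 + 6 * tau**5)

t = np.linspace(0.0, 1.0, 11)
print(minimum_jerk(0.0, 0.3, 1.0, t))   # smooth ramp with zero start/end velocity
```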