Using the Leap Motion to enhance software accessibility

A big thanks to Leap Motion, who sent me a development kit prior to the commercial release of the units, allowing me to get started with this project. I can see that porting some of my Kinect work to the Leap will make for an easier real-world implementation of some of my disability control ideas. Below is a picture of Gregor testing out the Leap at Beaumont College.


Custom gesture recognition

We're after custom gesture recognition. Each student's abilities allow a different set of gestures, and these are what we'd like to use as virtual switches. The Leap does have a few 'standard' gestures built into the API, but these are of little use to the user group at Beaumont. Pattern matching a custom 3D gesture using some kind of 3D convolution algorithm would cause my little brain to explode, so instead I'm projecting the 3D motion onto 2D planes and performing gesture recognition on each of those planes, looking at the directional components of the hand motion. I got this idea from Al Sweigart's moosegesture project and shamelessly lifted his code. The picture below shows some of what I've got running, using VPython for the graphics. The ellipsoid represents the hand. I've put an arrow at the front (just visible) and an arrow pointing down from the palm's position, so I can see that the software has the hand oriented correctly - the Leap sometimes forgets which way around it is! The yellow trail shows where the hand has moved. I've projected the hand's motion onto the walls, which shows up as orange curves. Only the last half second or so of the hand's motion is shown.
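Roughly, the matching boils down to turning a 2D path into a string of compass directions and comparing that with a stored template. Here's a minimal sketch of the idea (not my actual code - the function names, thresholds and the example circle gesture are just for illustration):

```python
import math

# Sketch of moosegesture-style matching: project the Leap's 3D palm
# positions onto a 2D plane, turn the path into 8-way direction strokes,
# then compare against a stored template gesture.

DIRECTIONS = ['R', 'UR', 'U', 'UL', 'L', 'DL', 'D', 'DR']

def project_to_front_wall(points_3d):
    """Drop the depth component to get the path as seen on the front wall."""
    return [(x, y) for (x, y, z) in points_3d]

def stroke_direction(p1, p2):
    """Classify the segment p1 -> p2 as one of eight compass directions."""
    angle = math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0])) % 360
    return DIRECTIONS[int((angle + 22.5) // 45) % 8]

def to_strokes(points_2d, min_len=10.0):
    """Reduce a 2D path to a de-duplicated list of direction strokes,
    ignoring segments shorter than min_len (mm) to suppress jitter."""
    strokes = []
    for p1, p2 in zip(points_2d, points_2d[1:]):
        if math.hypot(p2[0] - p1[0], p2[1] - p1[1]) < min_len:
            continue
        direction = stroke_direction(p1, p2)
        if not strokes or strokes[-1] != direction:
            strokes.append(direction)
    return strokes

# A stored 'virtual switch' gesture is then just a stroke sequence,
# e.g. a rough anticlockwise circle:
CIRCLE = ['U', 'UL', 'L', 'DL', 'D', 'DR', 'R', 'UR']

def matches(strokes, template):
    """Crude match: does the template appear, in order, within the strokes?"""
    it = iter(strokes)
    return all(s in it for s in template)
```

Running something like this separately on each projected wall means a gesture only has to be distinctive in one plane, which keeps my brain intact.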

The biggest increase to my productivity has come from moving to Python, using Eclipse. I'm not a trained programmer and the error messages from Visual Studio were bamboozling me. Each to his own; I just find that I understand what's going on with Python and Eclipse. It's not all plain sailing, though. The latest release of VPython is built on wxPython, so I had problems trying to get a wxPython GUI going alongside the VPython window. In the end I embedded the wx display into a VPython window; that took a little hammering out, as the display hierarchy isn't quite the same with VPython as with wxPython, but it all ran in the end. I was stubborn and wanted to use sizers in my display, rather than hard coding widget positions, which went out of fashion around 1988.
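For anyone fighting the same battle, this is roughly the shape my set-up ended up with, assuming VPython 6's window/display/panel arrangement - the widget names and sizes here are placeholders rather than my real layout:

```python
import wx
from visual import window, display, sphere  # VPython 6, the wx-based release

# The VPython window is created first, then wx widgets are hung off the
# panel it exposes. The display keeps its own fixed x/y inside the panel,
# which is where the 'hammering out' with the sizers came in.
w = window(width=800, height=600, menus=False, title='Leap gestures')
scene = display(window=w, x=300, y=10, width=480, height=560)
hand = sphere(display=scene, radius=0.5)   # stand-in for the hand ellipsoid

panel = w.panel                            # the wx panel behind the window
sizer = wx.BoxSizer(wx.VERTICAL)
sizer.Add(wx.StaticText(panel, label='Gesture controls'), 0, wx.ALL, 8)
sizer.Add(wx.Button(panel, label='Record gesture'), 0, wx.ALL, 8)
panel.SetSizer(sizer)                      # the sizer manages the widgets,
panel.Layout()                             # the display keeps its own position
```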

Virtual buttons

Here's a picture showing one of the interface ideas using the Leap - the hand scales with distance to give a 3D feel. This is coded in C#, using Visual Studio. The oval buttons are push buttons - the user hovers over them and 'pushes' them to activate. The round buttons are simple 2D hover buttons, which I've set to have a half-second delay. I'm using the Coding4Fun Kinect hoverButton as it has a good visual interaction. I found that I needed to smooth the hand motion as it got a little 'jittery' at times. The picture below shows a push button being activated - the orange bar shows how far the user has 'pushed' the button. The buttons can be located and sized to suit the user's range of motion and motor control. When a button is activated the program spoofs a keyboard key press, allowing it to control other software.
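The real interface is written in C#, but the two bits that mattered - the smoothing and the push-depth measurement - are simple enough to sketch here in Python; the smoothing factor and distances are made-up values, not the ones I settled on:

```python
class SmoothedHand:
    """Exponential smoothing of the palm position to tame the jitter.
    The alpha here is illustrative, not the value I settled on."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.pos = None

    def update(self, raw_xyz):
        if self.pos is None:
            self.pos = raw_xyz
        else:
            self.pos = tuple(self.alpha * r + (1 - self.alpha) * p
                             for r, p in zip(raw_xyz, self.pos))
        return self.pos


def push_progress(z_mm, start_mm=150.0, travel_mm=80.0):
    """How far a push button has been 'pushed', from 0.0 to 1.0.
    This is what drives the orange bar; the key press fires at 1.0.
    Distances are placeholder values in Leap millimetres."""
    return max(0.0, min(1.0, (start_mm - z_mm) / travel_mm))
```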


Leap Motion hoverHand demo


The students at Beaumont College use Sensory Software's Grid 2 software. This proved to be the only package that I couldn't control using a software interface from my code! I tried many different methods, even resorting to writing an interface in C. Luckily, Steven Postlethwaite at Beaumont College told me that they had managed to control the Grid using the Makey Makey. A little delving showed that this board is based on one of the newer Arduino circuits. So as a stop-gap I've connected an Arduino Leonardo to the PC, which pretends to be a keyboard. My program sends a character to the board when one of the controls is triggered, and the Arduino then sends a simulated keyboard key press back to the PC. This is a real kludge! I'm using the Freetronics LeoStick, which has a small enough form factor to hide my shame. I'm hoping that Sensory Software will give me some tips on how to get this working entirely in software. Update: Sensory Software have given me some sample code to show me how I should be doing the software interface - thanks Barney! Update 2: I can't get the code to work, so back to the LeoStick...
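The PC end of the kludge is tiny - just push a character down the serial port when a control fires. Something like this (the port name and baud rate are placeholders; on the LeoStick side the sketch simply reads the byte and types it back via the Arduino Keyboard library):

```python
import serial  # pyserial - port name and baud rate below are placeholders

# PC side of the kludge: when a virtual button fires, send one character
# to the LeoStick; its sketch reads the byte and sends it back to the PC
# as a real USB key press.
leostick = serial.Serial('COM5', 9600, timeout=1)

def send_key(char):
    leostick.write(char.encode('ascii'))
```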

Swipe gesture recognition using the Leap


At the suggestion of one of the IT folk at Beaumont, I coded up a simple swipe gesture recognition widget. It can be set to trigger on a swipe over the Leap in any chosen direction. This is shown being tested by Tom at Beaumont in the picture above. I can see that, with some tuning, it could be of use for some of the students there. But I really need to write some custom software to record whatever gesture the student can make - my software needs to fit with the end user's abilities, not the other way around. Testing is invaluable, as it is easy to forget just how difficult it is for some of the students to make simple movements that I take for granted. The picture below shows Jody testing out the Leap with the swipe recognition software. The arrow points in the direction the software thinks you swiped.
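The widget itself doesn't do much more than measure the angle of the swipe and compare it with the angle it has been set to trigger on - roughly along these lines (the tolerance and minimum swipe length here are illustrative, not the tuned values):

```python
import math

def swipe_angle(start, end):
    """Angle of a swipe in the x-y plane, in degrees (0 = right, 90 = up)."""
    return math.degrees(math.atan2(end[1] - start[1], end[0] - start[0])) % 360

def swipe_triggers(start, end, trigger_angle, tolerance=30.0, min_dist=40.0):
    """Fire the virtual switch if the swipe is long enough (mm) and
    roughly in the configured direction. Thresholds are illustrative."""
    if math.hypot(end[0] - start[0], end[1] - start[1]) < min_dist:
        return False
    diff = abs((swipe_angle(start, end) - trigger_angle + 180) % 360 - 180)
    return diff <= tolerance
```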



We also tried out the 'yetanotherleapmouse' demo written by Giancarlo Todon. I'll modify this to work with a restricted range of motion - this could be useful for people with a condition that prevents them from moving their hands over the range normally required to operate a mouse or touch pad, such as motor neurone disease.
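The core of that modification will just be mapping a small calibrated box of hand movement onto the whole screen, something like the sketch below (the box would come from watching what range the user can comfortably manage; the screen size is a placeholder):

```python
def hand_to_screen(x_mm, y_mm, box, screen=(1920, 1080)):
    """Map a small calibrated range of hand movement (box, in Leap mm)
    onto the whole screen, so a few centimetres covers the full desktop.
    box = (x_min, x_max, y_min, y_max) from a per-user calibration step."""
    x_min, x_max, y_min, y_max = box
    fx = min(1.0, max(0.0, (x_mm - x_min) / (x_max - x_min)))
    fy = min(1.0, max(0.0, (y_mm - y_min) / (y_max - y_min)))
    # Leap y increases upwards, screen y increases downwards.
    return int(fx * (screen[0] - 1)), int((1 - fy) * (screen[1] - 1))
```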

Naturally, when I updated the Leap Motion SDK my code stopped working. A little delving and some help from Iris Classon's tutorial showed me where I had to tweak my Visual Studio setup. It turns out my code was fine - my IDE management wasn't! I probably spend as long trying to figure out the error messages from Visual Studio as I do coding.

I'd like to thank the Faculty of Science and Technology at Lancaster University, who have been generous in funding this research, and InfoLab21, who kindly gave me an honorary position, allowing me to procrastinate in comfort, surrounded by the expertise I needed to help my transition from hardware to software.