The goal of this project is to enable unique gesture recognition for people with limited control of their motion. Most gesture recognition software focuses on the user replicating a predefined gesture, such as a swipe or a sign-language sign. This is not practical for the students that I work with at Beaumont College. We intend to recognise whatever motion a student can comfortably make and then use this to trigger a control.
The photo below shows preliminary testing at Beaumont. The pouch on the student's wrist contains a microcontroller board and an XBee module for wireless data transmission. An accelerometer is attached to the back of the hand. As the student moves, data is collected by the microcontroller and sent to a laptop for analysis. Eventually we hope to do all of the processing and gesture recognition with the microcontroller.
Initial research used the Leap Motion. We found that the space in which participants could interact with the Leap Motion was too limited for our user group. The code developed for recording and matching gestures will be tested with other technologies though, so the time was not wasted.
To continue with this research, we will use accelerometers to measure hand motion. Initially we will process and pattern match this data on a laptop to recognise a student's gestures.
I display real-time accelerometer data from sensors on an MPU-6050 board connected to a Pyboard. The user interface is written in PySide and uses pyqtgraph to display the accelerometer data for the x, y and z axes. I set up two-way communication with the Pyboard and can set the accelerometer sampling frequency from the interface. The Pyboard is programmed with MicroPython, so the entire toolchain from the hardware to the user interface is Python 3. This code is built on the work of other programmers, who kindly put their code online. The YouTube video below shows a recording of an early version of the interface, with data from the accelerometer being displayed in real time and the sample rate being changed. As work progresses I'll update this site and the video.
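To give a flavour of the host side, here is a minimal sketch of two pieces such an interface needs: a rolling buffer per axis to feed the plot, and a command to set the sample rate. The newline-terminated `RATE <hz>` wire format and the buffer size are assumptions for illustration, not the actual firmware protocol.

```python
from collections import deque

WINDOW = 200  # number of samples visible in the plot at once

# One rolling buffer per axis; on each GUI timer tick these would be
# handed to pyqtgraph via PlotDataItem.setData().
buffers = {axis: deque(maxlen=WINDOW) for axis in "xyz"}

def push_sample(x, y, z):
    """Append one accelerometer sample to the plot buffers."""
    for axis, value in zip("xyz", (x, y, z)):
        buffers[axis].append(value)

def make_rate_command(hz):
    """Encode a sample-rate request to send to the board.

    Requests here are newline-terminated 'RATE <hz>' strings -- a
    hypothetical protocol for this sketch; the real firmware command
    may well differ.
    """
    if not 1 <= hz <= 1000:
        raise ValueError("sample rate out of range")
    return "RATE {}\n".format(hz).encode("ascii")
```

With a Qt timer firing at the plot refresh rate, each tick would drain whatever samples have arrived on the serial link into `push_sample` and redraw the three curves.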
I will make all of the code available once the project is finished. I am using an Assembla repository to store the code and would encourage any programmer to set up a git repository. You might think that you have adequate backups of your code...
A number of smartwatches with built-in accelerometers, such as the Pebble, were trialled. These are designed to pair with a smartphone using Bluetooth Low Energy (BLE) and to send and receive data periodically, not to constantly stream accelerometer data to a PC, which is what is required for the initial development work. It would have been nice to get something working reliably with this platform, as the students at Beaumont would be quite happy to wear the latest smartwatch. The only smartwatch I found with a stable and reliable link to a PC was Texas Instruments' EZ430-Chronos. This comes with its own receiver dongle, so there is no issue in setting up a reliable link between the watch and a PC. However, the data sampling rate is limited.
I found a few Sparkfun WiTilt accelerometer and gyroscope sensor boards in the lab, left over from a long-dead project on tracking people indoors. These are well-designed boards with both a wired and an old-school Bluetooth interface. Using the Bluetooth interface is a pain, as the device has to be reconnected for each iteration of code. Using the wired serial interface allows for faster iterations, as there is no reconnection to do each time the software is changed. I got this streaming accelerometer data to my laptop. However, this device is no longer manufactured; I emailed Sparkfun, who said they had no plans to make any more. So I started to look at what we can get off the shelf now. Using hardware that is still in production allows others to easily replicate and improve anything that I come up with.
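Reading such a stream boils down to parsing newline-separated ASCII samples; the `x,y,z` line format here is an assumption for illustration. A minimal, defensive reader, shown against an in-memory stream (with real hardware the stream would be a pyserial `serial.Serial` port instead):

```python
import io

def read_samples(stream):
    """Yield (x, y, z) tuples from newline-separated 'x,y,z' lines.

    Malformed lines (serial glitches, partial reads) are skipped
    rather than crashing the reader.
    """
    for line in stream:
        parts = line.strip().split(",")
        if len(parts) != 3:
            continue
        try:
            yield tuple(float(p) for p in parts)
        except ValueError:
            continue

# Simulated capture; note the corrupt middle line is dropped.
capture = io.StringIO("0.01,-0.02,0.98\n?garbled?\n0.03,0.01,1.01\n")
samples = list(read_samples(capture))
```

Because the reader takes any iterable of lines, the same function works unchanged on a live serial port or on a recorded capture file, which makes it easy to test offline.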
The Pyboard caught my eye. This runs MicroPython, which allows me to program the board using a version of the Python programming language. As this is the language that I use for my gesture recognition code, I figured this gives me a chance to eventually have all of the pattern matching done on the board. Initially I will take the accelerometer data from the board in real time and process it on a laptop. Having the pattern recognition done on the accelerometer hardware will make for a better device that does not need to be constantly paired with a PC. The board will chug away on its own and, when it recognises a gesture, send out a signal. That's the plan anyway.
I'd heard about MicroPython for a while, but I was brought up on the ethos that with firmware 'if you can't do it in C, do it in assembler. If you can't do it in assembler, it is not worth doing'. Then I listened to a podcast on MicroPython here and figured it was about time I stopped being such a curmudgeon. There are two types of fool. One says 'this is old and therefore good' and the other says 'this is new and therefore better.' With hardware design, I get to be both at the same time.
So far I have interfaced the Pyboard with an MPU-6050 accelerometer/gyroscope board. If you want to get one of these, look on eBay, where you will find these boards for a few pounds. I modified code from this project on Hackaday, which is the site for the discerning electronics enthusiast. I am streaming data from the sensors to my laptop. I need to add some error checking to flag missing data samples and compensate for them, and to check that the sampling rate is correct. Then I'll write some unit tests, to avoid being a hardware design hypocrite.
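One way to flag and compensate for dropped samples is to have the streaming firmware tag each sample with a sequence counter (an assumption here -- the MPU-6050 itself does not number its samples, so the Pyboard code would have to add one) and interpolate across any gaps on the laptop side. A sketch:

```python
def fill_gaps(samples):
    """Detect missing samples via a sequence counter and fill gaps
    by linear interpolation.

    `samples` is a list of (seq, value) pairs in arrival order, where
    `seq` is an incrementing counter attached by the firmware.
    Returns a list with one entry per expected sequence number.
    """
    filled = [samples[0]]
    for (s0, v0), (s1, v1) in zip(samples, samples[1:]):
        gap = s1 - s0
        for k in range(1, gap):
            # Interpolated stand-in for a dropped sample.
            filled.append((s0 + k, v0 + (v1 - v0) * k / gap))
        filled.append((s1, v1))
    return filled
```

Comparing the counter rate against wall-clock time on the laptop would also give a cheap check that the requested sampling frequency is actually being achieved.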
As with any new platform, I encountered the usual World of Pain. I managed to install a MicroPython package over one of the regular Python packages on my laptop. I never did figure out how to fix this. As luck would have it, I had a Clonezilla image from the night before, which only took 20 nail-biting minutes to load. Matt's top tip - use Clonezilla and use it often!