Project Summary
Note: See the post-GSF paper here. It contains further information and data about the feedback algorithm used in conjunction with the LVQ filter (not included on these pages due to length restrictions). Caution: extreme verbosity ahead.
----------------------------------------------------------------------------------------------------------------------------------------
 
Every year, amputees travel thousands of miles in an effort to find a replacement for the irreplaceable. As any doctor knows, the arm and the leg are two amazingly complex pieces of machinery that most of us take for granted. Because of that complexity, current prosthetic technology cannot match the dexterity and maneuverability of the original limb and often takes a "Captain Hook" approach instead; in many cases the patient simply discards the prosthesis and learns to function with the remaining limbs. To provide a sufficient limb replacement, the method of interfacing brain and machine must reach farther than analyzing nerve impulses; the most effective solution would interface directly with the brain itself, which is possible today using electroencephalography (EEG).

Through extensive research, I found that brainwave data from an EEG passes through several stages (signal processing, data classification, and finally user training) before it can be used in a brain-machine interface. However, I also learned that current EEG signal-processing programs still struggle with the age-old problems of noise filtering, artifact extraction, and pattern-matching accuracy. These hindrances make the application of EEG technology to prosthetics wholly impractical. From here, I wondered whether a program using both established and novel algorithms could filter, parse, and detect patterns in incoming EEG signals more efficiently and accurately than current programs.
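
To make the filtering stage concrete, here is a minimal Python sketch of the kind of band-pass noise filter such a pipeline applies before classification. The 512 Hz sampling rate and the 8-30 Hz mu/beta band are illustrative assumptions, not the exact parameters of my OpenViBE scenarios.

```python
# Minimal sketch of the noise-filtering stage of an EEG pipeline.
# The 512 Hz sampling rate and the 8-30 Hz (mu/beta) band are illustrative
# assumptions, not the parameters used in the actual OpenViBE scenarios.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 512            # assumed sampling rate (Hz)
LOW, HIGH = 8, 30   # assumed band of interest for motor imagery (Hz)

def bandpass(raw_channel, fs=FS, low=LOW, high=HIGH, order=4):
    """Zero-phase band-pass filter for one raw EEG channel."""
    nyquist = fs / 2.0
    b, a = butter(order, [low / nyquist, high / nyquist], btype="band")
    return filtfilt(b, a, raw_channel)

# Example: filter one second of simulated raw EEG (random noise stands in
# for a real recording here).
raw = np.random.randn(FS)
clean = bandpass(raw)
```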

I created two custom scenarios in OpenViBE: one for the creation of training data and one for the active processing and filtering of incoming EEG signals. I then ran several prerecorded EEG datasets through the program to measure the overall accuracy and the time needed to create training configurations. Additionally, to measure the real-world applicability and robustness of my program, I fabricated a robotic arm from off-the-shelf materials and tested my scenarios with live EEG streams from my own brain, captured with a commercially available EEG headset (a sketch of one possible arm interface follows the variable list below). For both the simulation and the live tests, I used the following variables:

Independent variables (both live and simulation tests): Scenarios used for training and processing
Control group: Default BCI imagery scenarios provided with OpenViBE
Experimental group: Custom training/processing/pattern-detection scenarios

Dependent variable (simulation testing): Number of trained patterns successfully extracted
Dependent variable (live EEG testing): Number of successful, desired movements of the robotic limb
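
For the live tests, a classified pattern has to be translated into a movement command for the robotic arm. The sketch below shows one way this could be wired up; the serial port, baud rate, and one-byte command protocol are hypothetical and stand in for the actual arm interface, which is described in the full paper rather than here.

```python
# Hypothetical sketch of driving the robotic arm from a detected EEG pattern.
# The serial port, baud rate, and one-byte command protocol are illustrative
# assumptions; the real arm firmware and wiring are not specified here.
import serial

COMMANDS = {"left_hand": b"L", "right_hand": b"R", "rest": b"S"}

def send_movement(pattern_label, port="/dev/ttyUSB0", baud=9600):
    """Map a classified EEG pattern label to a single-byte arm command."""
    with serial.Serial(port, baud, timeout=1) as arm:
        arm.write(COMMANDS.get(pattern_label, b"S"))

# Example: a detected "right_hand" pattern would send b"R" to the arm.
# send_movement("right_hand")
```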

After analyzing the accuracy rates of both the software simulation and the live EEG tests, I found that if the original scenario is modified so that Linear Discriminant Analysis is implemented for signal filtration and the Vector Quantization method of signal epoching (using a predetermined sampling rate) is incorporated, then the accuracy of positive pattern detection rises to about 91.35% during software simulation and 78.32% during real-time "live" use.
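
As a rough illustration of the classification step referred to above (not the actual OpenViBE configuration), the sketch below epochs a filtered channel into one-second windows, extracts a log band-power feature, and trains a Linear Discriminant Analysis classifier. The epoch length, the feature choice, and the synthetic data are assumptions made only for the example.

```python
# Rough sketch of an LDA classification stage on epoched EEG features.
# Epoch length, the band-power feature, and the synthetic labels are
# illustrative assumptions, not the parameters of the actual scenarios.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 512        # assumed sampling rate (Hz)
EPOCH = FS      # one-second epochs

def band_power_features(signal, epoch_len=EPOCH):
    """Split a filtered channel into epochs; use log band power as the feature."""
    n_epochs = len(signal) // epoch_len
    epochs = signal[: n_epochs * epoch_len].reshape(n_epochs, epoch_len)
    return np.log(np.mean(epochs ** 2, axis=1, keepdims=True))

# Synthetic stand-in for two motor-imagery classes with different power levels.
rng = np.random.default_rng(0)
class_a = band_power_features(rng.normal(0, 1.0, 60 * FS))
class_b = band_power_features(rng.normal(0, 1.6, 60 * FS))
X = np.vstack([class_a, class_b])
y = np.array([0] * len(class_a) + [1] * len(class_b))

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
```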
 

EEG & Prosthetics - GSF 2011

 
(A transcription is available under the description of the video here: http://www.youtube.com/watch?v=lCqtJV36bz4)
Appropriate citations appear in the Works Cited.
 

