This project involved the design and development of a custom robotic arm with real-time gesture recognition capabilities. I worked in a team of three people to program a Tiva C Series microcontroller (TM4C123GH6PM) to control the arm and implement machine learning for gesture classification.
Some key aspects:
Hardware Integration: The microcontroller's on-chip peripherals, including UART, PWM, ADC, and μDMA, were configured together to achieve precise control over the robotic arm's movements. (A minimal PWM setup sketch appears after this list.)
EMG Sensor Integration: The system used EMG sensors attached to a test subject's arm. These sensors captured the electrical signals generated by muscle activity during four hand gestures: closed fist, "call me" sign, peace sign, and open palm.
Data Acquisition and Processing: The captured EMG signals were fed to the microcontroller, where they were digitized and buffered for feature extraction and classification. (A minimal acquisition sketch appears after this list.)
Machine Learning for Gesture Recognition: MATLAB's Classification Learner was used to train a model on the collected EMG data, enabling the system to differentiate between the hand gestures based on the distinct muscle-activity pattern associated with each. The model uses Linear Discriminant Analysis (LDA), a linear classification algorithm. (A sketch of evaluating the trained discriminants on the microcontroller appears after this list.)
Robotic Arm Control: The classified gesture then drove the robotic arm's movements in real time, with the arm mimicking the performed hand gesture. (A hypothetical gesture-to-pose dispatch is sketched after this list.)
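To give a flavor of the peripheral setup involved, here is a minimal sketch of driving one arm servo with PWM using TI's TivaWare driverlib. The pin (PB6), PWM generator, clock settings, and angle mapping are illustrative assumptions, not the project's exact configuration:

```c
#include <stdint.h>
#include <stdbool.h>
#include "inc/hw_memmap.h"
#include "driverlib/sysctl.h"
#include "driverlib/gpio.h"
#include "driverlib/pin_map.h"
#include "driverlib/pwm.h"

#define PWM_FREQ_HZ   50      /* standard hobby-servo frame rate   */
#define PWM_CLOCK_HZ  625000  /* 40 MHz system clock / 64 divider  */

void servo_pwm_init(void)
{
    /* 40 MHz system clock; PWM clock = 40 MHz / 64 = 625 kHz */
    SysCtlClockSet(SYSCTL_SYSDIV_5 | SYSCTL_USE_PLL |
                   SYSCTL_XTAL_16MHZ | SYSCTL_OSC_MAIN);
    SysCtlPWMClockSet(SYSCTL_PWMDIV_64);

    SysCtlPeripheralEnable(SYSCTL_PERIPH_PWM0);
    SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOB);

    /* Route PWM module 0, output 0 to pin PB6 */
    GPIOPinConfigure(GPIO_PB6_M0PWM0);
    GPIOPinTypePWM(GPIO_PORTB_BASE, GPIO_PIN_6);

    /* 50 Hz frame: 625000 / 50 = 12500 ticks per period */
    PWMGenConfigure(PWM0_BASE, PWM_GEN_0,
                    PWM_GEN_MODE_DOWN | PWM_GEN_MODE_NO_SYNC);
    PWMGenPeriodSet(PWM0_BASE, PWM_GEN_0, PWM_CLOCK_HZ / PWM_FREQ_HZ);
    PWMGenEnable(PWM0_BASE, PWM_GEN_0);
    PWMOutputState(PWM0_BASE, PWM_OUT_0_BIT, true);
}

/* Set servo angle (0-180 deg) by mapping to a 1.0-2.0 ms pulse width:
 * at a 625 kHz PWM clock, 1 ms = 625 ticks and 2 ms = 1250 ticks. */
void servo_set_angle(uint32_t degrees)
{
    uint32_t ticks = 625 + (degrees * 625) / 180;
    PWMPulseWidthSet(PWM0_BASE, PWM_OUT_0, ticks);
}
```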
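On the acquisition side, a simple processor-triggered ADC read might look like the sketch below. The real system used μDMA to move samples without CPU involvement; that is omitted here for brevity, and the channel choice (AIN0 on PE3) is an assumption:

```c
#include <stdint.h>
#include <stdbool.h>
#include "inc/hw_memmap.h"
#include "driverlib/sysctl.h"
#include "driverlib/gpio.h"
#include "driverlib/adc.h"

void emg_adc_init(void)
{
    SysCtlPeripheralEnable(SYSCTL_PERIPH_ADC0);
    SysCtlPeripheralEnable(SYSCTL_PERIPH_GPIOE);

    /* PE3 = AIN0: analog input from the EMG sensor's output stage */
    GPIOPinTypeADC(GPIO_PORTE_BASE, GPIO_PIN_3);

    /* Sample sequencer 3 holds one sample, triggered by software */
    ADCSequenceConfigure(ADC0_BASE, 3, ADC_TRIGGER_PROCESSOR, 0);
    ADCSequenceStepConfigure(ADC0_BASE, 3, 0,
                             ADC_CTL_CH0 | ADC_CTL_IE | ADC_CTL_END);
    ADCSequenceEnable(ADC0_BASE, 3);
}

/* Blocking read of one 12-bit EMG sample (0-4095). */
uint32_t emg_adc_read(void)
{
    uint32_t sample;

    ADCProcessorTrigger(ADC0_BASE, 3);
    while (!ADCIntStatus(ADC0_BASE, 3, false)) {
        /* wait for the conversion to complete */
    }
    ADCIntClear(ADC0_BASE, 3);
    ADCSequenceDataGet(ADC0_BASE, 3, &sample);
    return sample;
}
```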
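Although the LDA model was trained offline in MATLAB's Classification Learner, its linear discriminants are cheap to evaluate on the microcontroller: each gesture class k receives a score w_k·x + b_k over a feature vector x, and the highest-scoring class wins. The feature count and coefficient arrays below are hypothetical placeholders, not the project's trained values:

```c
#define NUM_CLASSES   4   /* fist, "call me", peace sign, open palm    */
#define NUM_FEATURES  2   /* e.g., mean absolute value per EMG channel */

/* Placeholder coefficients: in practice these would be exported from
 * the LDA model trained in MATLAB's Classification Learner. */
static const float lda_weights[NUM_CLASSES][NUM_FEATURES] = {
    { 0.9f, -0.2f }, { -0.4f, 1.1f }, { 0.3f, 0.7f }, { -0.8f, -0.5f }
};
static const float lda_bias[NUM_CLASSES] = { -0.1f, 0.2f, -0.3f, 0.4f };

/* Return the class whose linear discriminant score w_k . x + b_k
 * is largest for the given feature vector. */
int lda_classify(const float features[NUM_FEATURES])
{
    int best_class = 0;
    float best_score = -1e30f;

    for (int k = 0; k < NUM_CLASSES; k++) {
        float score = lda_bias[k];
        for (int j = 0; j < NUM_FEATURES; j++) {
            score += lda_weights[k][j] * features[j];
        }
        if (score > best_score) {
            best_score = score;
            best_class = k;
        }
    }
    return best_class;
}
```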
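Finally, the classified gesture index can be dispatched to an arm pose, reusing the hypothetical servo_set_angle() helper from the PWM sketch above; the angles here are placeholders, whereas the actual project moved the arm to mimic each gesture:

```c
#include <stdint.h>

void servo_set_angle(uint32_t degrees);  /* from the PWM sketch above */

/* Map each recognized gesture class to an illustrative servo pose. */
void arm_apply_gesture(int gesture)
{
    switch (gesture) {
    case 0: servo_set_angle(0);   break;  /* closed fist -> gripper shut */
    case 1: servo_set_angle(45);  break;  /* "call me" sign              */
    case 2: servo_set_angle(90);  break;  /* peace sign                  */
    case 3: servo_set_angle(180); break;  /* open palm -> gripper open   */
    default: break;                       /* unknown class: hold pose    */
    }
}
```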
This project was completed for ENGR 844: Embedded Systems and demonstrates skills in:
Microcontroller programming in C.
Hardware interfacing and control.
Machine learning concepts and implementation.
Project design and execution, including integrating various components into a functional system.
A short demo of this project can be seen in the video below: