Overview
While taking ME 135, we were tasked with creating a system that incorporated real-time processing, multitasking, and a GUI in LabVIEW. My group of three decided to build a 5-DOF robotic arm that could detect and grab blocks.
My personal contributions were manufacturing, electrical component selection and design, and writing the MicroPython code.
The field of robotics naturally lends itself to the implementation of a GUI, real-time programming, and multitasking. The general operation of our robotic claw is as follows: the user turns on the machine, which proceeds to zero itself. The user is then presented with a GUI in LabVIEW that provides several options. From here, the user can manually rotate the claw to reset the search start point or launch the claw into its search mode. Once in search mode, the claw scans its surroundings for objects and, once one is detected, attempts to grab it. The user can then decide to move or drop the object.
The interaction between the GUI and the ESP32 happens through communication between LabVIEW and a MicroPython script loaded onto the ESP32. The script defines various states for the claw, corresponding to the functions described above. The ESP32 checks for inputs and sends outputs to LabVIEW at a sampling time defined in LabVIEW. LabVIEW always sends an input at that sampling rate, and the ESP32 always listens for an input before starting its next loop. However, the claw's state only changes for specific inputs tied to the GUI buttons in LabVIEW. This allows for real-time interrupts and interaction between the host GUI and the claw. For example, if a pause signal is received during an operation, the ESP32 will not complete that operation; instead, it changes state and waits for the next command. Furthermore, while looking for a block, the arm rotates and reads data from the ultrasonic sensor at a specified frequency. If an object is detected by the ultrasonic sensor, the rotation of the arm is immediately interrupted, positioning the claw in front of the object.
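The command-dispatch idea can be sketched in plain Python. The state names and command characters below are illustrative stand-ins, not the actual protocol used on our ESP32; the point is that only recognized commands change state, and the check runs once per loop so a pause takes effect before the next motor action.

```python
# Minimal sketch of the claw's state machine. State names and the
# command mapping are hypothetical; the real script ties specific
# LabVIEW inputs to state changes and drives the motors in each state.

IDLE, SEARCH, GRAB, PAUSE = "IDLE", "SEARCH", "GRAB", "PAUSE"

# Only these commands change state; any other input leaves it unchanged.
COMMANDS = {
    "s": SEARCH,   # start scanning for a block
    "g": GRAB,     # close the claw on a detected object
    "p": PAUSE,    # interrupt the current operation immediately
    "r": IDLE,     # return to idle / reset the search start point
}

def next_state(state, cmd):
    """Return the claw's next state given the latest LabVIEW input."""
    return COMMANDS.get(cmd, state)
```

Because the ESP32 consults this mapping at the top of every loop, an incoming pause command preempts whatever operation was in progress rather than waiting for it to finish.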
After the ESP32 receives an input from LabVIEW to update its state, it finishes the loop before returning to wait for the next input. During this loop, distance is read from the ultrasonic sensor on the claw, the stepper motors and servo motors are actuated, and information is sent back to LabVIEW. LabVIEW simultaneously processes the signals received from the ESP32, including distance information, and processes the images from the camera attached to the claw. This allows the user to see what the claw sees while also quantifying exactly how far away an object is. The data processing and image processing run side by side on the host computer.
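For a typical HC-SR04-style ultrasonic sensor, the distance reading comes from converting the echo pulse width to a range; a minimal sketch of that conversion (the sensor model and the exact timing call on the ESP32 are assumptions on our part):

```python
def echo_to_cm(pulse_us, speed_of_sound=343.0):
    """Convert an ultrasonic echo pulse width (microseconds) to a
    distance in centimeters.

    The echo pulse covers the round trip out to the object and back,
    so the one-way distance is half the total. speed_of_sound is in
    m/s (~343 m/s at room temperature).
    """
    seconds = pulse_us / 1_000_000          # microseconds -> seconds
    round_trip_cm = seconds * speed_of_sound * 100
    return round_trip_cm / 2
```

On the microcontroller the pulse width would come from a timing routine such as MicroPython's `machine.time_pulse_us`; the arithmetic itself is hardware-independent.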
Another feature of our arm is its ability to move dynamically. Our movement script has one stepper motor actuate for a single step, then allows the other stepper to move one step. Done in a loop, the steppers appear to move smoothly and simultaneously, as shown in the video. The elbow servo is also actuated in a stepwise manner, which is uncharacteristic for a servo motor. Actuating our elbow servo like a stepper motor decreases the angular momentum of the arm, which increases stability and positional accuracy.
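The interleaved motion can be sketched as alternating single steps between the two motors. The step callables below are stand-ins for the actual GPIO pulse routines on the ESP32:

```python
def move_interleaved(steps_a, steps_b, step_a, step_b):
    """Interleave single steps between two motors so they appear to
    move simultaneously.

    steps_a/steps_b: total steps each motor must take.
    step_a/step_b: callables that pulse the corresponding motor one step
    (stand-ins here for the real GPIO pulse functions).
    """
    for i in range(max(steps_a, steps_b)):
        if i < steps_a:
            step_a()   # one step on the first motor
        if i < steps_b:
            step_b()   # then one step on the second motor
```

Because each motor advances at most one step before yielding to the other, neither joint races ahead, which is what makes the motion in the video look simultaneous.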