The chassis of our robot combines 3D-printed parts with the OSEPP tank kit, giving us a stable, reliable platform that can be customized to the needs of each competition. One DC encoder motor powers each tread, enabling the robot to move precise distances across the field.
The Makeblock MegaPi is a main control board based on the Arduino Mega 2560. It works in tandem with the Raspberry Pi over a serial connection, driving the motors or reading sensors whenever the main program requests it. In our build, it is powered by a pack of six AA batteries.
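The Raspberry Pi side of this serial link can be sketched roughly as follows. The `MOV <left> <right>` text protocol and the speed range are hypothetical placeholders for illustration, not our actual firmware's message format; only the command-framing step is shown so the sketch runs without hardware.

```python
def frame_motor_command(left: int, right: int) -> bytes:
    """Encode a hypothetical 'MOV <left> <right>' motor command for the
    MegaPi. Speeds are assumed to be signed 8-bit PWM values."""
    if not (-255 <= left <= 255 and -255 <= right <= 255):
        raise ValueError("motor speed out of range")
    return f"MOV {left} {right}\n".encode()

# On the real robot the framed bytes would be written to the serial port,
# e.g. with pyserial:  serial.Serial("/dev/ttyAMA0", 115200).write(...)
print(frame_motor_command(100, 100))  # drive both treads forward
```

Keeping the framing in a pure function like this makes the protocol easy to unit-test separately from the serial hardware.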
Our robot uses two time-of-flight sensors, two ultrasonic sensors, and two cameras.
Two VL53L0X time-of-flight sensors are mounted on the front of the robot. Each consists primarily of a Class 1 infrared laser emitter and a matching receiver; the sensor measures the laser pulse's "time of flight" to produce an accurate distance reading. They are connected to the MegaPi over I²C.
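The sensor performs this conversion internally and simply reports millimeters, but the underlying principle is a one-line calculation: light travels out to the target and back, so the distance is the speed of light times half the round-trip time.

```python
SPEED_OF_LIGHT_MM_PER_S = 299_792_458_000  # speed of light in mm/s

def tof_distance_mm(round_trip_s: float) -> float:
    """Distance implied by a time-of-flight measurement. The laser pulse
    travels out and back, so the round trip is divided by two."""
    return SPEED_OF_LIGHT_MM_PER_S * round_trip_s / 2

# A 2-nanosecond round trip corresponds to roughly 300 mm:
print(tof_distance_mm(2e-9))
```

The tiny timescales involved are why the VL53L0X needs dedicated timing hardware rather than a microcontroller polling a pin.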
The sensor's datasheet is available from STMicroelectronics.
One ultrasonic sensor is mounted on each side of the robot. These sensors aid in obstacle avoidance: before turning in either direction to bypass an obstacle, the Raspberry Pi tells the MegaPi to read both sensor values. If either side returns a distance too short for the robot to complete a turn, the robot turns in the unobstructed direction instead.
Our robot uses two cameras, each connected to the Raspberry Pi. One is mounted on the front of the robot and is used in the evacuation room, while the other is mounted on the bottom of the robot and is used during line tracing. The latter camera is accompanied by an LED light for better image processing. Our code uses the OpenCV library to analyze each camera's feed, and the algorithm directs the robot's movements accordingly.
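A simplified version of the line-detection step looks like this. Our actual code runs OpenCV on real camera frames; this pure-Python sketch just shows the idea on a toy grayscale image, and the threshold value is an assumption.

```python
def line_steering_error(frame, threshold=80):
    """Horizontal offset of the dark line from the frame's centre column.

    `frame` is a 2-D list of grayscale values (0 = black line, 255 = floor).
    Negative result means the line is left of centre, positive means right,
    and None means the line was lost.
    """
    xs = [x for row in frame for x, v in enumerate(row) if v < threshold]
    if not xs:
        return None
    centroid_x = sum(xs) / len(xs)
    return centroid_x - (len(frame[0]) - 1) / 2

# Toy 3x5 frame with a dark column at x = 3 (centre column is x = 2),
# so the line sits one pixel to the right of centre:
toy = [[255, 255, 255, 0, 255]] * 3
print(line_steering_error(toy))
```

In the real pipeline the same centroid comes from OpenCV's image moments, and the signed error feeds the steering correction for the treads.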
Initially, we planned to use a single front-mounted camera for both line tracing and the evacuation room. However, we found that this setup could not deliver the precision that line tracing requires, for two reasons: the camera's distance from the robot's center of rotation, and its exposure to outside interference and noise. Mounting the bottom camera close to the center of rotation helps the robot stay on top of the line throughout the course, since it makes turns more precise. The robot's body also shields the camera from outside interference such as camera flashes, while the LED provides a steady, bright light for consistent readings.
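The center-of-rotation argument can be quantified: during an in-place turn, a camera mounted a distance r from the rotation center sweeps an arc of length r·θ, so a farther-forward camera sees the line shift much more per degree of turn. The mounting distances below are illustrative numbers, not measurements from our robot.

```python
import math

def camera_sweep_mm(offset_mm: float, turn_deg: float) -> float:
    """Arc length swept by a camera mounted offset_mm from the centre
    of rotation during an in-place turn of turn_deg degrees (r * theta)."""
    return offset_mm * math.radians(turn_deg)

# Illustrative comparison for a 45-degree correction turn:
# a camera 120 mm ahead of the rotation centre sweeps ~94 mm of floor,
# while one 20 mm away sweeps only ~16 mm.
print(camera_sweep_mm(120, 45), camera_sweep_mm(20, 45))
```

The smaller the sweep, the less the line appears to jump in the image during a turn, which is what makes the near-center bottom camera easier to control against.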
To collect the victims in the evacuation room, we decided to create a claw/forklift system for the level-two zone. A rack-and-pinion mechanism is equipped with two long appendages in the shape pictured.
The appendages are shaped so that the ball rolls to their far end once the rack and pinion brings them together. The flared end of each arm stops the ball from rolling too far, keeping the distance between the victim and the body of the robot consistent across trials. As a result, the robot does not have to align itself perfectly with the evacuation zone when depositing the victim; because the ball rolls to the end of the arm, the robot only needs to be near the zone rather than right up against it.
When not in use, this entire system is lifted up over the treads by a conveyor mechanism, keeping the robot compact during the line-tracing portion of the competition.
Unfortunately, due to the ongoing pandemic, we were not able to acquire all of the parts necessary for the construction of this unit.
The 3D models of the rack and pinion system (left) and the arms (right) are pictured below.