In Week 2 of the Discover Engineering Camp, we explored several different robots: the Zumi bot, an AI car capable of recognizing and navigating around obstacles; the Tamiya tracked vehicle, a robot equipped with ultrasonic sensors and a databot for locating people; and the Redbot, designed for controlling motors and responding to sensor inputs.
Our Zumi is able to recognize a colour and print the colour's name on its screen. It can also drive forward, backward, left, and right, controlled by hand gestures shown to our laptop's camera.
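The gesture control can be thought of as a lookup from a recognized gesture label to a drive command. The sketch below is illustrative only: the gesture names and the string commands are assumptions, standing in for the Zumi library's actual driving methods.

```python
# Hypothetical mapping from gesture labels (as recognized through the
# laptop camera) to drive commands. Labels and commands are assumed names.
GESTURE_COMMANDS = {
    "point_up": "forward",
    "point_down": "backward",
    "point_left": "left",
    "point_right": "right",
}

def command_for_gesture(gesture):
    """Translate a recognized gesture into a drive command.

    Unknown gestures map to "stop" so the robot never moves on a
    misrecognized input.
    """
    return GESTURE_COMMANDS.get(gesture, "stop")
```

Defaulting to "stop" for unrecognized gestures is a simple safety choice: a camera misread leaves the robot stationary rather than driving unpredictably.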
The code for our Zumi is written in Python. It implements the colour detector: whenever the user presses Enter, Zumi prints the name of the colour being held in front of the camera.
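At its core, the colour detector has to turn a camera reading into a colour name. A minimal sketch of that classification step, with made-up RGB thresholds (the real project used the Zumi library's camera and colour-recognition tools), might look like this:

```python
def classify_colour(r, g, b):
    """Very rough colour naming from an RGB reading.

    The thresholds here are illustrative assumptions, not the values
    used on the actual Zumi.
    """
    if r > 150 and g < 100 and b < 100:
        return "red"
    if g > 150 and r < 100 and b < 100:
        return "green"
    if b > 150 and r < 100 and g < 100:
        return "blue"
    return "unknown"
```

In the real program, this kind of classification runs each time the user presses Enter, and the resulting name is printed to Zumi's screen.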
The Redbot is a robot that can follow a line and move based on the direction the line takes. For example, if the line curves right, the robot follows the line and turns right with it. When the robot detects an object in front of it, or when the track ends, it halts. At first we had some issues finding the right speed for the Redbot: it could not travel at a pace that let it follow the line. In the end, however, the Redbot worked.
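The line-following behaviour described above can be sketched as a decision function over the Redbot's line-sensor readings. Everything here is an assumption for illustration: the threshold value, the convention that higher readings mean a darker surface, and the string action names standing in for real motor commands.

```python
def redbot_action(left, centre, right, obstacle, threshold=800):
    """Choose an action from three IR line-sensor readings.

    Assumes readings above `threshold` mean the sensor is over the
    dark line (an illustrative convention, not measured values).
    """
    if obstacle:
        return "stop"                    # object detected in front
    on_line = [v > threshold for v in (left, centre, right)]
    if not any(on_line):
        return "stop"                    # track has ended
    if on_line[0] and not on_line[2]:
        return "turn_left"               # line is curving left
    if on_line[2] and not on_line[0]:
        return "turn_right"              # line is curving right
    return "forward"                     # line is under the centre
```

Tuning the drive speed matters here just as it did for us in practice: if the robot moves faster than this decision loop can react, it overshoots the curve before the turn command takes effect.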
The Tamiya Tracked Vehicle is a robot that can avoid obstacles and locate survivors in a disaster scenario. The robot uses an Arduino, touch sensors, and a databot with an infrared thermometer to locate survivors. Unfortunately, we faced several issues with the motor code, which left us limited time to test the databot with the robot.
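The survivor-detection idea reduces to comparing an infrared temperature reading against the ambient temperature. The sketch below uses assumed values for the ambient baseline and the "warm enough to be a person" offset; the actual databot readings and thresholds would differ.

```python
AMBIENT_C = 22.0      # assumed ambient temperature of the course
BODY_DELTA_C = 8.0    # assumed offset: this far above ambient suggests a person

def survivor_detected(temperature_c):
    """Flag a possible survivor from an infrared-thermometer reading."""
    return temperature_c >= AMBIENT_C + BODY_DELTA_C
```

With more testing time, the threshold would need calibration against real readings, since sunlight, motors, and electronics can all produce warm spots that are not people.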
Our VEX simulation used a VEX robot with three line sensors. The robot follows the line path and stops when it reaches the bar line at the end of the course.
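The stopping condition can be expressed simply: the finishing bar is wide enough that all three line sensors read dark at once, which never happens on the normal single line. This sketch assumes reflectivity-style sensors where lower readings mean a darker surface; the threshold is illustrative.

```python
def at_bar_line(sensors, dark_threshold=60):
    """True when all three line sensors read dark at once.

    On a single line, at most one or two sensors see dark; only the
    wide bar at the end of the course darkens all three (assumed
    convention: lower reading = darker surface).
    """
    return all(reading < dark_threshold for reading in sensors)
```

In the simulation, the drive loop would check this condition each cycle and halt the motors once it returns true.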