We were tasked with creating four unique robots, each completing a task:
1. Zumi AI - navigates a course that simulates exiting a damaged building
2. Tamiya Tracked Vehicle - navigates a course that simulates exiting a damaged building
3. Arduino Based Robot - follows lines and avoids obstacles using a touch sensor as well as a line follower
4. JD Human Robot - locates and identifies victims in a damaged building
ZUMI AI
Zumi is a programmable, self-driving car kit developed to teach foundational concepts in Artificial Intelligence and robotics. It supports both block-based (Blockly) and text-based (Python) coding, enabling users to explore computer vision, machine learning, and autonomous navigation. Equipped with a camera and sensors, Zumi can detect objects, recognize faces and signs, and respond to its surroundings by making decisions such as stopping at a stop sign.
Looping Behavior:
The while loop means the robot will keep running this behavior as long as it's powered on.
Obstacle Detection:
Zumi constantly checks the IR (infrared) sensor readings from the front right and front left.
If both IR values are greater than 390, that likely means the path ahead is clear.
Moving Forward:
If the path is clear, Zumi moves forward for 1 second at a speed of 40.
Random Turn if Blocked:
If the path is not clear, Zumi:
Picks a random number (0 or 1) and stores it in key.
If key = 0, it turns left 140 degrees.
If key = 1, it turns right 140 degrees.
This gives Zumi a kind of “decision-making” behavior when blocked.
Timing:
There's a short 0.1-second wait at the end of each loop so the robot isn't constantly polling the sensors, which smooths its behavior.
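The decision logic above can be sketched as a small Python function. This is a simulation-friendly sketch, not the actual Zumi library API: the function name, the way readings are passed in, and the returned action strings are our assumptions, and on the real robot these decisions would be wired to the Zumi driving calls inside the while loop.

```python
import random

# Readings above this on BOTH front IR sensors mean the path is likely clear
# (threshold taken from our Zumi program).
IR_CLEAR_THRESHOLD = 390

def decide_action(front_left_ir, front_right_ir, key=None):
    """Return Zumi's next move given the two front IR readings.

    `key` is the random 0/1 turn choice; pass it explicitly for testing,
    otherwise it is drawn here, just like in the robot's loop.
    """
    if front_left_ir > IR_CLEAR_THRESHOLD and front_right_ir > IR_CLEAR_THRESHOLD:
        return "forward"  # path clear: drive ahead for 1 s at speed 40
    if key is None:
        key = random.randint(0, 1)
    # Blocked: pick a random direction and turn 140 degrees.
    return "turn_left_140" if key == 0 else "turn_right_140"
```

On the robot, this function's result would be followed by the 0.1-second wait before the loop repeats.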
Video in Action
TAMIYA TRACKED VEHICLE
Tamiya robots are educational kits that teach students about robotics, mechanics, and basic programming through hands-on building. They include different types like mechanical robots, programmable robots with micro:bit coding, and remote-controlled models. The Robocraft series focuses on gear-driven motion, while programmable kits like the Cam-Program and Microcomputer Robots introduce coding and sensors. Most kits are easy to assemble with snap-fit parts and require no soldering, making them beginner-friendly. Overall, Tamiya robots are a helpful tool for learning STEM concepts in a fun and interactive way.
Motor and Gearbox:
The electric motor powers a gearbox, which reduces the speed and increases the torque, turning the drive sprockets.
Drive Sprockets & Tracks:
These sprockets pull the rubber treads (tracks) around the wheels.
The tracks wrap around multiple rollers and idlers, distributing weight evenly and increasing traction.
Battery Pack:
It uses two AA batteries to power the motor.
Chassis Base:
A pre-cut wooden or plastic frame holds all components together.
We built this tracked vehicle to learn the basics of robotics and mechanical engineering, especially how drive systems work. It helps us test how torque, traction, and gear ratios affect movement before applying those ideas to real-world robots. The wide tracks let us simulate how our vehicle would move across rough terrain such as rocks, sand, or mud without sinking. It stays stable and can climb over small obstacles, making it much better than wheels on uneven ground.
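The speed-for-torque trade-off the gearbox makes can be shown with a quick calculation. The numbers below are purely illustrative, not the actual specs of the Tamiya motor or gearbox; the point is that an ideal gearbox divides speed by the gear ratio and multiplies torque by it.

```python
def output_speed_and_torque(motor_rpm, motor_torque, gear_ratio, efficiency=1.0):
    """Ideal gearbox: output speed = input / ratio, output torque = input * ratio.

    `efficiency` (0-1) accounts for friction losses; 1.0 = lossless.
    """
    return motor_rpm / gear_ratio, motor_torque * gear_ratio * efficiency

# Illustrative numbers only (not the kit's real specs): a fast, weak motor
# geared 100:1 becomes slow but strong at the drive sprockets.
rpm, torque = output_speed_and_torque(motor_rpm=12000, motor_torque=0.5,
                                      gear_ratio=100)
```

With these made-up inputs, 12,000 RPM at 0.5 units of torque becomes 120 RPM at 50 units, which is why the tracked vehicle can climb obstacles instead of just spinning fast.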
C++ Code
Video of Tamiya in Action
The Tamiya detects an object in front of it and moves accordingly. It is a very versatile robot: because it is designed like a tank, it can travel across all surfaces while still using its ultrasonic sensors to detect anything standing in its path.
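An ultrasonic sensor works by timing a sound pulse's round trip to the obstacle and back. Here is a small Python sketch of that math; the on-board code would do the same arithmetic, and the 20 cm stop distance is an assumed example value, not taken from our program.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly room temperature

def echo_to_distance_cm(echo_time_s):
    """Convert a round-trip echo time (seconds) to one-way distance in cm.

    The pulse travels to the object AND back, so we halve the distance.
    """
    return echo_time_s * SPEED_OF_SOUND_M_PER_S / 2 * 100

def obstacle_ahead(echo_time_s, stop_distance_cm=20.0):
    """True if the echo indicates something closer than the stop distance."""
    return echo_to_distance_cm(echo_time_s) < stop_distance_cm
```

For example, a 1-millisecond echo corresponds to an object about 17 cm away, close enough that the vehicle should react.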
ARDUINO BASED ROBOT
The RedBot is a robotics platform built around the RedBot Mainboard, which acts as both the motor controller and the brain of the robot. It is compatible with the Arduino IDE, making it easy for users to write code and take advantage of the Arduino programming environment. The board includes connectors for adding sensors and servos, allowing for a wide range of customizations like line following, obstacle detection, or sound. RedBot is designed for students, hobbyists, and beginners, and it comes with helpful tutorials and experiment guides. Overall, it’s a flexible and beginner-friendly system for building and programming small robots.
Video in Action
The Arduino-based RedBot follows a black line using infrared (IR) sensors mounted underneath it. These sensors detect the contrast between the dark line and the lighter surface by measuring reflected infrared light—black absorbs more IR, so it reflects less. The Arduino reads these sensor values and uses if-else logic to decide how to steer. If the sensor on one side sees black, it tells the robot to turn toward that side to stay on the line. This constant checking and adjusting lets the RedBot follow the path smoothly and accurately.
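The if-else steering logic reads well as a short sketch. The RedBot itself runs Arduino C++, so this is a Python rendering of the decision logic only; the sensor scale and the 800 threshold are assumptions (on many reflectance sensors a higher analog reading means a darker, less reflective surface).

```python
# Assumed analog scale: higher reading = less reflected IR = darker surface.
BLACK_THRESHOLD = 800

def steer(left_ir, right_ir):
    """Decide steering from two line-sensor readings using if-else logic."""
    left_on_black = left_ir > BLACK_THRESHOLD
    right_on_black = right_ir > BLACK_THRESHOLD
    if left_on_black and not right_on_black:
        return "turn_left"   # line drifted under the left sensor: steer left
    if right_on_black and not left_on_black:
        return "turn_right"  # line drifted under the right sensor: steer right
    return "forward"         # centered on the line: keep going straight
```

The Arduino repeats this check many times per second, which is what makes the corrections look smooth.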
JD HUMAN ROBOT
JD uses a variety of robots to improve efficiency in its e-commerce and logistics operations. In warehouses, robots handle sorting, picking, packing, and transporting goods, including the use of robotic arms and mobile robots. For deliveries, JD employs autonomous vehicles and drones that can navigate roads, avoid obstacles, and deliver packages without human contact. Some robots are used for specialized purposes, like hospital deliveries or mobile retail with cold storage. These systems are powered by AI and machine learning, allowing them to make real-time decisions and work alongside humans for faster and more accurate service.
We tried to code the robot to use facial recognition and detect a human in its vicinity. When it finally detects the human in the "burning building," it calls out to the human to follow it out the door and into safety. The robot was programmed to be able to detect and find the door in the room and walk out with the human following it.
We ran into hardships getting the robot to walk. His right kneecap was weaker than his left, which caused him to drift right due to the power imbalance. We worked our way through this by making the left leg's speed slower and increasing the right leg's speed. This change was a crucial part of helping our robot get the human to safety.
We were able to custom code the robot to recognize everyone individually and call out their name if it sees them in the room. Additionally, we added an emote for each person as something unique, because we thought it was fun in the moment (even though this is an intense situation!).
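The name-and-emote behavior boils down to a lookup keyed on the recognized face. This Python sketch is our own illustration, not the robot platform's actual API: the emote assignments are made up here, and a real face-recognition call would supply `person_name`.

```python
# Hypothetical roster mapping each team member to their emote; the real
# face-recognition step that produces `person_name` is not shown.
EMOTES = {
    "Rozzi": "wave",
    "Andy": "thumbs_up",
    "Jackie": "dance",
    "Dionysis": "salute",
    "Lennie": "bow",
}

def greet(person_name):
    """Return the call-out line and emote for a recognized person.

    Returns None for faces not in the roster, so the robot stays quiet
    rather than greeting a stranger by the wrong name.
    """
    if person_name not in EMOTES:
        return None
    return f"I see you, {person_name}! Follow me to the exit.", EMOTES[person_name]
```

In the actual run, the greeting was followed by the robot locating the door and leading the person out.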
Video of JD Saving Andy
The Team
Rozzi, Andy, Jackie, Dionysis, Lennie