Objective: The Zumi bot needs to use its infrared sensors to detect obstacles ahead so it can avoid them and take a safer alternate route, simulating an exit from a damaged building in a disaster scenario.
Code: The Zumi bot was programmed in Blockly, with the focus on teaching it different turning reactions when it detects an infrared signal on either the front-left or front-right sensor. Lower IR readings indicate a closer object, so the code is set to react whenever a sensor reading drops below 100, letting the Zumi bot respond to obstacles in its path.
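The actual program is a graphical Blockly script, but the threshold logic it implements can be sketched in plain C++. The 100-unit threshold and the turn directions follow the description above; the function and constant names are hypothetical:

```cpp
// Raw IR readings: lower values mean a closer object.
const int OBSTACLE_THRESHOLD = 100;

enum Action { FORWARD, TURN_LEFT, TURN_RIGHT };

// Decide a reaction from the two front IR readings.
// Front-right sees an obstacle -> turn left; front-left -> turn right.
// If both sensors fire at once, the front-right check wins here
// (an assumption; the Blockly script may order its blocks differently).
Action react(int frontLeftIR, int frontRightIR) {
    if (frontRightIR < OBSTACLE_THRESHOLD) return TURN_LEFT;
    if (frontLeftIR < OBSTACLE_THRESHOLD) return TURN_RIGHT;
    return FORWARD;
}
```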
Final Result: The Zumi can detect obstacles ahead and turns left or right depending on which sensor detects the obstacle first: if the front-right sensor is triggered, it turns left; if the front-left sensor is triggered, it turns right. However, the Zumi occasionally bumps into an obstacle before turning. This inconsistency comes from the fixed interval at which it polls the infrared sensors, combined with the varying distances between obstacles.
Objective: After assembling the Tamiya kit, we need to adapt it to navigate a course that simulates exiting a damaged building. The robot should incorporate an Arduino, an ultrasonic sensor, and a bumper sensor to perform the necessary tasks.
Code: The Tamiya robot's code is adapted from SparkFun's code template for Exercise 5C. The robot uses an Arduino Uno board programmed from a computer running the Arduino IDE. The code implements a fully autonomous robot with a front distance sensor for obstacle avoidance: if an object is detected ahead, the robot reverses and then turns by switching the directions of the two front motors.
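The reverse-and-turn behavior described above can be sketched as a host-runnable C++ control step. The 20 cm stop distance, the struct, and the function names are hypothetical stand-ins; the Exercise 5C template uses its own motor-driver calls and threshold:

```cpp
#include <vector>

// Motor command: +1 forward, -1 reverse, for the left and right motors.
struct MotorCmd { int left; int right; };

const long STOP_DISTANCE_CM = 20;  // hypothetical threshold

// One control step: drive forward unless the ultrasonic reading is close,
// in which case back away and then pivot by switching the two front
// motors to opposite directions, as the avoidance routine described above does.
std::vector<MotorCmd> step(long distanceCm) {
    if (distanceCm < STOP_DISTANCE_CM) {
        return { { -1, -1 },    // reverse away from the obstacle
                 { -1, +1 } };  // pivot: motors switched to opposite directions
    }
    return { { +1, +1 } };      // path clear: both motors forward
}
```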
Final Result: We built the base Tamiya bot following the instructions provided in the kit. We then worked out the Arduino attachment based on the earlier skill-building exercise, 5C, as described above. Once we confirmed that the Arduino was fully functional, we mounted and arranged all the necessary components on the Tamiya bot.
Objective: The Arduino-based robot must be capable of following lines beneath it using a line follower mechanism. Additionally, it needs to detect and avoid obstacles in front of it with an ultrasonic sensor.
Code: The code was written based on the SparkFun code templates 3B (distance sensor) and 5A (motor control). However, instead of 3B's distance sensors, our RedBot used three IR sensors that detect the black track made of tape. The main challenge was getting the robot to change direction and correct itself when it wandered off the black line. This was solved with a series of "if" statements and a relatively slow vehicle speed, which avoided sharp acceleration and compensated for sensor delays.
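The series of "if" statements can be sketched as a single steering decision over the three line-sensor readings. The function name and the boolean on-line inputs are hypothetical; the real sketch reads raw analog values from the RedBot's sensors and thresholds them against the dark tape:

```cpp
enum Steer { STRAIGHT, LEFT, RIGHT, SEARCH };

// Correct back toward the tape based on which sensor still sees it.
Steer follow(bool leftOnLine, bool centerOnLine, bool rightOnLine) {
    if (centerOnLine) return STRAIGHT;  // centered on the tape: keep going
    if (leftOnLine)   return LEFT;      // line under the left sensor: robot
                                        // drifted right, so steer left
    if (rightOnLine)  return RIGHT;     // line under the right sensor: robot
                                        // drifted left, so steer right
    return SEARCH;                      // line lost: keep searching
}
```

Running this decision slowly, as the text notes, keeps each correction small enough that the sensors are re-read before the robot overshoots the tape.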
Final Result: We first worked on the ultrasonic sensor, as shown in the first video. The sensor code turns the LED red when an object is detected in front of the robot. Next, we transferred the sensor assembly onto the robot base. With both the line-tracking code and the sensor in place, the Arduino-based robot successfully performs its tasks.
Objective: This AI-based humanoid robot is equipped with advanced camera sensors that enable it to detect and differentiate between various items, such as human faces and inanimate objects. It is designed to locate and identify victims in damaged buildings during disaster scenarios.
Code: The robot was programmed in Blockly with simple instructions: when the camera detects a face, it plays the phrase "I see a human." For extra flair, the robot also responds with a head bob.
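The Blockly program itself is graphical, but its event logic amounts to one conditional per camera frame, which can be sketched in C++. The struct and function names are hypothetical stand-ins for the robot's speech and gesture blocks:

```cpp
#include <string>

// What the robot does in response to one camera frame.
struct Response {
    std::string speech;  // phrase to play, empty if nothing detected
    bool headBob;        // whether to perform the head-bob gesture
};

// When the camera reports a face, queue the phrase and the head bob.
Response onFrame(bool faceDetected) {
    if (faceDetected) {
        return { "I see a human", true };
    }
    return { "", false };
}
```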
Final Result: After first establishing a connection with the robot and testing its multiple features, such as the camera, various motions, and audio capabilities, we allowed it to perform the programmed instructions. Even from a distance, the humanoid robot was able to detect a face and execute the actions as instructed.