For our CPRE 2880: Embedded Systems I: Introduction group project, we developed a semi-autonomous, remotely controlled robot designed to navigate and map a test field while interacting with specific objects and boundaries. Our goal was to implement a functional autonomous vehicle (AV) capable of using sensors to detect obstacles, avoid hazards, and stay within defined field boundaries, while updating a graphical user interface (GUI) with real-time position and field data. The project required integrating several components, including bump and cliff sensors for hazard detection and the ADC, servo, and PING and IR sensors for scanning, along with remote-control and decision-making logic that let the robot both navigate independently and respond to commands. Our work involved both hardware and software tasks, such as writing and calibrating code for object detection and mapping and ensuring reliable performance under battery constraints.
Each group was tasked with developing a unique application scenario for their autonomous vehicle (AV) project. Our team chose a farming scenario, inspired by the needs of modern agriculture. Our AV was designed to map a field remotely for a farmer unable to be present on-site, providing a visual overview through a user interface (UI). The robot's role was to survey the field, avoid obstacles like trees and rocks, and stay within designated boundaries, all while capturing relevant field data. Given the simulated 30-minute battery life, the AV needed to balance mapping efficiency with power conservation, attempting to return to a charging station before the battery was depleted.
My primary role was to develop and calibrate the scan code that enabled our CyBot to detect and measure objects on the test field. I used the ADC, servo, and PING and IR sensors to achieve accurate scanning, and I spent over 12 hours refining this code to ensure it could reliably identify objects and their widths. My work was crucial for the robot’s navigation and object-avoidance capabilities, as it allowed the CyBot to assess its surroundings accurately within our field-mapping scenario. Through this experience, I learned how essential it is to fully commit to my specific tasks to strengthen the team’s overall performance and to make sure other team members can depend on my work.
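A simplified sketch of the scan logic is shown below. The driver functions (servo_move, ir_adc_read, ping_distance_cm), the calibration fit, and the threshold values are illustrative stand-ins for the code and constants our team actually used, not the real implementation.

#include <stdint.h>
#include <stdio.h>
#include <math.h>

/* Hypothetical driver hooks -- stand-ins for our servo, ADC/IR, and PING code. */
extern void  servo_move(int degrees);      /* rotate the scan servo to an angle    */
extern int   ir_adc_read(void);            /* raw ADC value from the IR sensor     */
extern float ping_distance_cm(void);       /* echo-time distance from the PING     */

#define PI_F            3.14159265f
#define SCAN_STEP_DEG   2
#define EDGE_THRESH_CM  60.0f   /* readings closer than this count as "object" */

/* Convert a raw IR ADC value to centimeters. The coefficient is a placeholder;
 * in practice the curve comes from measuring known distances and fitting the
 * ADC readings. */
static float ir_adc_to_cm(int raw)
{
    if (raw <= 0) return 1e9f;          /* guard against a bad reading */
    return 1.0e5f / (float)raw;         /* placeholder inverse fit     */
}

/* Sweep the servo 0-180 degrees, flag contiguous runs of "close" IR readings
 * as objects, and estimate each object's width from its angular span and the
 * closest PING distance:  width ~ 2 * d * tan(span / 2). */
void scan_field(void)
{
    int   in_object   = 0;
    int   start_angle = 0;
    float min_dist    = 1e9f;

    for (int angle = 0; angle <= 180; angle += SCAN_STEP_DEG) {
        servo_move(angle);

        float ir_cm   = ir_adc_to_cm(ir_adc_read());
        float ping_cm = ping_distance_cm();
        /* IR is noisier at range, so the PING value is used for distance and
         * the IR reading mainly confirms that something is there. */
        float dist = ping_cm;

        if (ir_cm < EDGE_THRESH_CM && !in_object) {
            in_object   = 1;                    /* leading edge of an object */
            start_angle = angle;
            min_dist    = dist;
        } else if (in_object && (ir_cm >= EDGE_THRESH_CM || angle == 180)) {
            int   span_deg = angle - start_angle;       /* angular width     */
            float span_rad = span_deg * PI_F / 180.0f;
            float width_cm = 2.0f * min_dist * tanf(span_rad / 2.0f);

            /* In the project this result went to the GUI over UART;
             * printf stands in here for illustration. */
            printf("Object: %d-%d deg, dist %.1f cm, width %.1f cm\r\n",
                   start_angle, angle, min_dist, width_cm);
            in_object = 0;
        } else if (in_object && dist < min_dist) {
            min_dist = dist;            /* track the closest point on the object */
        }
    }
}

Estimating width from the angular span and the closest distance (width ≈ 2 · d · tan(span/2)) is one simple way to turn a servo sweep into object widths, which is the kind of calculation my calibration work was meant to make reliable.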
Farm Application Diagram
The CyBot Robot
CyBot Code Logic Diagram
Through this project, I gained hands-on experience in embedded systems, coding, and sensor integration, which are crucial skills for engineering. I developed and tested scan code that utilized the ADC, servo, and PING and IR sensors to scan the field and detect objects, spending extensive time calibrating the system for accuracy. I also learned how to manage multiple components within an autonomous vehicle, balancing navigation, communication, and sensor data handling. The project strengthened my understanding of teamwork, especially the importance of dividing tasks based on strengths and ensuring each part is thoroughly tested. Additionally, I honed my problem-solving skills, troubleshooting issues in real-time to optimize the robot's performance in the test field. This experience provided insight into the challenges of integrating different systems and the iterative nature of engineering design.
The scan code I wrote relied on the ADC, servo, and PING and IR sensors to detect objects and measure distances, requiring me to reference datasheets and troubleshooting guides to ensure accurate sensor readings. Additionally, I consulted course materials, documentation, and team discussions to refine the robot’s communication protocols and ensure seamless integration of the platform components. These resources, along with collaboration with teammates and support from TAs, were instrumental in successfully completing the project.