Hardware Developers: Atharva Shaligram, Abigail Izzo
Software Developers: Jason McCauley, Eli Shtindler, Chris Spadavecchia
The robot's architecture is built around the TurtleBot 2 platform, providing a robust foundation for autonomous navigation. For computer vision, it utilizes an Intel RealSense Depth Camera D435, which captures depth and color data to map the environment and detect weeds. A LiDAR sensor complements the depth camera by enhancing localization accuracy and enabling precise obstacle detection and avoidance. To support computationally intensive tasks, such as real-time navigation and sensor integration, the NVIDIA Jetson™ TX2 NX acts as the primary motherboard, running the software stack, including ROS.
The OpenCR board handles low-level hardware control, managing the motors and wheels for precise positioning and movement. A HooToo Shuttle USB hub provides external storage and seamless connectivity between the various components. This hardware architecture allows the robot to effectively navigate its environment, detect weeds, and remove them autonomously.
The software architecture utilizes Robot Operating System (ROS) to handle obstacle detection and navigation for the robot. The NVIDIA Jetson™ TX2 NX acts as a central processing unit, reading data from an Intel RealSense Depth Camera and LiDAR sensor to map the surrounding environment, detect obstacles, and navigate safely. The camera data is also sent to a cloud server, where the YOLO algorithm processes the images to identify weeds. Results from the cloud are subsequently transmitted back to the NVIDIA Jetson™ TX2 NX, and the weed detection data is integrated with the obstacle avoidance algorithm to determine the robot's next actions.
Once a path is determined, the NVIDIA Jetson™ TX2 NX communicates with the OpenCR board, which controls the robot's motors and wheels to move accordingly, such as driving towards the weed or avoiding obstacles. The separation of low-level motor control from high-level image processing ensures not only efficiency but also modularity, allowing the robot to autonomously navigate and position itself for weed removal. Additionally, all software components are thoroughly tested in simulation environments, such as Gazebo, before deployment, ensuring smooth integration and performance in the real world.
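The decision step described above, combining cloud weed-detection results with obstacle avoidance to choose the robot's next action, can be sketched as follows. This is only an illustrative sketch: the `Detection` structure, the confidence threshold, and the command strings are assumptions, not the actual ROS node or OpenCR interface.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One detection result returned from the cloud (illustrative structure)."""
    label: str          # e.g. "weed" or "plant"
    confidence: float   # detection confidence, 0.0-1.0
    bearing_deg: float  # angle of the detection relative to the robot's heading

def decide_action(detections, obstacle_ahead, min_confidence=0.8):
    """Pick the next high-level command for the motor controller.

    Obstacle avoidance takes priority over weed pursuit; weed detections
    below the confidence threshold are ignored. The threshold and command
    names are assumed example values.
    """
    if obstacle_ahead:
        return "avoid_obstacle"
    weeds = [d for d in detections
             if d.label == "weed" and d.confidence >= min_confidence]
    if not weeds:
        return "continue_patrol"
    # Steer toward the most confident weed detection.
    target = max(weeds, key=lambda d: d.confidence)
    return f"drive_to_bearing:{target.bearing_deg:.1f}"
```

For example, a confident weed detection with no obstacle ahead yields a drive command toward that weed, while any obstacle overrides weed pursuit.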
Project Management:
All files and documents are contained in a shared Google Drive folder, making it easy for the team to collaborate. Additionally, an iMessage group chat is used for regular communication, whether to discuss outstanding work or to schedule meetings outside of class. The team typically meets in person three times a week to discuss progress and any concerns that arise.
Fall Semester (9/3 - 12/13):
10/09 Milestone 1
Customers
Problems
Needs
Requirements
11/27 Milestone 2
Project Plan
Roles and Responsibilities
Work Breakdown Structure
Concepts
Hardware
Product Formulation
Design
Analysis
Test Plan
Planning and Research
Presentation
Winter Break:
Dataset Creation
Utilize tools like Roboflow to create, label, and preprocess datasets tailored for distinguishing between weeds and desirable plants.
AI Model Training
Employ Edge Impulse or similar platforms to train machine learning models optimized for edge devices.
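Roboflow can export annotations in YOLO format, where each line of a label file holds a class index followed by a bounding box (`class x_center y_center width height`) with coordinates normalized to the image size. A minimal parser sketch of that format is shown below; the two-class `("weed", "plant")` mapping is an assumed example, not our final label schema.

```python
def parse_yolo_labels(text, class_names=("weed", "plant")):
    """Parse YOLO-format annotation lines into a list of box dicts.

    Each line: "<class_id> <x_center> <y_center> <width> <height>",
    with all coordinates normalized to [0, 1]. The class_names mapping
    is an assumed example schema.
    """
    boxes = []
    for line in text.strip().splitlines():
        parts = line.split()
        if len(parts) != 5:
            continue  # skip malformed lines
        class_id = int(parts[0])
        x, y, w, h = map(float, parts[1:])
        boxes.append({
            "label": class_names[class_id],
            "x_center": x, "y_center": y,
            "width": w, "height": h,
        })
    return boxes
```

A label file containing `0 0.5 0.5 0.2 0.3` would parse to a single "weed" box centered in the image.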
Spring Semester (1/21 - 5/9):
Milestone 3
Testing Plan
Poster draft
Verification Stage
Prototype Testing
Modification
Beta Stage
Implement Changes
Poster Development
Poster Mid-Project Submission
Poster Final Submission
Milestone 4
Testing plan
Discuss future implementations
Senior Innovation Expo (May 9, 2025)
Acceptance Criteria
Autonomous Weeding
The robot is fully autonomous in identifying and removing weeds without requiring user intervention.
It consistently achieves a high success rate in eliminating weeds while minimizing errors.
Advanced Image Processing
The robot leverages advanced computer vision algorithms to distinguish between desirable plants and weeds with high precision.
It can adapt to various plant types, shapes, and growth stages to avoid disturbing or damaging favorable plants.
Durability and Adaptability
The robot can navigate uneven terrain and avoid obstacles while maintaining stable performance.
Eco-Friendly Design
Its weed removal mechanism avoids using harmful chemicals, promoting a sustainable gardening approach.
Success Criteria
Weeding Performance
Accuracy: The robot achieves at least 95% accuracy in distinguishing weeds from desirable plants across diverse plant species and growth stages.
Efficiency: The robot consistently removes weeds with minimal error in real-world garden conditions.
Image Processing Capability
Accuracy: It maintains high performance with minimal false positives or false negatives when classifying plants.
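The 95% accuracy target and the false-positive/false-negative criteria above can be made measurable by tallying outcomes during testing. A small sketch of the metric computation follows; the counts used in the example below are made-up illustration values, not measured results.

```python
def classification_metrics(tp, fp, tn, fn):
    """Compute accuracy, precision, and recall for weed classification.

    tp: weeds correctly flagged; fp: desirable plants wrongly flagged as
    weeds (false positives); tn: plants correctly left alone; fn: weeds
    missed (false negatives).
    """
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,
        "precision": tp / (tp + fp) if tp + fp else 0.0,
        "recall": tp / (tp + fn) if tp + fn else 0.0,
    }
```

For instance, with 96 weeds correctly flagged, 2 plants misflagged, 95 plants correctly ignored, and 4 weeds missed, accuracy is 191/197, which clears the 95% threshold.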
Out of Scope
Charging Capabilities and Docking Station
The design and development of the robot's charging system and autonomous docking station are beyond the scope of this project.
Mobile Application Interface
The creation of an integrated mobile app or user interface for controlling or monitoring the robot will not be included.
Setup Process
Developing a streamlined or automated setup process for end-users is not within the scope of this product.
Assumptions
We assume that our primary target audience consists of elderly gardening enthusiasts who face challenges in performing weeding tasks due to the risk of physical strain or injury.
We assume that the garden terrain is completely level, with no significant slopes or uneven surfaces.
Dependencies
We will not be developing or designing the processing units used in our robot. Instead, we will rely on third-party providers to supply high-quality processors that meet our performance requirements.
We will not be designing or manufacturing the wheels for our robot's navigation. Instead, we will source high-quality wheels from third-party providers and select the most suitable option for our project needs.
Constraints
Our project is scheduled for completion by late April 2025.
Our project must be cost-effective, ensuring all expenses remain within the allocated budget.
Based on our analyses, we have decided to use a mechanical blade, like a weed whacker, to eliminate weeds. We believe it is the simplest yet most efficient method for killing weeds within the time we have.
Chemical sprays are easily crossed off the list because of their harm to the environment, a downside that many of our competitors, such as Dandy, overlook.
Lasers are also crossed off the list due to their inherent dangers and costs, as well as the time commitment required to implement them. We only have two semesters to complete this project and a limited budget, which provides neither enough time nor enough money to install fully functioning lasers.
Finally, mechanically pulling out the weeds was crossed off the list. While this method has its advantages, particularly killing weeds at the root, we believe it would be too costly and complex to implement because of the extreme precision required. If the robot's detection is off by even an inch, a pulling mechanism will miss the weed entirely, whereas a mechanical blade leaves a much larger margin for error.
Parts List:
System Architecture Diagram:
Activity Flowchart:
Hardware:
Software:
The team will have several testing phases:
Hardware Testing (early spring):
Sensor accuracy: Ensure the sensors accurately detect obstacles ahead
Weed whacking mechanism: Ensure the weed whacker can cut through tougher weeds and spins when prompted
Software Testing (early spring):
Weed detection: Ensure the robot accurately identifies weeds
Communication: Ensure fast and reliable communication between the robot and the cloud server
Navigation and obstacle avoidance: Ensure the robot can navigate around obstacles and to weeds using simulations
System Integration Testing (mid to late spring):
Navigation and obstacle avoidance: Ensure the robot can navigate through the rough terrain towards intended targets
Weed killing: Ensure the robot can detect weeds, alter its route to drive towards the weeds, and kill the weeds with the weed whacker