Designing a Distributed Multi-UAV Platform with On-Board Intelligence
INTRODUCTION
Unmanned Aerial Vehicles (UAVs), commonly known as drones, are aircraft that can be remotely piloted by a human operator or guided autonomously by onboard computers. Many control architectures cluster a group of drones around a single host device, which concentrates the decision-making burden on that host, creates a single point of failure, and demands substantial computing power. This research project explores distributing control across a network of drones, each capable of making decisions independently and collaborating with the others to accomplish tasks efficiently without interference. The drones gather local data with their sensors and communicate with one another to determine the actions and movements best suited to their objectives. This is achieved by first engineering drones with enough on-board computational capacity for independent decision-making, then embedding autonomous navigation algorithms, and finally linking the drones wirelessly so they can coordinate their actions. Wireless communication between the drones is provided by the Wi-Fi module of the on-board processor (an Arduino Giga R1) integrated with a flight controller (a Pixhawk 6C). This eliminates the need for a ground control station: each drone carries its own navigation algorithms and on-board decision-making intelligence, through which the fleet can exchange data and reach a collective decision.
Mentor - Dr. Ran Zhang (rzhang8@charlotte.edu)
Location - SmartNet Laboratory, EPIC 2378, UNC Charlotte
The first UAV has been built on a Holybro X500 V4 frame with 1700 KV brushless DC motors powered by a 14 V 8000 mAh LiPo battery. I successfully executed an autonomous flight using the guided mode of the Pixhawk 6C via the MAVLink interface. By leveraging the capabilities of the Pixhawk 6C, I was able to precisely control the UAV's flight path, ensuring accurate waypoint navigation and mission execution. The MAVLink protocol provided seamless communication between the UAV and the Giga R1, where the navigation algorithms run, allowing real-time adjustments and monitoring throughout the flight. The UAV also supports other flight modes such as Position Hold, Altitude Hold, and Return to Launch.
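The core of the guided-mode loop described above is deciding when a waypoint has been reached and which target to steer to next. The sketch below illustrates that logic in Python under stated assumptions (the on-board code is an Arduino sketch; the `WaypointMission` name and the 2 m acceptance radius are illustrative, not taken from the project):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

class WaypointMission:
    """Tracks progress through an ordered list of (lat, lon) waypoints."""

    def __init__(self, waypoints, acceptance_radius_m=2.0):
        self.waypoints = list(waypoints)
        self.radius = acceptance_radius_m
        self.index = 0

    def update(self, lat, lon):
        """Advance past any waypoint within the acceptance radius;
        return the active target, or None when the mission is complete."""
        while self.index < len(self.waypoints):
            tgt = self.waypoints[self.index]
            if haversine_m(lat, lon, tgt[0], tgt[1]) <= self.radius:
                self.index += 1  # waypoint reached, move to the next one
            else:
                return tgt
        return None
```

In the real system, each target returned by `update` would be forwarded to the Pixhawk as a guided-mode position setpoint over MAVLink.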
Commands to the Pixhawk 6C are sent by the Giga R1 as MAVLink messages, with periodic heartbeats keeping the link alive and the requested data streams buffered for processing. The Giga R1 also carries several tested, pre-defined mapping routines that survey an area as a grid, among other methods. A Pixy2 camera has been integrated with the Giga R1 for object identification, feeding a set of intelligence algorithms. The Pixy2 board processes captured images on its own hardware when it detects a block, reducing the computational load on the Arduino.
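The grid-survey routines mentioned above amount to sweeping the area row by row. A minimal sketch of such a generator, assuming a simple boustrophedon ("lawnmower") pattern on a fixed-pitch grid (the function name and degree-based spacing are illustrative, not the project's actual implementation):

```python
def grid_survey(lat0, lon0, rows, cols, spacing_deg):
    """Generate a boustrophedon (lawnmower) sweep over a rows x cols grid.

    lat0/lon0 anchor the south-west corner; spacing_deg is the cell pitch
    in decimal degrees. Returns waypoints as (lat, lon) tuples, with every
    other row traversed in reverse so the path never doubles back.
    """
    waypoints = []
    for r in range(rows):
        cells = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cells:
            waypoints.append((round(lat0 + r * spacing_deg, 7),
                              round(lon0 + c * spacing_deg, 7)))
    return waypoints
```

The resulting list can be fed directly into the waypoint-following loop, so a survey mission is just another sequence of guided-mode targets.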
Inter-drone communication is handled by the Arduino Giga R1 boards, which establish a direct local network without relying on an internet connection. Information is exchanged over UDP using a custom-developed GigaComms library that supports both asynchronous and synchronous data transmission. Integration with MAVLink, a widely used protocol for drone communication, allows retrieval of critical flight data such as battery voltage, altitude, and GPS coordinates, as well as control of drone navigation by setting waypoints. Additionally, coupling the Giga R1 boards with Pixy2 cameras lets the drones make autonomous navigation decisions from visual data, further extending the system's capabilities. Together, these components ensure that the drones not only communicate effectively with each other but also respond dynamically to their environment, making the system adaptive and versatile for complex missions.
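The UDP exchange can be pictured as each drone broadcasting a small telemetry datagram that its peers decode. Below is a minimal Python sketch of that idea under stated assumptions: the JSON message shape, field names, and loopback demo are illustrative and do not reflect the actual GigaComms wire format:

```python
import json
import socket

def pack_telemetry(drone_id, battery_v, altitude_m, lat, lon):
    """Serialise one telemetry sample as a compact JSON datagram."""
    msg = {"id": drone_id, "batt": battery_v, "alt": altitude_m,
           "lat": lat, "lon": lon}
    return json.dumps(msg).encode("utf-8")

def unpack_telemetry(datagram):
    """Decode a telemetry datagram back into a dict."""
    return json.loads(datagram.decode("utf-8"))

if __name__ == "__main__":
    # Loopback demo: one socket stands in for each drone.
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))          # OS picks a free port
    rx.settimeout(2.0)
    port = rx.getsockname()[1]
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.sendto(pack_telemetry("uav-1", 14.8, 12.5, 35.3071, -80.7352),
              ("127.0.0.1", port))
    data, _ = rx.recvfrom(1024)
    print(unpack_telemetry(data)["id"])  # prints: uav-1
    tx.close()
    rx.close()
```

On the real hardware the same pattern runs over the Giga R1's Wi-Fi module, with battery, altitude, and GPS values pulled from the Pixhawk via MAVLink before each broadcast.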
I designed and implemented a multi-robot coordination system using the Robot Operating System (ROS), focused on autonomous path planning, target searching, and real-time collaboration in complex environments.
Path Planning & Autonomy: Developed algorithms that enabled UAVs to navigate and search for targets efficiently while avoiding obstacles.
Decentralized Communication: Implemented and tested protocols for UAVs to share live position, odometry, and sensor data, allowing cooperative decision-making without relying on a centralized controller.
Simulation & Visualization: Validated system performance in Gazebo simulations with RViz visualization, ensuring accurate representation of multi-robot interactions.
Computer Vision Integration: Enhanced the UAV perception system by integrating a custom fine-tuned YOLOv12 object detection model into the ROS pipeline, enabling real-time object recognition and target tracking.
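One way to see how shared position data enables cooperative decisions without a central controller: if every UAV runs the same deterministic allocation over the same broadcast state, they all derive identical assignments independently. The sketch below illustrates this with a greedy nearest-target rule; the function name and 2-D coordinates are illustrative assumptions, not the project's actual allocation algorithm:

```python
import math

def assign_targets(uav_positions, targets):
    """Greedy decentralized task allocation.

    uav_positions maps uav_id -> (x, y); targets is a list of (x, y).
    Every UAV runs this same function on the same shared state (positions
    broadcast over the network), so all nodes derive identical assignments
    with no central controller. Returns {uav_id: target_index}.
    """
    unclaimed = set(range(len(targets)))
    assignment = {}
    # Iterating in sorted id order makes the result deterministic,
    # which is what guarantees every node agrees.
    for uav_id in sorted(uav_positions):
        if not unclaimed:
            break
        ux, uy = uav_positions[uav_id]
        best = min(unclaimed,
                   key=lambda t: math.hypot(targets[t][0] - ux,
                                            targets[t][1] - uy))
        assignment[uav_id] = best
        unclaimed.discard(best)
    return assignment
```

In the ROS system, the shared state would come from the position and odometry topics each UAV publishes, and the chosen target would feed the path planner.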