Our original goal was to design a fixed-wing system that could use computer vision and depth perception to avoid vehicles. However, we realized early on that depth perception would be impractical: at the speeds our planes fly, detecting objects at a useful distance would have required prohibitively expensive sensors.
We chose to cut scope to a system that could instead identify a tarp (or other object of interest) and drop a payload on it. Our finished solution flew autonomously using the PX4 autopilot, detected the tarp with OpenCV and an Arducam, and released a small payload with a servo-based drop mechanism we built ourselves.
We fulfilled all of our design criteria with the exception of autonomous landing. However, there are still issues with the autopilot and object detection that prevent our system from working consistently.
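For reference, a servo release like ours can be triggered over MAVLink with a DO_SET_SERVO command. The sketch below uses pymavlink; the connection string, servo output channel, and PWM value are placeholders, not our actual wiring:

```python
# Minimal sketch: triggering a servo-based payload release over MAVLink
# using pymavlink. The connection string, servo channel, and PWM value
# are placeholders -- adjust for your own wiring and airframe.
from pymavlink import mavutil

# Connect to the flight controller (e.g., over a telemetry link).
master = mavutil.mavlink_connection("udp:127.0.0.1:14550")
master.wait_heartbeat()

def release_payload(servo_channel=6, open_pwm=1900):
    """Command MAV_CMD_DO_SET_SERVO to swing the release servo open."""
    master.mav.command_long_send(
        master.target_system,
        master.target_component,
        mavutil.mavlink.MAV_CMD_DO_SET_SERVO,
        0,                 # confirmation
        servo_channel,     # param1: servo output number
        open_pwm,          # param2: PWM value in microseconds
        0, 0, 0, 0, 0)     # params 3-7 unused

release_payload()
```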
[Figure: First autonomous flight]
[Figure: Airdrop accuracy trial results on final flight day]
[Figure: Plane #1 nosediving into turf]
[Figure: The aftermath of plane #1 nosediving into turf]
On the software side there were many compatibility issues. Because QGroundControl, PX4, Gazebo, and ROS 2 are all developed independently, it is easy to end up on the wrong Linux distro or toolchain for one of them, which made the learning curve very steep when trying to run flight simulations in Gazebo.
A big portion of this project was also rebuilding the plane, since we needed to replace the flight controller. None of our group members had experience with the Pixhawk, so a great deal of trial and error went into configuring the PX4 firmware and tuning roll, pitch, and yaw to get a flyable, stabilized plane.
Initially we tried flying without stabilization, which was a nightmare. Once we added flight stabilization, considerable tweaking was needed to keep the plane from overcorrecting; early on the plane would porpoise in the air because our P gains were tuned too high.
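As an illustration, PX4's fixed-wing rate controller exposes its pitch- and roll-rate proportional gains as the FW_PR_P and FW_RR_P parameters, which can be written over MAVLink. The values below are placeholders, not our final tune:

```python
# Illustrative sketch: lowering PX4 fixed-wing rate-controller P gains
# over MAVLink to tame porpoising. The values are placeholders, not our
# final tune -- change gains in small steps between test flights.
from pymavlink import mavutil

master = mavutil.mavlink_connection("udp:127.0.0.1:14550")
master.wait_heartbeat()

def set_param(name, value):
    """Write a float parameter and wait for the autopilot's echo."""
    master.mav.param_set_send(
        master.target_system, master.target_component,
        name.encode("ascii"), value,
        mavutil.mavlink.MAV_PARAM_TYPE_REAL32)
    ack = master.recv_match(type="PARAM_VALUE", blocking=True, timeout=5)
    if ack:
        print(ack.param_id, ack.param_value)

set_param("FW_PR_P", 0.06)  # pitch-rate proportional gain (placeholder)
set_param("FW_RR_P", 0.05)  # roll-rate proportional gain (placeholder)
```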
On the hardware side, there was also a fair amount of tinkering with the stock ESC to make sure it would arm properly, and because of compatibility issues between the stock Spektrum receiver and the PX4 firmware, we replaced the receiver with an FrSky D4R-II and a Taranis transmitter.
PX4's autonomous landing required a terrain-detection sensor, which our airframe lacked. As a result, the plane currently flies its mission autonomously, but once it enters the landing glide slope it needs manual intervention to touch down; otherwise it aborts the landing and goes around.
The autopilot also needs fairly specific conditions to work properly. With any significant wind the plane struggled to reach its defined waypoints, and under different lighting conditions it could not identify the tarp reliably because the tarp's apparent colors changed.
Our computer vision component currently relies solely on finely tuned HSV values to identify the tarp. If the lighting shifts even slightly from the conditions we tuned for (direct winter sunlight between 12pm and 4pm), the object detection becomes very hit or miss.
[Figure: Raw Arducam image showing color distortion]
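As a rough illustration, a minimal HSV-threshold detector looks something like the sketch below; the color bounds and coverage threshold are placeholders, not our tuned values:

```python
# Minimal sketch of HSV-threshold tarp detection with OpenCV. The
# bounds below are illustrative placeholders, not our tuned values --
# in practice they were extremely sensitive to lighting.
import cv2
import numpy as np

frame = cv2.imread("arducam_frame.jpg")      # placeholder input image
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Keep only pixels whose hue/saturation/value fall inside the band
# associated with the tarp's color.
lower = np.array([100, 120, 80])             # placeholder bounds
upper = np.array([130, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# Treat the tarp as "detected" if enough pixels survive the threshold.
coverage = cv2.countNonZero(mask) / mask.size
print(f"mask coverage: {coverage:.3%}")
detected = coverage > 0.01                   # placeholder threshold
```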
If we had additional time, we would add more filters and post-processing to the computer vision system. We attempted to apply a Gaussian blur to reduce image noise (which was causing false positives), but we never tuned it fully. It would also be worth exploring contour detection to pick out the tarp's rectangular shape, since most dirt patches show up as irregular splotches on the ground.
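A combined blur-plus-contour pass might look like the sketch below; the kernel size, area cutoff, and polygon-approximation tolerance are all assumptions that would need tuning against real flight imagery:

```python
# Sketch of the post-processing we'd explore next: blur before
# thresholding so sensor noise doesn't produce speckled false
# positives, then keep only large, roughly four-sided contours so
# irregular dirt splotches are rejected. Kernel size, area cutoff,
# and approximation tolerance are assumptions, not tested values.
import cv2
import numpy as np

frame = cv2.imread("arducam_frame.jpg")          # placeholder input
blurred = cv2.GaussianBlur(frame, (5, 5), 0)     # kernel: assumption
hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv, np.array([100, 120, 80]),
                   np.array([130, 255, 255]))    # placeholder bounds

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
tarp = None
for c in contours:
    if cv2.contourArea(c) < 500:                 # area cutoff: assumption
        continue
    approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
    if len(approx) == 4:                         # roughly rectangular
        tarp = approx
        break
```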
Originally, we believed our plane would be too slow and not maneuverable enough for meaningful obstacle avoidance. However, this was only true with the original flight controller, which limited flap movement. With our Pixhawk 4 we could have added more robust flight controls beyond basic point-A-to-point-B flight.