I developed system identification software for a Betaflight racing drone. The software runs during a manual flight, using Vicon motion capture to collect the drone's position, orientation, joystick commands, and frames from the onboard camera. Using MuJoCo, I implemented a program that solves a finite-horizon nonlinear least squares problem over the live flight data to identify the drone's inner control loop, so that its behavior can be replicated in simulation.
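As a rough illustration of the fitting step (not the project's actual code), the sketch below swaps MuJoCo for a hypothetical first-order rate-loop model and fits its parameters to logged data with scipy.optimize.least_squares; all names and values here are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical first-order inner-loop model: the rate loop is approximated,
# per body axis, by omega' = (gain * omega_cmd - omega) / tau.
def simulate(params, omega0, omega_cmd, dt):
    gain, tau = params
    omega = np.empty_like(omega_cmd)
    w = omega0
    for i, cmd in enumerate(omega_cmd):
        w = w + dt * (gain * cmd - w) / tau   # forward-Euler rollout
        omega[i] = w
    return omega

def residuals(params, omega0, omega_cmd, omega_meas, dt):
    # Finite-horizon residual: simulated rates vs. rates measured in flight.
    return simulate(params, omega0, omega_cmd, dt) - omega_meas

# Synthetic data standing in for one axis of logged flight data.
dt = 0.002
t = np.arange(0.0, 2.0, dt)
omega_cmd = np.sin(2 * np.pi * t)                       # radio rate setpoints
omega_meas = simulate([1.0, 0.05], 0.0, omega_cmd, dt)  # stand-in measured rates

fit = least_squares(residuals, x0=[0.5, 0.1],
                    args=(0.0, omega_cmd, omega_meas, dt))
print("identified gain, tau:", fit.x)
```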
The graph on the left shows the output of the system identification. The blue line, labeled live, is the angular velocity measured during the manual flight. The green line, labeled radio, is the angular velocity requested by the radio commands sent to the drone, and the orange line, labeled sim, is the angular velocity achieved in simulation when the same radio commands are fed into the identified inner loop. The bottom plot follows the same format for linear acceleration, with an additional red line showing the drone's overall thrust.
I designed and implemented a real-time, multi-process Python framework for autonomous quadrotor control, featuring path planning, state estimation (a Luenberger observer), and control (LQR). This framework enabled the drone to fly a figure-eight path in both simulation and the real world, as demonstrated above.
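For reference, a minimal Luenberger observer for one position axis might look like the sketch below, assuming a double-integrator model with position-only measurements; the matrices and observer poles are placeholders, not the framework's actual values.

```python
import numpy as np
from scipy.signal import place_poles

# Illustrative discrete-time model for one axis: state [position, velocity],
# input commanded acceleration, measurement position only (e.g., from Vicon).
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
C = np.array([[1.0, 0.0]])

# Place observer eigenvalues inside the unit circle, faster than the plant.
L = place_poles(A.T, C.T, [0.5, 0.6]).gain_matrix.T

def observer_step(x_hat, u, y):
    # Predict with the model, then correct with the measurement residual.
    return A @ x_hat + B @ np.atleast_1d(u) + L @ (y - C @ x_hat)

x_hat = np.zeros(2)
x_hat = observer_step(x_hat, u=0.2, y=np.array([0.03]))
print(x_hat)
```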
The LQR gains for the differential-flatness-based position controller were tuned automatically using the identified inner-loop quadrotor model. Using position data from the Vicon motion capture system, the live drone executed the same flight path as in simulation with identical LQR control logic.
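The appeal of this approach is that once a linear model is in hand, the gains come out of a Riccati equation rather than hand-tuning. Below is a sketch of that computation using a placeholder double-integrator linearization (differential flatness makes the flat outputs behave like chained integrators), not the identified model itself.

```python
import numpy as np
from scipy.linalg import solve_discrete_are

# Placeholder linearized model and weights; the real gains would be computed
# from the identified inner-loop model instead.
dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt]])
Q = np.diag([10.0, 1.0])   # penalize position error more than velocity error
R = np.array([[0.1]])      # penalize control effort

# Discrete-time LQR: solve the Riccati equation, then form K for u = -K x.
P = solve_discrete_are(A, B, Q, R)
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
print("LQR gain:", K)
```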
In Fall 2024, I will integrate the onboard camera to enable embodied state estimation. This effort builds upon PhD candidate Levi Burner's work on Visuomotor Embodiment. Currently, the drone can recognize an AprilTag, as shown in the images to the left.
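As a sketch of what detection on an onboard frame can look like (assuming the pupil_apriltags bindings and placeholder camera intrinsics; the project's actual pipeline may differ):

```python
import cv2
import numpy as np
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")

# Stand-in for a live onboard camera frame; blank fallback keeps the sketch runnable.
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
if frame is None:
    frame = np.zeros((480, 640), dtype=np.uint8)

detections = detector.detect(
    frame,
    estimate_tag_pose=True,
    camera_params=(600.0, 600.0, 320.0, 240.0),  # fx, fy, cx, cy (assumed)
    tag_size=0.16,                               # tag edge length in meters (assumed)
)
for det in detections:
    # pose_R / pose_t give the tag pose in the camera frame, the raw
    # ingredient for the vision-based state estimate planned next.
    print(det.tag_id, det.pose_t.ravel())
```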
The next steps include deriving state estimates from the AprilTag detections to enable fully autonomous flight with onboard vision. Beyond that, the project will focus on having the drone identify and fly through a hula hoop, advancing its ability to navigate using only the camera and an embodied framework.