10 May 2019
A quadrotor drone equipped with a camera, a height sensor, and an IMU runs the Extended Kalman Filter (EKF) algorithm to fuse sensor data for autonomous navigation. The pose and velocity outputs from the vision algorithms (PnP and optical flow) are fused with the IMU and height data. In addition, planning algorithms were implemented: the A* search algorithm and a polynomial trajectory generator. The system pipeline is illustrated below.
System Pipeline
PnP (Perspective-n-Point) is the problem of estimating the pose of a calibrated camera from a set of n 3D points in the world and their corresponding 2D projections in the image. The tags on the floor (which look like QR codes) serve as prior knowledge: each encodes the pose of its corner points with respect to the world frame. The solvePnP() function from the OpenCV vision library was used.
The Lucas-Kanade (LK) optical flow algorithm was used as a velocity estimator, via the calcOpticalFlowPyrLK() function provided by OpenCV. The image feature points were sampled on a uniform 20x20 grid (400 points). The corresponding output points from the LK optical flow algorithm were then filtered in a three-step procedure:
1. Filter the feature points using the status flags output by the LK optical flow algorithm.
2. Filter the feature points using Random Sample Consensus (RANSAC).
3. Filter out feature points whose corresponding points are more than one pixel apart.
After the three-step filtering process, the average pixel velocity was obtained by averaging the pixel displacement vectors and dividing by the frame time difference. The average pixel velocity was then converted to the camera velocity by the relation below, where the left-hand side is the pixel velocity and the right-most vector is the camera velocity. By the small-motion assumption, v_z is set to 0, which leaves two equations in the two unknowns v_x and v_y.
*Note: f_u and f_v are intrinsic camera parameters.
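The relation appeared as a figure in the original post; a hedged reconstruction from the surrounding description (the standard pinhole flow model for a camera at height z above the ground plane, in one common sign convention) is:

```latex
\begin{bmatrix} \dot{u} \\ \dot{v} \end{bmatrix}
= \frac{1}{z}
\begin{bmatrix} -f_u & 0 & u \\ 0 & -f_v & v \end{bmatrix}
\begin{bmatrix} v_x \\ v_y \\ v_z \end{bmatrix}
```

With v_z = 0 this reduces to v_x = -z * u_dot / f_u and v_y = -z * v_dot / f_v; the signs depend on the chosen camera-axis convention.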
The implementation proceeded in two stages. The first stage was to implement an EKF fusing the IMU data with the PnP pose from the tag detector. The second stage built on the first by also fusing the optical flow velocity and the height measurement. The logic used in the final implementation is as follows:
1. If there is no velocity information from the optical flow, the EKF output comes from the IMU process model alone (i.e., acceleration integration).
2. If there is no pose information from the tag detector, fuse the height measurement and the optical flow velocity with the IMU data.
3. If there is pose information from the tag detector, fuse the pose (tag detector), velocity (optical flow), and IMU measurements.
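The three-branch logic above can be sketched as follows. The state layout is hypothetical, and a fixed placeholder gain stands in for the Kalman gain, so this is a complementary-filter-style illustration of the branching, not the actual EKF update.

```python
import numpy as np

def ekf_step(x, accel_world, dt, tag_pose=None, flow_vel=None, height=None):
    """One predict/update cycle following the three rules above.

    Hypothetical state layout: x = [position(3), velocity(3)].
    """
    x = x.copy()
    # Predict: integrate IMU acceleration (case 1 stops here).
    x[:3] += x[3:] * dt + 0.5 * accel_world * dt**2
    x[3:] += accel_world * dt

    alpha = 0.5                       # placeholder gain; a real EKF uses K = P H^T S^-1
    if flow_vel is not None:          # cases 2 and 3: fuse optical-flow velocity
        x[3:5] += alpha * (flow_vel - x[3:5])
        if height is not None:        # cases 2 and 3: fuse height measurement
            x[2] += alpha * (height - x[2])
    if tag_pose is not None:          # case 3: additionally fuse PnP position
        x[:3] += alpha * (tag_pose - x[:3])
    return x

# Case 2 example: hovering, no tag in view, flow and height available.
x = ekf_step(np.zeros(6), np.zeros(3), 0.02,
             flow_vel=np.array([1.0, 0.0]), height=1.0)
```

Each measurement simply pulls the corresponding state entries toward the observed value; in the real filter the pull strength comes from the innovation covariance rather than a constant.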
The clip on the left displays the estimated pose from the EKF (Red) and the ground truth pose from VICON (Blue).
The path planner and trajectory generator were implemented as online algorithms. The planner takes an array of vectors representing the start point, the end point, and the obstacle coordinates, and outputs a set of waypoints for the robot to pass through. The A* search algorithm then passes these waypoints to the trajectory generator, which fits a 5th-order polynomial trajectory and outputs smooth position and velocity commands online to the controller. The figure on the right shows a simulation of the output: the blue circle is the start point, the green circle is the target, and the red circles are the obstacles. The drone trajectory is shown as the continuous trail of coordinate axes.
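A single-axis segment of such a 5th-order trajectory can be sketched by solving for the six polynomial coefficients from boundary conditions on position, velocity, and acceleration at both ends. The segment duration and endpoints below are illustrative, not values from this project.

```python
import numpy as np

def quintic_coeffs(p0, pT, T, v0=0.0, vT=0.0, a0=0.0, aT=0.0):
    """Coefficients c0..c5 of p(t) = sum(c_i t^i) matching 6 boundary conditions."""
    A = np.array([
        [1, 0, 0,    0,      0,       0],        # p(0)
        [0, 1, 0,    0,      0,       0],        # p'(0)
        [0, 0, 2,    0,      0,       0],        # p''(0)
        [1, T, T**2, T**3,   T**4,    T**5],     # p(T)
        [0, 1, 2*T,  3*T**2, 4*T**3,  5*T**4],   # p'(T)
        [0, 0, 2,    6*T,    12*T**2, 20*T**3],  # p''(T)
    ], dtype=float)
    b = np.array([p0, v0, a0, pT, vT, aT], dtype=float)
    return np.linalg.solve(A, b)

# Move from 0 to 1 m in 2 s, starting and ending at rest.
c = quintic_coeffs(0.0, 1.0, 2.0)
ts = np.linspace(0.0, 2.0, 5)
pos = np.polyval(c[::-1], ts)                 # position commands
vel = np.polyval(np.polyder(c[::-1]), ts)     # velocity commands
```

Pinning velocity and acceleration to zero at both ends is what makes consecutive waypoint segments join smoothly, which is why a quintic (six degrees of freedom) is the lowest order that fits all six constraints.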