Simultaneous Localization and Mapping (SLAM)

Development of a test-bed for vision-based Simultaneous Localization and Mapping (SLAM)

The system we developed is constructed from commercial off-the-shelf parts and custom 3D-printed components that adapt it to our purposes.

  • First, we added guards for the large propellers, printed in PETG, a strong and flexible plastic.
  • Next, we added a larger canopy that can hold our chosen onboard computer, an Intel NUC (Next Unit of Computing).
    • The Intel NUC is powerful, with a Core i7 processor comparable to the chips in state-of-the-art laptops. This enables rapid development: any software that runs on a laptop can run directly on the drone, with no embedded development required.
    • The Intel NUC also provides USB ports for attaching cameras and 3D sensors such as the Intel RealSense and the Orbbec Astra.
      • The Intel RealSense uses stereo IR vision, while the Orbbec Astra uses an IR projector and structured light. These sensors, coupled with the sensors on the Pixhawk, make this an ideal platform for vision-based SLAM.
  • For ground truth, we use an OptiTrack IR motion-capture system, which provides position and attitude data for the vehicle at 100 Hz.
  • A Windows-based computer running the motion-capture software sends network packets that are received by ROS (Robot Operating System) clients. The pose estimates from the SLAM algorithm and the ground-truth data from OptiTrack can then be logged and compared using the logging tools in ROS.
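Once both trajectories are logged, the comparison step reduces to computing an error metric between time-aligned pose sequences. The sketch below is a minimal, hypothetical version of that step: the trajectory samples are made-up numbers, and in practice the data would come from ROS bag logs rather than inline lists.

```python
import math

def pose_rmse(ground_truth, estimate):
    """Root-mean-square position error between two time-aligned
    trajectories, each a list of (x, y, z) tuples (meters)."""
    assert len(ground_truth) == len(estimate)
    sq_errors = [sum((g - e) ** 2 for g, e in zip(gt, est))
                 for gt, est in zip(ground_truth, estimate)]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

# Hypothetical samples: OptiTrack ground truth vs. SLAM estimate
gt  = [(0.00, 0.00, 1.00), (1.00, 0.00, 1.00), (1.00, 1.00, 1.00)]
est = [(0.02, -0.01, 1.01), (0.98, 0.03, 0.99), (1.04, 0.97, 1.02)]
print(round(pose_rmse(gt, est), 4))  # RMSE in meters
```

Real logs would first need time alignment (e.g., interpolating the 100 Hz mocap stream to the SLAM timestamps) before the metric is meaningful.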

State-of-the-art sensors

OptiTrack Motion Capture

Invariant Extended Kalman Filter (IEKF) Application to Optical Flow Based Visual Odometry for UAVs

We have been investigating the performance benefits of using an Invariant Extended Kalman Filter for optical-flow-based navigation. For embedded systems like the PX4 autopilot, processing power is constrained, and it is often not feasible to run the covariance prediction calculation at the rate at which measurements are received. By employing invariant theory, the estimator equations can be designed so that the evolution of the covariance is more autonomous (dependent only on time).
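The computational difference can be sketched with a toy discrete-time covariance prediction. The matrices and state trajectory below are illustrative assumptions, not the actual PX4 filter: the point is only that a standard EKF Jacobian depends on the current state estimate and must be recomputed every step, while an invariant formulation can have a constant Jacobian, so the covariance evolves autonomously and can be propagated (or even precomputed) without the state.

```python
import numpy as np

dt = 0.01
Q = 0.001 * np.eye(2)  # toy process-noise covariance

# Standard EKF: the Jacobian depends on the state (here a toy
# heading-dependent coupling), so prediction needs the state estimate.
def F_ekf(theta):
    return np.array([[1.0, dt * np.cos(theta)],
                     [0.0, 1.0]])

P_ekf = np.eye(2)
for k in range(100):
    theta = 0.01 * k  # hypothetical state trajectory
    P_ekf = F_ekf(theta) @ P_ekf @ F_ekf(theta).T + Q * dt

# Invariant EKF: with the error defined on the group, the Jacobian is
# constant, so the same prediction loop needs no state at all.
F_iekf = np.array([[1.0, dt],
                   [0.0, 1.0]])

P_iekf = np.eye(2)
for _ in range(100):
    P_iekf = F_iekf @ P_iekf @ F_iekf.T + Q * dt
```

Because the invariant loop does not touch the state, its result depends only on elapsed time, which is what makes running it at a reduced rate (or offline) feasible on a constrained autopilot.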

Optical Flow Based Visual Odometry for UAVs

In the figures below, you can see that in a standard Extended Kalman Filter the covariance fluctuates rapidly as the vehicle executes a small quadrilateral-shaped mission. This is because the covariance is formulated in the body frame; in the Invariant Kalman Filter, the attitude covariance is expressed in the world frame to maximize invariance.
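The frame-dependence can be illustrated with a small numerical sketch (toy covariance values, assumed for illustration): an uncertainty that is constant in the world frame looks different in the body frame at every attitude, which is exactly the fluctuation seen in the body-frame EKF covariance.

```python
import numpy as np

def yaw_rotation(psi):
    """Rotation matrix for a yaw of psi radians about the z axis."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

# Toy attitude covariance that is constant in the world frame.
P_world = np.diag([0.04, 0.01, 0.09])

# Expressed in the body frame via P_body = R^T P_world R, the entries
# change as the vehicle yaws, even though the uncertainty itself did not.
for psi in (0.0, np.pi / 4, np.pi / 2):
    R = yaw_rotation(psi)
    P_body = R.T @ P_world @ R
    print(np.round(P_body, 3))
```

At a 90-degree yaw the x and y variances simply trade places in the body frame, so a plot of individual body-frame covariance entries oscillates over a quadrilateral mission even when the world-frame uncertainty is steady.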

In the figures below, you can observe that the evolution of the covariance for the IEKF is smooth and does not vary significantly as the vehicle flies the same mission. Observe too that position estimation performance improves: maximizing the invariance, and hence the autonomy of the error dynamics, means the time constants for changes in the error dynamics are slower, so the filter requires less frequent prediction steps.