Visual-Inertial SLAM in Closed-Loop Navigation

An autonomous, low-latency navigation stack was developed in cooperation with three PhD students and two master's students.

The perception module combines a low-latency VSLAM (good-feature matching + hashing-based map indexing) with the EKF-based multi-sensor fusion framework eth_msf. The state estimate from VSLAM is fed into a feedback PID controller.
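As a rough illustration of that feedback path (pose estimate in, velocity command out), a minimal ROS node sketch is given below. The topic names (/odom, /cmd_vel), gains, and goal are assumptions for illustration only, not the project's actual configuration.

```python
#!/usr/bin/env python
# Minimal sketch: drive toward a goal by feeding pose estimates (e.g. from the
# VSLAM / fused odometry) into a PID heading controller. Topic names, gains,
# and the constant forward speed are illustrative assumptions.
import math
import rospy
from nav_msgs.msg import Odometry
from geometry_msgs.msg import Twist
from tf.transformations import euler_from_quaternion


class GoalHeadingPID(object):
    def __init__(self, goal_x, goal_y, kp=1.0, ki=0.0, kd=0.1):
        self.goal = (goal_x, goal_y)
        self.kp, self.ki, self.kd = kp, ki, kd
        self.int_err = 0.0
        self.prev_err = 0.0
        self.prev_t = None
        self.cmd_pub = rospy.Publisher('/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/odom', Odometry, self.odom_cb, queue_size=1)

    def odom_cb(self, msg):
        # Current pose from the estimator.
        p = msg.pose.pose.position
        q = msg.pose.pose.orientation
        _, _, yaw = euler_from_quaternion([q.x, q.y, q.z, q.w])

        # Heading error toward the goal, wrapped to [-pi, pi].
        desired = math.atan2(self.goal[1] - p.y, self.goal[0] - p.x)
        err = math.atan2(math.sin(desired - yaw), math.cos(desired - yaw))

        # PID terms on the heading error.
        t = msg.header.stamp.to_sec()
        dt = (t - self.prev_t) if self.prev_t is not None else 0.0
        self.prev_t = t
        self.int_err += err * dt
        d_err = (err - self.prev_err) / dt if dt > 0 else 0.0
        self.prev_err = err

        cmd = Twist()
        cmd.linear.x = 0.4  # constant forward speed (illustrative)
        cmd.angular.z = self.kp * err + self.ki * self.int_err + self.kd * d_err
        self.cmd_pub.publish(cmd)


if __name__ == '__main__':
    rospy.init_node('goal_heading_pid')
    GoalHeadingPID(goal_x=5.0, goal_y=0.0)
    rospy.spin()
```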

Local planning is performed with Planning-in-Perception-Space (PiPS), a low-latency local planning framework from IVALab.

The applicability of the developed navigation system was demonstrated in simulation and in real-robot deployment, showing fast goal-oriented navigation with collision avoidance.

The Gazebo/ROS closed-loop benchmarking system is open-sourced: https://github.com/ivalab/meta_ClosedLoopBench
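For reference, the tracking-accuracy side of such a benchmark is commonly summarized by the absolute trajectory error (ATE) RMSE after rigid alignment of the estimated and ground-truth trajectories. The numpy sketch below illustrates that standard metric; it is illustrative only and not code from meta_ClosedLoopBench.

```python
# Sketch of ATE RMSE: rigidly align the estimated trajectory to ground truth
# (Horn/Umeyama closed-form rotation via SVD), then take the RMSE of the
# residual translations. Illustrative only.
import numpy as np

def ate_rmse(est_xyz, gt_xyz):
    """est_xyz, gt_xyz: (N, 3) arrays of time-associated positions."""
    mu_e, mu_g = est_xyz.mean(axis=0), gt_xyz.mean(axis=0)
    E, G = est_xyz - mu_e, gt_xyz - mu_g
    # Closed-form rotation from the SVD of the cross-covariance.
    U, _, Vt = np.linalg.svd(G.T @ E)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    aligned = (R @ est_xyz.T).T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt_xyz) ** 2, axis=1))))
```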

Figures generated from the full evaluation results are available at: https://github.com/ivalab/FullResults_ClosedNav

Related publications:

  • Closed-Loop Benchmarking of Stereo Visual-Inertial SLAM Systems: Understanding the Impact of Drift and Latency on Tracking Accuracy

Y. Zhao, J. Smith, S. Karumanchi, P. Vela, IEEE Int. Conf. on Robotics and Automation (ICRA), 2020. (paper)

  • Characterizing SLAM Benchmarks and Methods for the Robust Perception Age

W. Ye, Y. Zhao, P. Vela, SLAM Benchmarking Workshop, IEEE Int. Conf. on Robotics and Automation (ICRA), 2019. (paper, supplementary materials)