Quadruped robots have become a widespread platform for robotics research, thanks to powerful Reinforcement Learning-based controllers and the availability of affordable, robust commercial hardware.
However, to broaden real-world adoption of the technology, we need robust navigation stacks that rely only on low-cost sensors such as depth cameras.
This paper presents a first step towards a robust localization, mapping, and navigation system for low-cost quadruped robots. We combine contact-aided kinematic odometry, visual-inertial odometry, and depth-stabilized vision, enhancing the stability and accuracy of the system.
Our results in simulation and on two different real-world quadruped platforms show that our system can generate an accurate 2D map of the environment, robustly localize itself, and navigate autonomously.
Our system leverages data from low-cost sensors (an RGB-D camera, an IMU, and joint encoders), augmented by scan stabilization (SS) and leg odometry (LO) modules, to achieve accurate localization and mapping. These modules are essential for quadruped robots with reactive, high-speed controllers to navigate complex 2D environments effectively.
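To make the SS idea concrete, the sketch below gravity-aligns a depth-derived point cloud with the IMU's roll and pitch estimates before flattening it into a planar scan. This is a minimal illustration of the general technique, not our exact implementation; the function name, height band, and frame conventions are assumptions.

```python
import numpy as np

def stabilize_scan(points, roll, pitch, z_band=(0.05, 0.50)):
    """Tilt-compensate a depth point cloud and flatten it to a 2D scan.

    points : (N, 3) array of (x, y, z) points in the body frame
    roll, pitch : IMU attitude estimates in radians
    z_band : height band (m) kept when flattening to a planar scan

    Sign conventions depend on the frame definitions of the platform;
    this sketch assumes a standard x-forward, z-up body frame.
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    # Rotation that removes roll and pitch (gravity-aligns the cloud)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    leveled = points @ (Ry @ Rx).T
    # Keep points inside the height band and project onto the plane
    z = leveled[:, 2]
    mask = (z > z_band[0]) & (z < z_band[1])
    return leveled[mask, :2]
```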
The LO module requires contact sensors mounted on the legs of the robot. However, for platforms lacking these sensors, we provide a dedicated contact estimation module.
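One common way to estimate contact without dedicated sensors is to map measured joint torques through the leg Jacobian to an estimated ground reaction force and threshold it with hysteresis. The sketch below illustrates this idea; the class, thresholds, and frame conventions are illustrative assumptions, not our exact module.

```python
import numpy as np

class ContactEstimator:
    """Estimate foot contact from joint torques via the leg Jacobian.

    From tau = J(q)^T f, a least-squares force estimate is
    f = pinv(J^T) tau; contact is declared when the vertical component
    exceeds a threshold, with hysteresis to suppress chatter.
    Thresholds are illustrative and must be tuned per robot.
    """

    def __init__(self, on_thresh=25.0, off_thresh=10.0):
        self.on_thresh = on_thresh    # N, force to declare contact
        self.off_thresh = off_thresh  # N, force to declare lift-off
        self.in_contact = False

    def update(self, jacobian, tau):
        # jacobian: (3, n_joints) foot Jacobian; tau: (n_joints,) torques
        f = np.linalg.pinv(jacobian.T) @ tau
        fz = abs(f[2])
        if self.in_contact:
            self.in_contact = fz > self.off_thresh
        else:
            self.in_contact = fz > self.on_thresh
        return self.in_contact
```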
Map creation with the Silver Badger in the AWS warehouse can be visualized below, with (left) and without (right) scan stabilization.
We use kinematic leg odometry (i) to reset the visual odometry when it loses feature tracking and (ii) to add a velocity constraint to the SLAM factor graph, as seen in the figure below.
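Conceptually, the velocity constraint enters the graph as a relative-pose factor between consecutive pose nodes, with the measurement predicted by integrating the leg-odometry velocity over one step. The sketch below illustrates this with GTSAM's Python bindings; since our back-end is the ROS Slam Toolbox, this is a standalone illustration of the factor, not our integration code, and the noise sigmas are placeholders.

```python
import numpy as np
import gtsam

def add_leg_odom_velocity_factor(graph, k, vx, vy, wz, dt):
    """Constrain poses x_{k-1} and x_k by the displacement predicted
    from the leg-odometry body velocity (vx, vy, wz) over dt seconds."""
    delta = gtsam.Pose2(vx * dt, vy * dt, wz * dt)
    # Leg odometry is noisy, especially in yaw; sigmas are illustrative
    noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.05, 0.05, 0.02]))
    graph.add(gtsam.BetweenFactorPose2(
        gtsam.symbol('x', k - 1), gtsam.symbol('x', k), delta, noise))

graph = gtsam.NonlinearFactorGraph()
add_leg_odom_velocity_factor(graph, k=1, vx=0.4, vy=0.0, wz=0.1, dt=0.05)
```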
We evaluate our system's ability to localize itself in two different environments across 5 runs each. We use RTAB-Map [1] as the SLAM front-end and the ROS Slam Toolbox [2] as the SLAM back-end. The table below reports the evaluation metrics for each independent run.
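Whatever combination of metrics the table reports, a common choice for localization accuracy in such runs is the RMSE of the absolute trajectory error (ATE). A minimal sketch, assuming already time-aligned 2D trajectories:

```python
import numpy as np

def ate_rmse(estimated, ground_truth):
    """Root-mean-square absolute trajectory error between two
    time-aligned (N, 2) arrays of x, y positions."""
    err = np.linalg.norm(estimated - ground_truth, axis=1)
    return np.sqrt(np.mean(err ** 2))
```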
This figure depicts the trajectories of run 1 of the Silver Badger in the AWS Warehouse with and without our system's components.
Clearly, both scan stabilization and leg odometry are crucial for precise localization and accurate map reconstruction.
We test our system on three other robots in simulation: the Honey Badger (a Silver Badger without the spine joint), the Unitree A1, and the Unitree Go1. Below are the trajectories for one run of each robot in each of the AWS environments.
(Trajectory plots for the Honey Badger, Unitree A1, and Unitree Go1: one run per robot in each of the two AWS environments.)
Below are the maps created while running our system in both simulated environments.
We evaluate the robot's navigation capabilities in a pre-mapped environment by sequentially sending five goal poses. Our system reliably reaches every goal with minimal latency, whereas the baseline only reaches a subset of the targets, and with substantially longer completion times.
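For reference, this kind of sequential-goal test can be scripted through the standard Nav2 NavigateToPose action. The sketch below sends a list of placeholder goal poses one at a time and blocks until each finishes; the action name follows the Nav2 default, and the poses are illustrative, not the ones used in our runs.

```python
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from nav2_msgs.action import NavigateToPose
from geometry_msgs.msg import PoseStamped


def make_pose(node, x, y):
    # Goal pose in the map frame, facing along +x (identity orientation)
    pose = PoseStamped()
    pose.header.frame_id = 'map'
    pose.header.stamp = node.get_clock().now().to_msg()
    pose.pose.position.x = x
    pose.pose.position.y = y
    pose.pose.orientation.w = 1.0
    return pose


def main():
    rclpy.init()
    node = Node('goal_sequencer')
    client = ActionClient(node, NavigateToPose, 'navigate_to_pose')
    client.wait_for_server()
    # Placeholder goals; replace with the poses of the actual test
    for x, y in [(1.0, 0.0), (2.0, 1.5), (0.5, 2.0), (-1.0, 1.0), (0.0, 0.0)]:
        goal = NavigateToPose.Goal()
        goal.pose = make_pose(node, x, y)
        send = client.send_goal_async(goal)
        rclpy.spin_until_future_complete(node, send)
        handle = send.result()
        if not handle.accepted:
            continue
        result = handle.get_result_async()
        rclpy.spin_until_future_complete(node, result)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```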
(Navigation trajectories for the Unitree Go2 and the Silver Badger.)
This section illustrates the robot's ability to autonomously explore an unknown environment while maintaining accurate localization and generating a high-quality map that covers over 90% of the area. We use a frontier-based exploration strategy with a nearest-frontier method [3].
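A nearest-frontier method reduces to a breadth-first search over free space that stops at the first free cell bordering unknown space. The sketch below illustrates this on a ROS-style occupancy grid; the cell-value constants and the 4-connected neighborhood are common conventions we assume here, not necessarily those of our exploration node.

```python
import numpy as np
from collections import deque

FREE, OCC, UNKNOWN = 0, 100, -1  # ROS-style occupancy-grid values

def nearest_frontier(grid, start):
    """Breadth-first search over free space from `start` (row, col);
    return the first frontier cell reached, i.e. a free cell with at
    least one unknown 4-neighbor. Returns None if no frontier exists."""
    rows, cols = grid.shape
    visited = np.zeros_like(grid, dtype=bool)
    queue = deque([start])
    visited[start] = True
    while queue:
        r, c = queue.popleft()
        neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        # Frontier test: any in-bounds neighbor is unknown
        if any(0 <= nr < rows and 0 <= nc < cols and grid[nr, nc] == UNKNOWN
               for nr, nc in neighbors):
            return (r, c)  # nearest frontier in grid-step distance
        # Expand the search only through free cells
        for nr, nc in neighbors:
            if (0 <= nr < rows and 0 <= nc < cols and not visited[nr, nc]
                    and grid[nr, nc] == FREE):
                visited[nr, nc] = True
                queue.append((nr, nc))
    return None
```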
Below is a depiction of all four trajectories, each with different system components, for one of the runs of the Silver Badger in the IAS and IRIM labs. The combination of leg odometry and scan stabilization is clearly crucial to achieving accurate localization.
Our system accurately localizes itself in the real world while generating high-quality 2D maps, even in cluttered environments with noisy sensors.
(Baseline map of the cluttered environment, for comparison.)
In clutter-free environments, our system generates a clean and highly precise 2D map while localizing itself accurately.
(Baseline map of the clutter-free environment, for comparison.)
[1] M. Labbé and F. Michaud, “RTAB-Map as an open-source lidar and visual simultaneous localization and mapping library for large-scale and long-term online operation,” Journal of Field Robotics, vol. 36, no. 2, pp. 416–446, 2019.
[2] S. Macenski and I. Jambrecic, “SLAM Toolbox: SLAM for the dynamic world,” Journal of Open Source Software, vol. 6, no. 61, p. 2783, 2021.
[3] B. Yamauchi, “A frontier-based approach for autonomous exploration,” in Proc. IEEE Int. Symp. on Computational Intelligence in Robotics and Automation (CIRA), 1997, pp. 146–151.