Mobile robot navigation is typically regarded as a geometric problem, in which the robot’s objective is to perceive the geometry of the environment in order to plan collision-free paths towards a desired goal.
However, a purely geometric view of the world can be insufficient for many navigation problems.
For example, a robot navigating based on geometry alone may refuse to drive through a field of tall grass because its sensors register the grass as an untraversable obstacle, even though the robot could easily push through it.
We investigate how to move beyond purely geometric approaches using a method that learns about physical navigational affordances from experience.
BADGR, the Berkeley Autonomous Driving Ground Robot, is an end-to-end learning-based mobile robot navigation system that can be trained with self-supervised, off-policy data gathered in real-world environments, without any simulation or human supervision.
BADGR works by:

1. Autonomously collecting data
2. Autonomously labeling the data using self-supervision
3. Training an image-based neural network predictive model
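The three stages above can be sketched as a minimal, self-contained pipeline. Everything below is an illustrative assumption, not the authors' code: camera images are replaced by random feature vectors, the self-supervised label comes from a simulated bumper sensor, and the "image-based neural network predictive model" is stood in for by a simple logistic regressor trained with gradient descent.

```python
import numpy as np

def collect_data(num_steps, rng):
    """Stage 1 (hypothetical): roam autonomously, recording
    (image features, action, raw bumper reading) at each step."""
    images = rng.random((num_steps, 8))            # stand-in for camera features
    actions = rng.uniform(-1, 1, (num_steps, 2))   # e.g. steering and speed
    bumper = rng.random(num_steps) < 0.1           # raw collision sensor
    return images, actions, bumper

def self_label(bumper):
    """Stage 2: self-supervised labeling -- derive a collision label
    directly from the robot's own sensor, with no human annotation."""
    return bumper.astype(np.float64)

def train_predictive_model(images, actions, labels, lr=0.1, epochs=200):
    """Stage 3: fit a model that predicts the labeled event from the
    image features and the action taken (a toy stand-in for the
    image-based neural network predictive model)."""
    x = np.hstack([images, actions])
    w, b = np.zeros(x.shape[1]), 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(x @ w + b)))     # predicted event probability
        grad = p - labels                          # logistic-loss gradient
        w -= lr * x.T @ grad / len(labels)
        b -= lr * grad.mean()
    return w, b

rng = np.random.default_rng(0)
images, actions, bumper = collect_data(500, rng)
labels = self_label(bumper)
w, b = train_predictive_model(images, actions, labels)
probs = 1.0 / (1.0 + np.exp(-(np.hstack([images, actions]) @ w + b)))
```

At planning time, a model like this scores candidate action sequences by their predicted event probabilities, so the robot can prefer actions unlikely to cause collisions.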
Our experiments show that BADGR can:

1. navigate in urban environments
2. navigate in off-road environments
3. improve as it gathers more data
4. generalize to never-before-seen environments