Terrain-Aware System Identification for Autonomous Navigation of Wheeled Robots
Path planning is a well-known problem in robotics. Autonomous vehicles must continually navigate novel scenarios: while roads remain fairly consistent, the obstacles that appear on them change constantly. Other planning problems take place in entirely new environments. In search-and-rescue scenarios, some regions may prove more navigable than others, and a robot must determine the best course of action to avoid obstacles, traverse rough terrain, and reach its destination.
Navigating the real world involves considerable uncertainty. In mission-critical scenarios, minimizing the uncertainty of a chosen path is crucial to safety. A single poor navigational decision can cripple a robot, so it must avoid areas that appear more dangerous. No system begins with knowledge of safety and drivability, but making the identification and learning process autonomous can improve performance once the system is deployed.
Our goal is to build a system that performs this kind of terrain-aware navigation. Using a Turtlebot and its camera, we construct a map of the environment as the robot navigates. Simultaneously, a feature vector is updated with information about each imaged region to inform the control algorithm about the terrains present. As the robot physically drives over different areas, data about aleatoric (environmental) and epistemic (model-based) uncertainty are recorded to inform the path planner. A voxel grid is continually updated with edge weights corresponding to these uncertainties, and the planner finds optimal paths through it. The robot can therefore traverse regions with differing terrains while choosing the most navigable route.
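The planning step above can be sketched in a simplified form. The sketch below is illustrative, not the actual TASI implementation: it uses a 2D grid rather than a voxel grid, and it assumes the two uncertainties have already been combined into a single per-cell traversal cost; the shortest-path search then naturally prefers low-uncertainty routes.

```python
import heapq

def plan_path(cost, start, goal):
    """Dijkstra search on a 4-connected grid.

    cost[r][c] is the (hypothetical) combined aleatoric + epistemic
    uncertainty of entering cell (r, c). Assumes goal is reachable.
    Returns the minimum-cost path from start to goal as a list of cells.
    """
    rows, cols = len(cost), len(cost[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk predecessor links back from the goal to recover the path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

With a high-uncertainty cell in the middle of a 3x3 grid, the planner routes around it, which is the behavior the uncertainty-weighted edges are meant to produce.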
TASI (Terrain-Aware System Identification) helps robots autonomously learn to differentiate drivable from undrivable regions and to find an optimal path using only odometry and visual inputs. This project implements TASI using feedback from Turtlebots to determine the drivability characteristics of different regions. The dynamics of the environment are estimated with kernel regression while accounting for model-based (epistemic) and environmental (aleatoric) uncertainty. A map is built simultaneously by stitching together camera images transformed with a homography matrix and corrected using the Turtlebot's odometry. A feature map built from visual characteristics of these images informs the control algorithm about the kinds of terrain present. The path planner navigates between points using Dijkstra's algorithm, whose edge weights steer it away from less drivable regions. Results show that the robot consistently identifies and avoids problematic terrains in simulation, and preliminary real-world tests show similar terrain-identification results, though the non-idealities of physical systems complicate the problem significantly.
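The kernel-regression step with its two uncertainty estimates can be sketched as follows. This is a minimal Nadaraya-Watson stand-in, not the TASI estimator itself: the bandwidth, the kernel choice, and the particular uncertainty proxies (weighted residual variance for aleatoric noise, low local data density for epistemic uncertainty) are all illustrative assumptions.

```python
import numpy as np

def nw_regress(x_query, X, Y, bandwidth=0.5):
    """Nadaraya-Watson kernel regression with simple uncertainty proxies.

    X: (n, d) array of observed states; Y: (n,) array of responses.
    Returns (mean, aleatoric, epistemic) at x_query:
      - mean: Gaussian-kernel-weighted average of Y
      - aleatoric: kernel-weighted residual variance (noise in the data)
      - epistemic: shrinks toward 0 as local kernel mass grows,
        approaches 1 far from any observations
    """
    d = np.linalg.norm(X - x_query, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    mass = w.sum()
    mean = (w * Y).sum() / mass
    aleatoric = (w * (Y - mean) ** 2).sum() / mass
    epistemic = 1.0 / (1.0 + mass)
    return mean, aleatoric, epistemic
```

Queried near the training data, the epistemic term is small; queried far away, it approaches its maximum, which is the signal the planner would use to treat unexplored terrain cautiously.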