2019 - AutoNavi3AT, Software Tool to Follow an Urban Road Using a Mobile Robot and Omnidirectional Vision
AutoNavi3AT is a software tool designed and implemented to allow a mobile robot to navigate autonomously along urban roads using omnidirectional vision. AutoNavi3AT uses the following hardware configuration: an on-board mini-computer, wireless communication, a catadioptric omnidirectional camera, a laser range finder, and a Pioneer 3AT mobile robot.
AutoNavi3AT allows users to manage image processing, vanishing point prediction and estimation, obstacle avoidance, user event capture, and automatic robot heading calculations.
The results show that omnidirectional vision has fundamental advantages over other types of computer vision: it does not require additional hardware to move the camera around, and it provides the robot with a greater amount of useful data about the environment.
Currently, AutoNavi3AT is registered software legally recognized by the Interior Ministry of Colombia. This document can be downloaded here.
AutoNavi3AT was developed in two parts: the first runs on the mobile robot and uses ROS (Robot Operating System); the second runs on an Android mobile device and presents the GUI to users.
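Since the ROS base utilities include websocket support, the Android GUI presumably talks to the robot side through the rosbridge JSON protocol. As a minimal sketch, a "publish" frame for that protocol can be built as follows (the topic name and message payload here are assumptions for illustration, not the actual AutoNavi3AT topics):

```python
import json

def rosbridge_publish(topic, msg):
    """Build a rosbridge-protocol 'publish' frame to send over the websocket."""
    return json.dumps({"op": "publish", "topic": topic, "msg": msg})

# Hypothetical example: the GUI pushing a new translation speed setting.
frame = rosbridge_publish("/autonavi3at/settings/speed", {"data": 0.4})
print(frame)
```

On the robot side, the rosbridge server decodes this frame and republishes the message on the named ROS topic, so the Android app never needs a native ROS client.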
The AutoNavi3AT ROS software node structure is shown in the figure above. In this diagram, the following modules can be observed:
1. Image capture (light yellow).
2. User event capture (grey).
3. ROS base utilities: websocket support and ROSAria (pink).
4. Vanishing point prediction (blue).
5. Vanishing point estimation (purple). Both the vanishing point prediction and estimation are based on a Particle Filter framework developed specifically for this application.
6. Obstacle avoidance (green).
7. Robot heading (yellow).
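To give an idea of what the robot heading module (7) computes, the sketch below maps the horizontal offset of the estimated vanishing point to a yaw-rate command, steering the robot so the road's vanishing point returns to the image centre. This is a generic proportional controller under assumed gain and saturation values, not the actual AutoNavi3AT control law:

```python
def heading_command(vp_x, image_width, gain=0.5, max_yaw=0.6):
    """Map the estimated vanishing point's horizontal pixel offset to an
    angular velocity command. Gain and saturation are illustrative only."""
    # Normalised horizontal error in [-1, 1]; positive => VP right of centre.
    error = (vp_x - image_width / 2.0) / (image_width / 2.0)
    yaw = -gain * error                       # turn toward the vanishing point
    return max(-max_yaw, min(max_yaw, yaw))   # saturate the yaw rate (rad/s)

print(heading_command(vp_x=400, image_width=640))  # VP right of centre: -0.125
```

In a ROS node, a command like this would typically be published as the angular component of a velocity message consumed by ROSAria.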
It is worth noting that AutoNavi3AT was developed considering the following hardware: a Pioneer 3AT mobile robot, an on-board computer running Ubuntu 16.04 and ROS Indigo, a Hokuyo UTM-40LX laser range finder, and a catadioptric omnidirectional camera.
Once the AutoNavi3AT robot software is running, the user can interact with the mobile robot through an Android-based mobile device.
The AutoNavi3AT mobile app is structured in the following activities:
1. The main activity, where icons are presented to the user to navigate to the robot settings, the AutoNavi3AT parameters, and the mobile robot control.
2. The robot settings activity allows users to configure the following:
a. The mobile robot translation speed.
b. The maximum range of the laser sensor.
c. The minimum range of the laser sensor, below which a collision is considered imminent.
d. The omnidirectional camera settings.
e. A set of buttons to go back, save the settings on the robot, restore the default settings, and a Next button to configure the AutoNavi3AT navigation parameters.
3. The AutoNavi3AT navigation parameters activity allows users to set up the vanishing point properties as follows:
a. The number of particles in the Rao-Blackwellized Particle Filter.
b. The Gabor filter parameters.
4. The mobile robot motion control activity, which allows users to drive the mobile robot manually when facing a U-turn or an intersection. This activity has the following functionalities:
a. Continue the forward motion.
b. Turn the mobile robot left.
c. Turn the mobile robot right.
d. Perform a U-turn.
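The particle filter that the navigation parameters configure follows the usual predict/estimate cycle. As a minimal sketch of those two steps for vanishing point tracking (the actual system uses a Rao-Blackwellized Particle Filter with Gabor-based measurements; the random-walk motion model and weighted-mean estimate below are simplifying assumptions):

```python
import random

def predict(particles, sigma=5.0):
    """Prediction step: diffuse each (x, y) vanishing point hypothesis
    with Gaussian noise (a simple random-walk motion model)."""
    return [(x + random.gauss(0.0, sigma), y + random.gauss(0.0, sigma))
            for (x, y) in particles]

def estimate(particles, weights):
    """Estimation step: the weighted mean of the particle set gives the
    estimated vanishing point in pixel coordinates."""
    total = sum(weights)
    x = sum(w * px for (px, _), w in zip(particles, weights)) / total
    y = sum(w * py for (_, py), w in zip(particles, weights)) / total
    return (x, y)
```

Between these two steps, each particle would be weighted by how well the Gabor-filtered road texture votes for its hypothesized vanishing point, and the set resampled accordingly.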
AutoNavi3AT was tested both in real time and using a dataset captured on the internal roads of the Universidad del Valle. Using the captured datasets, the improvement of the vanishing point estimation over the vanishing point prediction can be measured.
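One plausible way to quantify that improvement is to compare the root-mean-square pixel error of the predicted and the estimated vanishing point tracks against ground-truth positions. The metric below is a generic sketch, not necessarily the evaluation used in the original experiments:

```python
import math

def rmse(points, truth):
    """Root-mean-square pixel error of a vanishing point track (predicted
    or estimated) against manually labeled ground-truth positions."""
    sq = sum((px - tx) ** 2 + (py - ty) ** 2
             for (px, py), (tx, ty) in zip(points, truth))
    return math.sqrt(sq / len(points))
```

A lower RMSE for the estimated track than for the predicted one would indicate that the measurement update of the particle filter is correcting the prediction.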
One of the captured routes is shown in the figure at right. It shows:
1. The Google Maps view, with the path traveled by the robot in red.
2. The green point is the starting point.
3. Below this map, one of the panoramic images captured by the robot can be observed, along with the predicted and estimated vanishing points.
4. Below the panoramic image, the X and Y pixel coordinates of the predicted and estimated vanishing points are shown. The predicted vanishing point coordinates are painted in magenta, and the estimated vanishing point coordinates in blue.
The following video shows how AutoNavi3AT works:
This software was developed for industrial service and academic use. If you would like to perform a field test, please contact me:
Prof. Bladimir Bacca Cortes Ph.D.
Address: Cra. 100, Street 13, Universidad del Valle, Melendez, Building 354, Office 2006.
Tel: +5723212100 Ext. 7656