Outdoor mobile robot navigation in the real world

Tsukuba Challenge

Tsukuba Challenge (TC) is an annual autonomous mobile robot navigation competition, also called the Real World Robot Challenge because it is held in a real city environment. In TC, several missions are given, e.g., autonomous navigation over 1 km, taking an elevator, and finding target persons. I participated in TC from 2011 to 2014 as a main developer and completed all of the missions. Through TC, I developed many modules, in particular the localization and target person detection modules.

Localization using magnetic sensor and LiDAR

I developed a localization method that uses a magnetic sensor and LiDAR. The method uses two maps: a geometric map and a magnetic map. The geometric map represents the environmental shape and is used to estimate the robot pose by matching LiDAR measurements against it. The LiDAR-based localization enables the robot to estimate its pose accurately; however, there are areas without effective geometric features for the matching, e.g., crowded environments and open spaces. The magnetic map stores magnetic azimuth angles and is used to compensate for orientation error. Estimating an accurate position from the magnetic map alone is difficult; however, it enables the robot to correct its orientation in areas where the LiDAR-based localization does not work. Owing to the use of both the geometric and magnetic maps, the robot can navigate various areas. The localization method contributed significantly to completing the TC missions.
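For concreteness, the two maps might be represented as follows. This is a minimal sketch under my own assumptions, not the published implementation: the grid layout, the likelihood-field scoring, and the names GeometricMap.scan_likelihood and MagneticMap.azimuth_at are all hypothetical placeholders.

```python
import numpy as np

class GeometricMap:
    """Hypothetical map of the environment shape, used to score how well
    a LiDAR scan fits a hypothesized robot pose (likelihood-field model)."""

    def __init__(self, distance_field, resolution, origin=(0.0, 0.0)):
        self.dist = distance_field   # precomputed distance-to-nearest-obstacle grid
        self.res = resolution        # meters per cell
        self.origin = origin

    def scan_likelihood(self, scan_xy, pose, sigma=0.2):
        """Project scan endpoints (robot frame) into the map at the given
        pose and score them by distance to the nearest obstacle."""
        x, y, th = pose
        c, s = np.cos(th), np.sin(th)
        gx = x + c * scan_xy[:, 0] - s * scan_xy[:, 1]
        gy = y + s * scan_xy[:, 0] + c * scan_xy[:, 1]
        ix = ((gx - self.origin[0]) / self.res).astype(int).clip(0, self.dist.shape[0] - 1)
        iy = ((gy - self.origin[1]) / self.res).astype(int).clip(0, self.dist.shape[1] - 1)
        d = self.dist[ix, iy]
        return float(np.exp(-0.5 * (d / sigma) ** 2).mean())

class MagneticMap:
    """Hypothetical grid of magnetic azimuth angles recorded at mapping time."""

    def __init__(self, azimuth_grid, resolution, origin=(0.0, 0.0)):
        self.azimuth = azimuth_grid
        self.res = resolution
        self.origin = origin

    def azimuth_at(self, x, y):
        """Look up the stored magnetic azimuth at a position."""
        ix = min(max(int((x - self.origin[0]) / self.res), 0), self.azimuth.shape[0] - 1)
        iy = min(max(int((y - self.origin[1]) / self.res), 0), self.azimuth.shape[1] - 1)
        return self.azimuth[ix, iy]
```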

The figure below illustrates the conceptual flow of the localization method, which is implemented with a particle filter. First, the particles are updated based on the robot's motion model. Then, the heading directions of the particles are corrected based on the magnetic map if matching against the magnetic map succeeds. Finally, the likelihood of each particle is calculated from the LiDAR measurements and the robot pose is estimated.
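A minimal sketch of one filter cycle, using the hypothetical map classes above, might look as follows. The motion model, the heading-correction rule (a gain-weighted nudge toward the map-consistent azimuth), and all noise parameters are illustrative assumptions; the actual formulation is given in [1].

```python
import numpy as np

def motion_update(particles, v, w, dt, noise=(0.05, 0.05, 0.02)):
    """Propagate each (x, y, theta) particle with a noisy velocity motion model."""
    n = np.random.randn(len(particles), 3) * noise
    particles[:, 0] += (v + n[:, 0]) * dt * np.cos(particles[:, 2])
    particles[:, 1] += (v + n[:, 1]) * dt * np.sin(particles[:, 2])
    particles[:, 2] += (w + n[:, 2]) * dt
    return particles

def magnetic_heading_correction(particles, measured_azimuth, mag_map, gain=0.5):
    """Nudge each particle's heading by the discrepancy between the azimuth
    measured by the magnetic sensor and the azimuth stored in the map at the
    particle's position (the map-matching test itself is omitted here)."""
    for p in particles:
        err = measured_azimuth - mag_map.azimuth_at(p[0], p[1])
        p[2] += gain * np.arctan2(np.sin(err), np.cos(err))
    return particles

def measurement_update(weights, particles, scan_xy, geo_map):
    """Reweight particles by the LiDAR scan likelihood at each particle pose."""
    for i, p in enumerate(particles):
        weights[i] *= geo_map.scan_likelihood(scan_xy, p)
    return weights / weights.sum()

def estimate_pose(particles, weights):
    """Weighted mean pose; the angle is averaged via unit vectors."""
    x, y = weights @ particles[:, :2]
    th = np.arctan2(weights @ np.sin(particles[:, 2]),
                    weights @ np.cos(particles[:, 2]))
    return np.array([x, y, th])

def resample(particles, weights):
    """Systematic resampling to counter particle degeneracy."""
    n = len(weights)
    idx = np.searchsorted(np.cumsum(weights),
                          (np.arange(n) + np.random.rand()) / n)
    return particles[idx].copy(), np.full(n, 1.0 / n)
```

In a real loop, motion_update would run at odometry rate, the magnetic correction would be applied only when matching against the magnetic map succeeds, and resampling would be triggered only when the effective sample size drops.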

Target person detection based on color layout

The figures on the right show the target person (URL). Because the target person wears distinctively colored clothes, I developed a camera-based target person detection method. The method is quite simple. First, a region of interest (ROI) on the image is estimated using the LiDAR (the robot was equipped with a 2D LiDAR). Then, the regions of the cap, vest, and blouson are estimated, and the orange, blue, and green colors are extracted from those regions using XY color coordinates. Finally, the color layout is evaluated to determine whether the ROI contains the target person.
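The color-layout test could be sketched as follows. The vertical band proportions, the color windows, and the thresholds are all illustrative; since the exact definition of the XY color coordinates is not given here, normalized rgb chromaticity is assumed in their place.

```python
import numpy as np

# Hypothetical expected color per vertical band of the ROI, as
# (x_min, x_max, y_min, y_max) windows in assumed normalized chromaticity
# coordinates x = R/(R+G+B), y = G/(R+G+B).
BANDS = {
    "cap":     ((0.00, 0.25), (0.45, 0.70, 0.20, 0.40)),  # orange: high r
    "vest":    ((0.25, 0.60), (0.05, 0.30, 0.15, 0.40)),  # blue:   low r
    "blouson": ((0.60, 1.00), (0.15, 0.40, 0.40, 0.65)),  # green:  high g
}
MIN_FRACTION = 0.3   # required fraction of matching pixels per band (assumed)

def chromaticity(rgb):
    """Convert an HxWx3 RGB image to normalized (x, y) chromaticity planes."""
    s = rgb.sum(axis=2, keepdims=True).clip(min=1)
    norm = rgb / s
    return norm[..., 0], norm[..., 1]

def contains_target(roi_rgb):
    """Return True if every band of the ROI shows its expected color."""
    h = roi_rgb.shape[0]
    x, y = chromaticity(roi_rgb.astype(float))
    for (top, bottom), (x0, x1, y0, y1) in BANDS.values():
        band = slice(int(top * h), int(bottom * h))
        mask = ((x[band] >= x0) & (x[band] <= x1) &
                (y[band] >= y0) & (y[band] <= y1))
        if mask.mean() < MIN_FRACTION:
            return False
    return True
```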

Publications

My main research contribution through TC is the development of the localization method using a magnetic sensor and LiDAR. I presented the method in [1] and won the Best Student Paper Award. Additionally, I wrote a journal paper [2] that demonstrates the effectiveness of the localization method through TC. The person detection method is presented in [3], which is joint work with Kenji Yamauchi. We also participated in TC 2014 with a newly developed robot named SARA and completed the mission. That year's work is summarized in [4].

[1] Naoki Akai, Satoshi Hoshino, Kazumichi Inoue, and Koichi Ozaki. "Monte Carlo localization using magnetic sensor and LIDAR for real world navigation," In Proceedings of the IEEE/SICE International Symposium on System Integration (SII), pp. 682-687, 2013.

[2] Naoki Akai, Kazumichi Inoue, and Koichi Ozaki. "Autonomous navigation based on magnetic and geometric landmarks on environmental structure in real world," Journal of Robotics and Mechatronics (JRM), vol. 26, no. 2, pp. 158-165, 2014.

[3] Kenji Yamauchi, Naoki Akai, Ryutaro Unai, Kazumichi Inoue, and Koichi Ozaki. "Person detection method based on color layout in real world robot challenge 2013," Journal of Robotics and Mechatronics (JRM), vol. 26, no. 2, pp. 151-157, 2014.

[4] Naoki Akai, Kenji Yamauchi, Kazumichi Inoue, Yasunari Kakigi, Yuki Abe, and Koichi Ozaki. "Development of mobile robot 'SARA' that completed mission in real world robot challenge 2014," Journal of Robotics and Mechatronics (JRM), vol. 27, no. 4, pp. 327-336, 2015.