The long-term SLAM (Simultaneous Localization And Mapping) dataset was conceived to test SLAM algorithms under realistic conditions: changes in illumination due to the time of day and the season of the year, occlusions due to the presence of pedestrians, and structural changes in the environment. The dataset was collected in the PIV building of the Universitat de Girona (Spain). The building has three floors: ground, first and second. Each floor was covered at different times of day (morning, afternoon and night) and in different seasons of the year (autumn, winter, spring and summer). In addition, a long acquisition run was performed combining the three floors: the mobile robot starts at the ground floor, takes an elevator to the first and then the second floor, and finally returns to its start position on the ground floor.
All runs were performed with the same mobile robot platform and set of sensors, as depicted in the figure on the left. The mobile robot used was a Pioneer 3DX equipped with an on-board computer at 1.5 GHz running Linux Ubuntu 12.04; an omnidirectional vision setup composed of a RemoteReality parabolic mirror 74 mm in diameter and a UI-2230SE-C camera with a resolution of 1024x768 pixels; and a Hokuyo URG-04LX laser range finder. In addition, the mobile robot odometry and the sonar data were also captured.
The figure also shows the environmental conditions under which the dataset was collected: each row corresponds to different places on each floor at different times of day. These omnidirectional images exhibit illumination changes and occlusions caused by pedestrians, ensuring real-world experimental conditions. For this reason, the dataset can be used as a test bed for SLAM algorithms that deal with long-term or life-long operation.
A description and the downloadable files of the dataset for each floor are available at the following links:
Since the sensor setup is composed of an omnidirectional camera and a laser range finder, users may require the extrinsic calibration between these sensors and the omnidirectional camera calibration files, which can be requested by email.
If you use the Long-Term SLAM dataset in your scientific work, please cite this paper:
@article{Bacca20131539,
  title = "Long-term mapping and localization using feature stability histograms",
  journal = "Robotics and Autonomous Systems",
  volume = "61",
  number = "12",
  pages = "1539 - 1558",
  year = "2013",
  issn = "0921-8890",
  doi = "http://dx.doi.org/10.1016/j.robot.2013.07.003",
  url = "http://www.sciencedirect.com/science/article/pii/S092188901300122X",
  author = "B. Bacca and J. Salvi and X. Cufí",
  keywords = "Long-term localization and mapping, Laser range finder, Particle filters, Omnidirectional vision"}
Technical details of this dataset are explained below. The mobile robot used was the Pioneer 3DX with the following specifications:
Computer – On-board PC at 1.5 GHz running Linux Ubuntu 12.04.
On-board sensors – 8 ultrasound sensors (4 rear and 4 front) and motors with 500-tick encoders.
Mechanics – Two drive wheels in a differential-drive arrangement and one rear caster wheel.
Communications – Wireless Ethernet 802.11.
Power supply – 3 batteries of 12 V providing an autonomy of approximately 3 hours.
Dimensions – Width 38 cm, height 21.5 cm, depth 45 cm and weight 9 kg.
Other technical data – Payload 17 kg, maximum velocity 5.76 km/h.
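Because the platform uses a differential-drive arrangement, its pose can be dead-reckoned from the recorded linear and angular velocities. A minimal sketch using Euler integration (the velocities and timestep below are illustrative values, not taken from the dataset):

```python
import math

def integrate_pose(x, y, theta, v, w, dt):
    """One Euler step of differential-drive dead reckoning.

    v: linear velocity [m/s], w: angular velocity [rad/s], dt: timestep [s].
    """
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta

# Drive straight along +x for 1 s at 0.5 m/s.
x, y, theta = integrate_pose(0.0, 0.0, 0.0, 0.5, 0.0, 1.0)
print(round(x, 3))  # 0.5
```

In practice dt would be the difference between consecutive odometry timestamps (which the dataset stores in milliseconds).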
The mobile robot also has an omnidirectional camera with the following properties:
Optics – RemoteReality.
Mirror diameter – 74 mm.
Camera – UI-2230SE-C with a resolution of 1024x768 pixels. Data interface USB.
FOV – Vertical 90º (15º up and 75º down), horizontal 360º.
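A common preprocessing step for this kind of catadioptric setup is unwrapping the omnidirectional image into a cylindrical panorama. The sketch below maps a panorama pixel back to omnidirectional image coordinates; the image center and mirror radii used here are placeholder values, not calibration data from the dataset (which must be requested by email):

```python
import math

def panorama_to_omni(col, row_p, n_cols, n_rows, cx, cy, r_min, r_max):
    """Map a panorama pixel (col, row_p) back to omnidirectional image
    coordinates, assuming the mirror projects to a circular region
    centered at (cx, cy) spanning radii [r_min, r_max]."""
    angle = 2.0 * math.pi * col / n_cols               # azimuth around the mirror
    radius = r_max - (r_max - r_min) * row_p / n_rows  # top row -> outer rim
    u = cx + radius * math.cos(angle)
    v = cy + radius * math.sin(angle)
    return u, v

# Column 0 of the panorama's top row maps to the outer rim at azimuth 0
# (center and radii below are assumed placeholders for a 1024x768 image).
u, v = panorama_to_omni(0, 0, 1024, 256, 512.0, 384.0, 60.0, 350.0)
print(u, v)  # 862.0 384.0
```

A full unwrapping would evaluate this mapping for every panorama pixel and sample the source image, typically with bilinear interpolation.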
The Laser Range Finder (LRF) placed in front of the mobile robot has the following specifications:
Range – 0.02 m to 4 m.
Operation – Indoors.
Scan angle – 240º.
Scan time – 100ms.
Resolution – 1 mm (0.02 m to 1 m) and 1% between 1 m and 4 m.
Interface – Data RS232, power supply using USB ports.
Filenames and data format – The data sequence is provided as MAT files (dataLaser.mat, dataOdometry.mat, dataOmni.mat, dataSonar.mat), while the original data sequence is stored in the ‘dataTXT.zip’ file. In addition, the ‘imgData’ directory stores the omnidirectional image sequence. The internal format of each data file is described below:
Odometry data format – Odometry data is stored in an Nx6 matrix, where N is the number of samples in the sequence. Each row is formatted as follows:
timestamp Xr Yr ThetaR V W
where timestamp is the number of milliseconds elapsed since the acquisition started; Xr, Yr and ThetaR are the current robot pose variables; and V and W are the robot linear and angular velocities.
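A minimal sketch of unpacking one odometry row into named fields. The real matrix would come from dataOdometry.mat (e.g. via scipy.io.loadmat; the MAT variable name inside the file is not documented here), so a synthetic row with illustrative values is used instead:

```python
import numpy as np

def unpack_odometry(row):
    """Split one Nx6 odometry row into named fields.

    Columns (per the dataset description): timestamp [ms],
    Xr, Yr, ThetaR (robot pose), V (linear velocity), W (angular velocity).
    """
    timestamp, xr, yr, theta, v, w = row
    return {"t_ms": timestamp, "x": xr, "y": yr,
            "theta": theta, "v": v, "w": w}

# Synthetic example row (values are illustrative, not from the dataset).
row = np.array([1250.0, 0.52, 0.03, 0.10, 0.30, 0.05])
fields = unpack_odometry(row)
print(fields["x"])  # 0.52
```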
Laser scan data format – Laser scans are stored in an Nx1367 matrix, where N is the number of samples in the sequence. Each row is formatted as follows:
timestamp scan-size range-value theta-value …
where scan-size is the number of range/theta pairs in the current row, and range-value is the range measured at the orientation given by theta-value. Range values are in meters and theta values in degrees.
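The row layout above can be parsed and converted to Cartesian points in the sensor frame as sketched below (the two-beam row is synthetic, shortened from the real 1367-column rows for illustration):

```python
import numpy as np

def parse_laser_row(row):
    """Parse one laser row: [timestamp, scan_size, range_0, theta_0,
    range_1, theta_1, ...] with ranges in meters and angles in degrees.
    Returns the timestamp and an (scan_size, 2) array of x/y points."""
    timestamp = row[0]
    scan_size = int(row[1])
    pairs = np.asarray(row[2:2 + 2 * scan_size]).reshape(scan_size, 2)
    ranges, thetas_deg = pairs[:, 0], pairs[:, 1]
    thetas = np.deg2rad(thetas_deg)
    points = np.column_stack([ranges * np.cos(thetas),
                              ranges * np.sin(thetas)])
    return timestamp, points

# Synthetic two-beam row illustrating the layout.
row = np.array([500.0, 2.0, 1.0, 0.0, 2.0, 90.0])
t, points = parse_laser_row(row)
print(points.round(3))  # two points: (1, 0) and (~0, 2)
```

Real rows are padded to 1367 columns; reading only the first 2·scan-size values after the header, as above, skips the padding.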
Sonar data format – Sonar data is stored in an Nx33 matrix. Each row is formatted as follows:
timestamp range-value theta-value …
where range-value is the range measured at the orientation given by theta-value. There are sixteen range/theta pairs corresponding to the sixteen sonar sensors installed on the Pioneer 3DX.
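The 33-column sonar rows (one timestamp plus sixteen range/theta pairs) can be split as sketched below; the example row uses synthetic readings and an assumed even angular spread, not actual sonar mounting angles:

```python
import numpy as np

def parse_sonar_row(row):
    """Parse one Nx33 sonar row: [timestamp, range_0, theta_0, ...,
    range_15, theta_15] for the sixteen Pioneer 3DX sonars."""
    timestamp = row[0]
    pairs = np.asarray(row[1:]).reshape(16, 2)
    return timestamp, pairs[:, 0], pairs[:, 1]

# Synthetic row: all sixteen sonars read 1.5 m; the angles here are an
# illustrative even spread, not the real Pioneer 3DX mounting angles.
row = np.concatenate(([700.0], np.column_stack(
    [np.full(16, 1.5), np.linspace(-180.0, 157.5, 16)]).ravel()))
t, ranges, thetas = parse_sonar_row(row)
print(len(ranges))  # 16
```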
Omnidirectional image data format – Omnidirectional image names are stored in an Nx2 matrix. Each row is formatted as follows:
timestamp image-filename
where image-filename follows the template imgData/img_XXX.png, with XXX being the consecutive image number. All omnidirectional images are stored in the relative sub-directory ‘imgData’.
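A small helper for rebuilding such paths from an image index. Note that the zero-padding width is an assumption (the template only says XXX), so it should be checked against the actual filenames in ‘imgData’:

```python
def image_path(index):
    """Build a relative image path following the imgData/img_XXX.png
    template. Three-digit zero-padding is an assumption; verify it
    against the real files before use."""
    return f"imgData/img_{index:03d}.png"

print(image_path(7))  # imgData/img_007.png
```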
If you have questions, please send an email to Bladimir Bacca Cortes. Finally, this work has been partially supported by the project RAIMON—Autonomous Underwater Robot for Marine Fish Farms Inspection and Monitoring (Ref. CTM2011-29691-C02-02) funded by the Spanish Ministry of Science and Innovation, the LASPAU-COLCIENCIAS grant 136-2008, the University of Valle contract 644-19-04-95, and the consolidated research group’s grant SGR2009-00380.