We use quadrupedal robots (Unitree Go1 and Go2) as base platforms to collect the dataset with a compact sensor configuration.
Each sensor stream is recorded asynchronously through a top-mounted Intel NUC board.
To capture visual measurements at different wavelengths, the RGB and thermal cameras are arranged as a forward-looking sensor system.
The terrain-looking sensor system combines an RGB-D camera and a LiDAR to obtain sufficient ground information.
In addition, the LiDAR enables recognition of the surrounding environment regardless of viewing angle.
Finally, the IMU and the built-in state publisher support the robot's navigation.
The kinematic frames of each robot are provided based on the URDF information released by Unitree.
Coarse extrinsic parameters are derived from the CAD model and serve as an initial guess during the extrinsic calibration step.
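Turning a CAD-derived offset into an initial guess typically means composing a 4x4 homogeneous transform from a translation and roll-pitch-yaw rotation. The sketch below shows one way to do this; the function name and the numeric offsets are illustrative, not values from the dataset.

```python
import numpy as np

def make_extrinsic(translation, rpy):
    """Build a 4x4 homogeneous transform from a CAD-derived
    translation (m) and roll-pitch-yaw angles (rad)."""
    r, p, y = rpy
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(r), -np.sin(r)],
                   [0, np.sin(r),  np.cos(r)]])
    Ry = np.array([[ np.cos(p), 0, np.sin(p)],
                   [0, 1, 0],
                   [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0],
                   [np.sin(y),  np.cos(y), 0],
                   [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx  # extrinsic rotation, ZYX convention
    T[:3, 3] = translation
    return T

# Hypothetical CAD offset of a forward-looking camera w.r.t. the base frame.
T_base_cam = make_extrinsic([0.25, 0.0, 0.10], [0.0, 0.0, 0.0])
```

Such a matrix is then refined by the calibration procedure; only the coarse CAD values need to be accurate enough for the optimization to converge.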
We follow the same calibration procedure used in our previous work, the single-session dataset DiTer.
We provide our datasets in rosbag format, acquired with ROS Noetic.
Each perceptual and motion sensor is summarized in this table.
Specification
Robot State
The robot's state is published in a single custom message containing its low-state information: joint states, built-in IMU, and foot contacts.
The custom ROS message package is required to recognize and subscribe to this state; it is available at our GitHub link.
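As a rough picture of what the low-state message carries, the sketch below mirrors its contents as a plain Python structure. The field names and array sizes here are assumptions for illustration; the authoritative definitions are the .msg files in the linked GitHub package.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LowState:
    """Illustrative mirror of the custom low-state message
    (field names are assumptions, not the exact .msg layout)."""
    stamp: float                 # message timestamp [s]
    joint_position: List[float]  # 12 joint angles [rad]
    joint_velocity: List[float]  # 12 joint velocities [rad/s]
    imu_quaternion: List[float]  # built-in IMU orientation (w, x, y, z)
    foot_contact: List[int]      # per-leg contact flags

def feet_in_contact(state: LowState) -> int:
    """Count how many feet currently touch the ground."""
    return sum(1 for c in state.foot_contact if c)
```

In practice the message is consumed through a rospy subscriber after building the message package in the catkin workspace; the dataclass above is only meant to convey the structure.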
Thermal Data
For users who plan to make extensive use of the thermal data, the thermal images are provided in 14-bit format.
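Since 14-bit frames render as nearly black in standard 8-bit viewers, a common first step is a min-max rescale for visualization. A minimal sketch with NumPy (the function name is ours, not part of the dataset tooling):

```python
import numpy as np

def thermal_to_8bit(img14: np.ndarray) -> np.ndarray:
    """Min-max normalize a 14-bit thermal frame (values in 0..16383)
    to 8-bit for visualization. Raw radiometric values are lost,
    so keep the original frame for any quantitative use."""
    img = img14.astype(np.float32)
    lo, hi = img.min(), img.max()
    if hi <= lo:  # constant frame: avoid division by zero
        return np.zeros_like(img, dtype=np.uint8)
    return ((img - lo) / (hi - lo) * 255.0).astype(np.uint8)
```

Per-frame normalization maximizes contrast but makes gray levels incomparable across frames; fixed global bounds are preferable when comparing temperatures over a sequence.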
With the comprehensive multi-modal setup of our agents, the visualization below includes:
Inertial / Kinematic measurement
Range measurement
Thermal / RGB image
Geometric / Photometric terrain measurement