Owing to the varied nature of the environments in which we captured the EnvoDat dataset, we used two robot platforms. For indoor environments with structured layouts, we utilised our open-source wheeled robot (ROMR); for outdoor and off-road environments with unstructured terrain, we utilised our Unitree Go1 quadruped robot. The sensor setup on both robots is illustrated in the figure below.
We adopted a frame-of-reference convention similar to that of KITTI.
Cameras:
z-axis pointing forward,
y-axis pointing downward, and
x-axis pointing right.
LiDAR:
z-axis pointing upward,
x-axis pointing forward, and
y-axis pointing left.
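Under these conventions, re-expressing a LiDAR point in a camera frame reduces to a fixed axis permutation plus the calibrated translation. The numpy sketch below shows the rotation part only; it illustrates the stated axis conventions and is not our calibration output.

```python
import numpy as np

# Rotation that re-expresses a LiDAR-frame point in the camera frame,
# derived purely from the axis conventions above:
#   camera x (right)   = -LiDAR y (left)
#   camera y (down)    = -LiDAR z (up)
#   camera z (forward) =  LiDAR x (forward)
R_CAM_FROM_LIDAR = np.array([
    [0.0, -1.0, 0.0],
    [0.0, 0.0, -1.0],
    [1.0, 0.0, 0.0],
])

# A point 5 m ahead of, 1 m to the left of, and 0.2 m above the LiDAR.
p_lidar = np.array([5.0, 1.0, 0.2])

# Rotation only; the full transform also adds the calibrated translation.
p_cam = R_CAM_FROM_LIDAR @ p_lidar
print(p_cam)  # [-1.0, -0.2, 5.0]: 1 m left of, 0.2 m above, 5 m ahead of the camera
```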
We calibrated with a 300 x 390 mm chessboard with 15 x 10 internal corners and a square size of 21 mm. During calibration, 20 image samples were captured at approximate left-right rotation angles of 0°, 45°, 90°, 135°, and 180°, and front-back tilt angles of about 30°, from distances of 1 to 3 metres. We intentionally did not exceed this range because some of the sensors, such as the monocular cameras, have a very low resolution, and beyond this range the chessboard corners may no longer be detected accurately (a minimal sketch of this detection and intrinsic-calibration step follows the camera tools below). We obtained the intrinsic and extrinsic parameters of the sensors using the following tools:
Cameras:
We used the Kalibr (https://github.com/ethz-asl/kalibr) target-based calibration pipeline and the MATLAB camera calibration app (https://uk.mathworks.com/help/vision/camera-calibration.html).
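To make the corner-detection and intrinsic-calibration step described above concrete, here is a minimal OpenCV sketch using the board dimensions stated earlier (15 x 10 internal corners, 21 mm squares). The image directory is a placeholder, and this illustrates the general target-based procedure rather than the exact Kalibr or MATLAB pipelines.

```python
import glob
import cv2
import numpy as np

PATTERN = (15, 10)   # internal corners per row and column, as stated above
SQUARE_M = 0.021     # 21 mm square size

# 3D coordinates of the internal corners on the board plane (z = 0).
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_M

obj_points, img_points = [], []
for path in sorted(glob.glob("calib_images/*.png")):  # placeholder path
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if not found:
        continue  # e.g. board too far away for the low-resolution cameras
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)

rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
print("Intrinsic matrix K:\n", K)
```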
Camera-LiDAR:
MATLAB calibration App (https://uk.mathworks.com/help/vision/camera-calibration.html).
Kalibr (https://github.com/ethz-asl/kalibr).
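The camera-LiDAR calibration yields a rigid transform (R, t) from the LiDAR frame to the camera frame, which, combined with the camera intrinsics K, projects LiDAR points into the image. Below is a minimal numpy sketch of that projection; K, R, and t are placeholder values, not our calibrated parameters.

```python
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t):
    """Project Nx3 LiDAR points to pixel coordinates using the
    LiDAR-to-camera extrinsics (R, t) and the intrinsic matrix K."""
    p_cam = points_lidar @ R.T + t      # LiDAR frame -> camera frame
    p_cam = p_cam[p_cam[:, 2] > 0]      # keep points in front of the camera
    uv = p_cam @ K.T                    # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]       # perspective division

# Placeholder intrinsics and extrinsics; substitute the calibrated values.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.array([[0.0, -1.0, 0.0],
              [0.0, 0.0, -1.0],
              [1.0, 0.0, 0.0]])         # the axis permutation shown earlier
t = np.zeros(3)

pts = np.array([[5.0, 1.0, 0.2]])      # one LiDAR point, 5 m ahead
print(project_lidar_to_image(pts, K, R, t))  # -> [[200. 216.]]
```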
IMU:
We used Allan Variance (https://github.com/ori-drs/allan_variance_ros) to compute the angle random walk of the gyroscope, the velocity random walk of the accelerometer, and the bias instability of each.
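For intuition, the overlapping Allan deviation underlying these noise parameters fits in a few lines of numpy. This is a simplified sketch run on synthetic data, not the allan_variance_ros implementation: for white gyroscope noise the log-log curve has slope -1/2 and the angle random walk is read off at tau = 1 s, while the bias instability sits at the flat minimum of the curve.

```python
import numpy as np

def allan_deviation(rate, fs, taus):
    """Overlapping Allan deviation of a rate signal (e.g. gyro output, rad/s)
    sampled at fs Hz, evaluated at the cluster times in taus (seconds)."""
    theta = np.cumsum(rate) / fs                  # integrate rate -> angle
    adev = []
    for tau in taus:
        m = max(1, int(round(tau * fs)))          # samples per cluster
        tau_m = m / fs                            # realised cluster time
        # Second difference of the integrated signal over cluster length m.
        d = theta[2 * m:] - 2.0 * theta[m:-m] + theta[:-2 * m]
        adev.append(np.sqrt(np.mean(d ** 2)) / (np.sqrt(2.0) * tau_m))
    return np.asarray(adev)

# Synthetic stationary gyro noise: expect a -1/2 slope on a log-log plot.
fs = 100.0
gyro = np.random.normal(0.0, 0.01, size=200_000)  # white noise only, rad/s
taus = np.logspace(-1, 2, 16)                     # 0.1 s .. 100 s
print(allan_deviation(gyro, fs, taus))
```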
Camera-IMU:
We utilised the Kalibr (https://github.com/ethz-asl/kalibr) visual-inertial calibration toolbox to perform spatial and temporal calibration of the IMU with our ELP 4K Megapixel monocular camera.
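Kalibr estimates the camera-IMU time offset jointly with the extrinsics in a continuous-time batch optimisation. For intuition only, the sketch below recovers a coarse offset by cross-correlating angular-rate magnitudes; it is a simplified stand-in for temporal calibration, not Kalibr's method, and all signals here are synthetic.

```python
import numpy as np

def estimate_time_offset(imu_rate, cam_rate, fs):
    """Coarse camera-IMU time offset via cross-correlation.

    imu_rate, cam_rate: angular-rate magnitudes resampled to frequency fs.
    Returns d (seconds) such that imu_rate(t + d) best matches cam_rate(t);
    a negative d means the camera stream lags the IMU.
    """
    imu = imu_rate - imu_rate.mean()
    cam = cam_rate - cam_rate.mean()
    corr = np.correlate(imu, cam, mode="full")
    lag = corr.argmax() - (len(cam) - 1)
    return lag / fs

# Synthetic check: delay a wavy rate signal by 50 ms to mimic the camera stream.
fs = 200.0
t = np.arange(0.0, 10.0, 1.0 / fs)
rate = np.abs(np.sin(2 * np.pi * 0.7 * t)) + 0.05 * np.random.randn(t.size)
cam_rate = np.interp(t - 0.05, t, rate)  # camera sees the motion 50 ms late
print(estimate_time_offset(rate, cam_rate, fs))  # approx. -0.05 s
```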