Camera - LiDAR Extrinsic
MATLAB camera-LiDAR calibration
https://kr.mathworks.com/help/lidar/ug/get-started-lidar-camera-calibrator.html
Right Camera - Left Camera Extrinsic
MATLAB stereo camera calibration
https://kr.mathworks.com/help/vision/ug/using-the-stereo-camera-calibrator-app.html
Camera Intrinsic
MATLAB camera intrinsic calibration
https://kr.mathworks.com/help/vision/ref/cameraintrinsics.html
Left camera - LiDAR projection (Parking Lot)
Right camera - LiDAR projection (Mountain)
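The projections above combine the intrinsic and extrinsic results. Below is a minimal sketch of that projection step, assuming a pinhole model without distortion; K, R, t, and the point cloud are placeholders for illustration, not our calibration values.

```python
import numpy as np

def project_lidar_to_image(points_lidar, K, R, t, image_size):
    """Project Nx3 LiDAR points into the pixels of a pinhole camera."""
    # Transform points from the LiDAR frame into the camera frame.
    points_cam = points_lidar @ R.T + t
    # Keep only points in front of the camera.
    points_cam = points_cam[points_cam[:, 2] > 0.1]
    # Perspective projection with the intrinsic matrix K.
    uvw = points_cam @ K.T
    uv = uvw[:, :2] / uvw[:, 2:3]
    # Discard projections that fall outside the image.
    w, h = image_size
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return uv[valid], points_cam[valid, 2]  # pixel coordinates and depths

# Placeholder calibration values for illustration only.
K = np.array([[800.0, 0.0, 640.0],
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.zeros(3)
points = np.random.uniform(-10.0, 10.0, (1000, 3)) + np.array([0.0, 0.0, 15.0])
uv, depths = project_lidar_to_image(points, K, R, t, (1280, 720))
```

The returned depths are what projection figures like the ones above typically color-code.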
4D Radar - LiDAR Extrinsic
We utilize the calibration tool [1] for the 4D radar and LiDAR. This tool jointly estimates the relative transformation parameters using a specialized calibration board and a reflector.
Although we use a solid-state LiDAR rather than a spinning one, we can still use its line-index channel to detect laser depth discontinuities.
Unlike Domhof et al. [1], who estimate the reflector position with a 2D radar, we obtain the z-value directly from our 4D radar, which results in a more accurate calibration.
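As a rough illustration of how the line-index channel can substitute for the ring index of a spinning LiDAR, the sketch below flags range jumps between consecutive points on the same scan line; the array layout and threshold are assumptions, not the exact logic of the tool in [1].

```python
import numpy as np

def depth_discontinuities(ranges, line_idx, jump_thresh=0.3):
    """Mark points where the measured range jumps by more than jump_thresh
    between consecutive points on the same scan line."""
    mask = np.zeros_like(ranges, dtype=bool)
    for line in np.unique(line_idx):
        sel = np.where(line_idx == line)[0]            # points on this line, in scan order
        jumps = np.abs(np.diff(ranges[sel])) > jump_thresh
        mask[sel[:-1][jumps]] = True                   # point before the jump
        mask[sel[1:][jumps]] = True                    # point after the jump
    return mask
```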
Left camera - 4D radar projection (Sports Complex)
Right camera - 4D radar projection (Stream)
Spinning Radar - LiDAR Extrinsic
We employ the method used in the Boreas dataset. This method determines the rotation and the xy-plane translation through correlative scan matching with the Fourier-Mellin transform [2].
Specifically, we convert LiDAR point clouds into polar images and compare them with the radar polar images to obtain the rotation; we then use the Cartesian images to derive the translation. To match the field of view (FOV) of the Aeva, we adjust the range and azimuth of the radar images.
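The matching itself reduces to two phase correlations: a circular shift along the azimuth axis of the polar images gives the rotation, and a 2D shift between the rotation-compensated Cartesian images gives the translation. The sketch below shows that principle; image construction, windowing, and the sub-pixel refinement of [2] are omitted.

```python
import numpy as np

def phase_correlation(ref, moved):
    """Estimate the integer (row, col) shift such that `moved` is roughly
    `ref` circularly shifted by that amount."""
    cross = np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))
    cross /= np.abs(cross) + 1e-12                     # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks in the upper half of each axis to negative shifts.
    return tuple(int(p) if p <= s // 2 else int(p) - s for p, s in zip(peak, corr.shape))

def estimate_rotation(lidar_polar, radar_polar, n_azimuth_bins):
    """Rotation = circular shift along the azimuth (row) axis of the polar images."""
    shift_az, _ = phase_correlation(lidar_polar, radar_polar)
    return 2.0 * np.pi * shift_az / n_azimuth_bins     # radians

def estimate_translation(lidar_cart, radar_cart, meters_per_pixel):
    """Translation = 2D shift between rotation-compensated Cartesian images."""
    dy, dx = phase_correlation(lidar_cart, radar_cart)
    return dx * meters_per_pixel, dy * meters_per_pixel
```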
LiDAR - spinning radar extrinsic calibration pipeline
LiDAR points, 4D radar points, and spinning radar points are shown in red, green, and blue, respectively (Mountain)
IMU - LiDAR Extrinsic
We initialize the system using the method proposed by Zhu et al. [3]. Since this approach was designed for the solid-state Livox LiDAR series, it can be applied seamlessly to our solid-state Aeva LiDAR without requiring specific targets.
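For intuition, the rotation part of such targetless initialization can be viewed as aligning the gyroscope's angular velocities with those derived from LiDAR odometry, which is Wahba's problem. The sketch below solves it with an SVD; the full method of Zhu et al. [3] additionally estimates the time offset, biases, and translation, so this illustrates the core idea only.

```python
import numpy as np

def align_angular_velocities(omega_imu, omega_lidar):
    """Solve Wahba's problem: find R such that omega_imu[i] ~= R @ omega_lidar[i].
    Both inputs are Nx3 arrays of time-synchronized angular velocities."""
    H = omega_lidar.T @ omega_imu                      # 3x3 cross-covariance of the vector sets
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T                              # extrinsic rotation, LiDAR -> IMU
```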
INS - LiDAR Extrinsic
We perform hand-eye calibration by comparing the trajectory obtained from LiDAR SLAM with the INS trajectory.
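A minimal sketch of this hand-eye step is given below, assuming the two trajectories are time-synchronized lists of 4x4 poses. Consecutive relative motions A_i (INS) and B_i (LiDAR) satisfy A_i X = X B_i, where X is the INS-LiDAR extrinsic; the rotation of X is solved first from the rotation axes, then the translation by linear least squares. This illustrates the standard formulation, not our exact pipeline.

```python
import numpy as np

def rot_vec(R):
    # axis * sin(angle): the skew-symmetric part of R, with a consistent sign.
    return 0.5 * np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])

def hand_eye(poses_ins, poses_lidar):
    """poses_*: equal-length lists of 4x4 world-frame poses. Returns the 4x4 extrinsic X."""
    A = [np.linalg.inv(poses_ins[i]) @ poses_ins[i + 1] for i in range(len(poses_ins) - 1)]
    B = [np.linalg.inv(poses_lidar[i]) @ poses_lidar[i + 1] for i in range(len(poses_lidar) - 1)]
    # Rotation: the rotation vectors of the relative motions satisfy a_i = R_X b_i.
    a = np.array([rot_vec(M[:3, :3]) for M in A])
    b = np.array([rot_vec(M[:3, :3]) for M in B])
    U, _, Vt = np.linalg.svd(b.T @ a)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R_X = Vt.T @ D @ U.T
    # Translation: (R_Ai - I) t_X = R_X t_Bi - t_Ai, stacked into one least-squares system.
    M = np.vstack([Ai[:3, :3] - np.eye(3) for Ai in A])
    d = np.concatenate([R_X @ Bi[:3, 3] - Ai[:3, 3] for Ai, Bi in zip(A, B)])
    t_X, *_ = np.linalg.lstsq(M, d, rcond=None)
    X = np.eye(4)
    X[:3, :3] = R_X
    X[:3, 3] = t_X
    return X
```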
[1] J. Domhof, J. F. Kooij, and D. M. Gavrila, “A joint extrinsic calibration tool for radar, camera and lidar,” IEEE Transactions on Intelligent Vehicles, vol. 6, no. 3, pp. 571–582, 2021.
[2] P. Checchin, F. Gérossier, C. Blanc, R. Chapuis, and L. Trassoudaine, “Radar scan matching SLAM using the Fourier-Mellin transform,” in Field and Service Robotics: Results of the 7th International Conference. Springer, 2010, pp. 151–161.
[3] F. Zhu, Y. Ren, and F. Zhang, “Robust real-time lidar-inertial initialization,” in 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). IEEE, 2022, pp. 3948–3955.