I have been involved in several Simultaneous Localization and Mapping (SLAM) and multi-object tracking projects since 2009. The following videos show some of my recent work. See Photos for pictures taken during this time.
Random Finite Set (RFS) based 3D Simultaneous Localization and Mapping (SLAM) algorithm at Nanyang Technological University (NTU) using a handheld ZED stereo camera connected to a laptop. The robot trajectory (shown in white) is estimated using a particle filter, and the map (shown in red) is estimated using a Gaussian mixture implementation of a PHD filter. The inputs to the SLAM algorithm are the triangulated 3D feature positions (shown in yellow) and 3D visual odometry. The 3D map of the environment is built using the estimated robot pose and the triangulated point cloud from the stereo matching process, and is visualized using the ROS (https://www.ros.org) octomap package.
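The estimator described above can be illustrated with a minimal, hedged sketch of a single Rao-Blackwellized SLAM particle: the particle carries a pose hypothesis, and its map is a Gaussian-mixture PHD updated against the feature measurements. This is a 1D toy with illustrative noise levels, detection probability, and clutter rate, not the models or parameters used in the videos.

```python
import math

# Illustrative assumptions (not the values used in the actual system):
P_D = 0.95      # probability of detecting an existing feature
CLUTTER = 0.01  # clutter (false alarm) intensity over measurement space
R = 0.1         # measurement noise variance

def gauss(x, mu, var):
    """Scalar Gaussian density."""
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def phd_update(components, measurements, pose):
    """One GM-PHD measurement update for a single SLAM particle.

    components:   list of (weight, mean, variance) map components.
    measurements: feature observations from `pose`, modeled as
                  z = m - pose + noise (a toy 1D range model).
    """
    # Missed-detection terms: every prior component survives, downweighted.
    updated = [(w * (1 - P_D), m, v) for (w, m, v) in components]
    for z in measurements:
        # Detection terms: Kalman-update each component against z.
        dets = []
        for (w, m, v) in components:
            z_pred = m - pose          # predicted measurement
            s = v + R                  # innovation variance
            k = v / s                  # Kalman gain
            lik = gauss(z, z_pred, s)  # measurement likelihood
            dets.append((P_D * w * lik, m + k * (z - z_pred), (1 - k) * v))
        norm = CLUTTER + sum(w for (w, _, _) in dets)
        updated.extend((w / norm, m, v) for (w, m, v) in dets)
    return updated

def expected_feature_count(components):
    # The integral of the PHD: expected number of features in the map.
    return sum(w for (w, _, _) in components)
```

In a full implementation each particle would also run PHD prediction (survival and birth terms), prune and merge components, and use the map likelihood to reweight particles for resampling; this fragment shows only the core measurement update.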
Clearpath Husky robot fitted with a ZED stereo camera performing Random Finite Set (RFS) based 3D Simultaneous Localization and Mapping (SLAM) at Nanyang Technological University (NTU). The robot trajectory is estimated using a particle filter, and the map is estimated using a Gaussian mixture implementation of a PHD filter. The inputs to the SLAM algorithm are the triangulated 3D feature positions and 3D visual odometry. The 3D map of the environment is built using the estimated robot pose and the triangulated point cloud from the stereo matching process. The programs were written in the ROS environment on Ubuntu and visualized using the ROS rviz package.
Clearpath Husky robots fitted with Velodyne VLP-16 lidars performing Random Finite Set (RFS) based single-robot and multi-robot Simultaneous Localization and Mapping (SLAM and Collaborative SLAM) at Nanyang Technological University (NTU).
The point cloud of the environment captured by the VLP-16 is pre-processed to remove the ground plane, and bounding-box features are then extracted. The centroids of those bounding boxes are used as features in this SLAM implementation (3D features with 2D SLAM). Odometry is calculated by fusing wheel encoder and gyroscope data.
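The feature-extraction step above can be sketched roughly as follows: drop near-ground points, cluster the remainder, and take each cluster's bounding-box centroid as a SLAM feature. The height threshold, cluster radius, and the naive single-link clustering are illustrative assumptions; a real pipeline would segment the ground plane properly (e.g. by plane fitting) and cluster with a k-d tree.

```python
# Assumed thresholds for illustration only:
GROUND_Z = 0.2        # points at or below this height are treated as ground
CLUSTER_RADIUS = 0.5  # metres; neighbor distance for clustering

def remove_ground(points):
    """Crude ground removal by height threshold (a stand-in for plane fitting)."""
    return [p for p in points if p[2] > GROUND_Z]

def euclidean_clusters(points, radius=CLUSTER_RADIUS):
    """Naive incremental single-link clustering over 3D points."""
    clusters = []
    for p in points:
        # Find every existing cluster within `radius` of this point.
        near = [c for c in clusters
                if any(sum((a - b) ** 2 for a, b in zip(p, q)) <= radius ** 2
                       for q in c)]
        merged = [p]
        for c in near:          # p may bridge several clusters: merge them
            merged.extend(c)
            clusters.remove(c)
        clusters.append(merged)
    return clusters

def bbox_centroid(cluster):
    """Centroid of the axis-aligned bounding box of a cluster."""
    lo = [min(p[i] for p in cluster) for i in range(3)]
    hi = [max(p[i] for p in cluster) for i in range(3)]
    return tuple((l + h) / 2 for l, h in zip(lo, hi))

def extract_features(cloud):
    """Ground removal -> clustering -> bounding-box centroids as features."""
    return [bbox_centroid(c) for c in euclidean_clusters(remove_ground(cloud))]
```

For the 2D SLAM back end, only the horizontal components of each centroid would be fed to the filter, which matches the "3D features with 2D SLAM" arrangement described above.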
The robot trajectory is estimated using a particle filter, and the map is estimated using a Gaussian mixture implementation of a PHD filter in both SLAM and CSLAM. Note that the CSLAM implementation was centralized and could only be run offline in the ROS environment.
Testing of a real-time Random Finite Set (RFS) based 3D Simultaneous Localization and Mapping (SLAM) algorithm at Nanyang Technological University (NTU) using data from a ZED stereo camera mounted on an AscTec Pelican UAV. Note that the UAV is hand-carried instead of flown inside the laboratory.
The robot trajectory (shown in white) is estimated using a particle filter, and the map (shown in red) is estimated using a Gaussian mixture implementation of a PHD filter. The inputs to the SLAM algorithm are the triangulated 3D feature positions (shown in yellow) and 3D visual odometry. The 3D map of the environment is built using the estimated robot pose and the triangulated point cloud from the stereo matching process, and is visualized in rviz using the ROS (https://www.ros.org) octomap package.
Testing of a real-time Random Finite Set (RFS) based 3D Simultaneous Localization and Mapping (SLAM) algorithm near Hall 7 at Nanyang Technological University (NTU) using data from a ZED stereo camera mounted on an AscTec Pelican UAV.
The robot trajectory (shown in white) is estimated using a particle filter, and the map (shown in red) is estimated using a Gaussian mixture implementation of a PHD filter. The inputs to the SLAM algorithm are the triangulated 3D feature positions (shown in yellow) and 3D visual odometry. The 3D map of the environment is built using the estimated robot pose and the triangulated point cloud from the stereo matching process, and is visualized in rviz using the ROS (https://www.ros.org) octomap package.
Performing Random Finite Set (RFS) based Collaborative Simultaneous Localization and Mapping with Moving Object Tracking (CSLAMMOT). Two Videre Erratic mobile robot platforms fitted with 2D SICK lidar sensors and wheel encoders were used. The collected data were processed offline and fed to a Matlab implementation of the algorithm (results are shown towards the end of the video). My friends carrying buckets acted as the moving objects, and the pillars served as the static features. The feature extraction algorithm does not provide information about the feature type; instead, the CSLAMMOT algorithm distinguishes between static and dynamic features based on the target (feature) behavior. Robot trajectories are estimated using a particle filter, and the map is estimated using a Gaussian mixture implementation of a PHD filter.
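The behavior-based static/dynamic distinction mentioned above can be illustrated with a small hedged sketch: since the feature extractor gives no type label, a tracked feature is declared dynamic once its estimated position drifts further than measurement noise alone would explain. The displacement threshold and this simple max-drift rule are assumptions for illustration, not the decision logic of the actual CSLAMMOT algorithm.

```python
import math

# Assumed threshold for illustration only:
MOVE_THRESHOLD = 0.5  # metres of net displacement before a track is "dynamic"

def classify_track(positions):
    """Label one feature track as static or dynamic from its behavior.

    positions: estimated (x, y) of the same feature over successive frames.
    """
    x0, y0 = positions[0]
    # Maximum displacement from the first estimate over the track's lifetime.
    drift = max(math.hypot(x - x0, y - y0) for x, y in positions)
    return "dynamic" if drift > MOVE_THRESHOLD else "static"
```

A pillar's estimates jitter within the noise band and stay "static", while a person walking with a bucket quickly exceeds the threshold and is handed to the moving-object tracker.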
Here we are collecting a dataset for verifying our SLAM algorithms at Pandan Reservoir in Singapore. We attached metal rods to the buoys with ropes and dropped them into the reservoir so that the buoys would stay in place. The kayaks are fitted with computers, Doppler velocity logs (DVLs) and 2D lidar sensors.