This demonstration showcases real-world experiments in autonomous robot navigation using GPS and sensor fusion techniques. The robots are guided to visit a set of task points outdoors without relying on pre-mapped environments.
The system integrates GPS data with IMU and odometry inputs through sensor fusion algorithms, enabling the robot to localize itself accurately and follow planned paths. In certain scenarios, SLAM (Simultaneous Localization and Mapping) modules add precision in semi-structured environments. The resulting setup supports real-time autonomous mission execution, and can continue operating in GPS-denied zones when SLAM is active.
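To give a feel for the fusion step, here is a minimal sketch, not the project's actual pipeline: a 1-D Kalman-style filter that dead-reckons on odometry and blends in noisy GPS fixes. The class name and noise values are hypothetical placeholders.

```python
import numpy as np  # kept for extension to multi-dimensional state


class GpsOdomFuser:
    """Minimal 1-D Kalman-style fusion of odometry (predict) and GPS (update).

    Illustrative sketch only; the noise parameters are hypothetical.
    """

    def __init__(self, odom_var=0.05, gps_var=4.0):
        self.x = 0.0              # position estimate (m)
        self.p = 1.0              # estimate variance
        self.odom_var = odom_var  # per-step odometry noise variance
        self.gps_var = gps_var    # GPS measurement noise variance

    def predict(self, odom_delta):
        # Dead-reckon with odometry; uncertainty grows each step.
        self.x += odom_delta
        self.p += self.odom_var
        return self.x

    def update(self, gps_pos):
        # Blend in the GPS fix; the Kalman gain weighs the two sources.
        k = self.p / (self.p + self.gps_var)
        self.x += k * (gps_pos - self.x)
        self.p *= 1.0 - k
        return self.x


fuser = GpsOdomFuser()
fuser.predict(odom_delta=0.12)     # wheel odometry says we moved 12 cm
print(fuser.update(gps_pos=0.30))  # noisy GPS fix pulls the estimate toward it
```

A real deployment would use a multi-dimensional EKF (e.g. fusing heading from the IMU as well), but the predict/update split shown here is the same.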
Key capabilities include:
- GPS-based waypoint navigation (see the bearing/distance sketch after this list)
- Real-time localization using sensor fusion
- Path planning and task point coverage
- SLAM integration for map-building
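As a rough illustration of the waypoint-navigation capability, the helper below computes the great-circle distance and initial bearing from the current GPS fix to the next task point. It is a self-contained sketch, not code from this project; the coordinates in the usage example are arbitrary.

```python
import math


def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) between two GPS fixes."""
    R = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)

    # Haversine distance
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))

    # Initial bearing, normalised to [0, 360)
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing


# Steer toward the next task point until within an acceptance radius.
dist, heading = distance_and_bearing(47.3977, 8.5456, 47.3980, 8.5461)
print(f"{dist:.1f} m at {heading:.1f} deg")
```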
A second set of experiments demonstrates autonomous UAV flight with waypoint navigation and precision landing in real-world environments, leveraging computer vision, ArUco markers, and MAVSDK. These experiments highlight the robust integration of perception and control for reliable autonomous operation.
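For orientation, here is a hedged sketch of how such a mission might be scripted with MAVSDK-Python, assuming a PX4 SITL instance on the default UDP port. The waypoints, altitudes, and sleep-based timing are illustrative placeholders, and the ArUco-guided precision-landing controller is only stubbed out as a comment.

```python
import asyncio

from mavsdk import System


async def fly_waypoints():
    drone = System()
    await drone.connect(system_address="udp://:14540")  # SITL default

    # Wait until the autopilot reports a usable global position estimate.
    async for health in drone.telemetry.health():
        if health.is_global_position_ok and health.is_home_position_ok:
            break

    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(10)

    # Hypothetical task points: (lat, lon, absolute altitude in m).
    waypoints = [(47.3980, 8.5461, 500.0), (47.3985, 8.5450, 500.0)]
    for lat, lon, alt in waypoints:
        await drone.action.goto_location(lat, lon, alt, 0.0)
        await asyncio.sleep(15)  # crude wait; real code would check telemetry

    # A precision landing would switch here to an ArUco-guided controller
    # driven by camera detections; this sketch simply lands in place.
    await drone.action.land()


if __name__ == "__main__":
    asyncio.run(fly_waypoints())
```

In the actual demonstrations, the final descent would be closed-loop on the marker pose (e.g. from OpenCV's ArUco module) rather than an open-loop land command.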