This project shows how robots can be controlled using natural language. A Large Language Model (LLM) interprets simple user commands and converts them into robot actions. We demonstrate this with a drone and a quadruped robot performing basic movements and tasks from high-level instructions. The goal is to make human–robot interaction more intuitive and accessible.
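One common way to bridge an LLM and a robot is to have the model emit a structured action (e.g. JSON) that is validated and dispatched to the robot's command interface. The sketch below illustrates that pattern; the action names, JSON schema, and `execute` stub are all hypothetical, not the project's actual interface, and the LLM reply is simulated rather than fetched from a real model.

```python
import json

# Hypothetical action vocabulary for the drone/quadruped; a real system
# would expose the robot SDK's own commands here.
ACTIONS = {"move_forward", "turn_left", "turn_right", "take_off", "land"}

def parse_llm_reply(reply: str) -> dict:
    """Parse and validate the LLM's JSON reply into an action dict."""
    cmd = json.loads(reply)
    if cmd.get("action") not in ACTIONS:
        raise ValueError(f"unknown action: {cmd.get('action')}")
    return cmd

def execute(cmd: dict) -> str:
    # Placeholder for the robot SDK call; here we just format a log line.
    return f"robot <- {cmd['action']}({cmd.get('distance_m', '')})"

# Simulated LLM output for the instruction "go two meters ahead":
reply = '{"action": "move_forward", "distance_m": 2.0}'
print(execute(parse_llm_reply(reply)))
```

Validating against a fixed action vocabulary keeps the LLM's free-form output from triggering unsupported or unsafe robot behavior.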
This project deploys a UAV–UGV collaborative robot team in real outdoor environments. The system demonstrates autonomous UGV navigation, waypoint tracking, and autonomous UAV landing, emphasizing real-world deployment and validation of multi-robot coordination.
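Waypoint tracking for a ground robot is often done with a go-to-goal controller on a unicycle model: steer the heading toward the waypoint, slow down as the robot closes in. The sketch below is a minimal illustration under that assumption; the gains, speeds, and model are illustrative placeholders, not the deployed controller.

```python
import math

def goto_step(x, y, theta, wx, wy, v_max=0.5, k_h=1.5, dt=0.1):
    """One Euler step of a go-to-goal controller (unicycle model).

    Steers heading toward waypoint (wx, wy) with proportional gain k_h
    and caps forward speed by the remaining distance to avoid overshoot.
    """
    dist = math.hypot(wx - x, wy - y)
    v = min(v_max, dist)                            # slow down near the goal
    err = math.atan2(wy - y, wx - x) - theta
    err = math.atan2(math.sin(err), math.cos(err))  # wrap to [-pi, pi]
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += k_h * err * dt
    return x, y, theta

# Drive a simulated UGV from the origin toward waypoint (2, 1):
x, y, th = 0.0, 0.0, 0.0
for _ in range(200):
    x, y, th = goto_step(x, y, th, 2.0, 1.0)
print(round(x, 2), round(y, 2))
```

Scaling speed by distance makes the position converge smoothly instead of orbiting the waypoint at constant velocity.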
This project develops an energy-aware UAV–UGV collaboration framework. UAVs handle aerial tasks while UGVs provide mobile recharging support. Intelligent route and rendezvous planning extends mission time and improves efficiency, with applications in delivery, logistics, and surveillance.
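A core piece of such a framework is choosing where the UAV meets its mobile charger. As a deliberately simplified sketch (a real planner would also model UGV timing, wind, and energy per meter; all names here are illustrative), one can scan candidate points on the UGV's route and pick the reachable one that minimizes UAV flight distance:

```python
import math

def plan_rendezvous(uav_pos, ugv_route, battery_range_m):
    """Pick the point on the UGV route minimizing UAV flight distance,
    subject to the UAV's remaining battery range.

    Returns (flight_distance, point), or None if no point is reachable.
    """
    best = None
    for pt in ugv_route:
        d = math.dist(uav_pos, pt)          # straight-line flight distance
        if d <= battery_range_m and (best is None or d < best[0]):
            best = (d, pt)
    return best

# Toy UGV route (meters) and a UAV needing a recharge stop:
route = [(0, 0), (50, 10), (120, 40), (200, 80)]
print(plan_rendezvous((100, 100), route, battery_range_m=150.0))
```

Extending mission time then reduces to interleaving such rendezvous stops with the UAV's aerial task schedule.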
This project presents a UAV–quadruped collaborative system for automated industrial inspection. The UAV creates navigation maps through aerial mapping, and the quadruped robot uses those maps to plan paths and perform inspection or manipulation tasks autonomously.
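The handoff from aerial mapping to ground planning can be pictured as path search on an occupancy grid built from the UAV's map. The breadth-first search below is a minimal stand-in for the quadruped's planner (a fielded system would use A* or a footstep planner); the grid encoding is an assumption: 0 = free, 1 = obstacle.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid via BFS.

    Cells are (row, col); returns the path as a list of cells, or None.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    q = deque([start])
    while q:
        r, c = q.popleft()
        if (r, c) == goal:
            node, path = goal, []
            while node is not None:       # walk predecessors back to start
                path.append(node)
                node = prev[node]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in prev):
                prev[(nr, nc)] = (r, c)
                q.append((nr, nc))
    return None  # goal unreachable

# Toy occupancy grid as the UAV might deliver it (0 = free, 1 = obstacle):
grid = [
    [0, 0, 0, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
]
print(plan_path(grid, (0, 0), (2, 3)))
```

BFS guarantees a shortest path in steps, which is enough to show the map-to-motion pipeline end to end.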
This project developed a multilayered image-processing algorithm in OpenCV for real-time fire detection. An Arduino-powered autonomous fire-suppression system was engineered to integrate seamlessly with the detection algorithm. The system's robustness and reliability were validated through a lab-scale prototype demonstration, showcasing its potential for effective fire mitigation.
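The layers of the algorithm are not detailed above, but a common first layer in vision-based fire detection is an HSV color threshold (in OpenCV this would be `cv2.cvtColor` followed by `cv2.inRange`). The pure-Python sketch below shows the idea on a frame represented as a list of HSV pixels; the threshold values are illustrative, not the project's tuned parameters.

```python
def is_fire_hsv(h, s, v):
    """Flame-like pixel test in HSV space (h in [0, 180] as in OpenCV,
    s and v in [0, 255]). Thresholds here are illustrative only."""
    return 0 <= h <= 35 and s >= 120 and v >= 180

def fire_ratio(frame_hsv):
    """Fraction of flame-like pixels in a frame given as (h, s, v) tuples."""
    hits = sum(is_fire_hsv(*px) for px in frame_hsv)
    return hits / len(frame_hsv)

# Toy 4-pixel "frame": two flame-colored pixels, two background pixels.
frame = [(20, 200, 250), (100, 50, 80), (10, 180, 230), (90, 30, 40)]
print(fire_ratio(frame))  # → 0.5 for this toy frame
```

Subsequent layers (e.g. motion or flicker analysis) would then confirm the color candidates before the Arduino-driven suppression system is triggered.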