This project focuses on the development of an omnidirectional mobile robot, with the ultimate goal of achieving fully autonomous navigation. My primary responsibility was designing and implementing the autonomous navigation system using ROS 2.
Specifically, my contributions included:
ROS 2 Control: Developing the robot's hardware interface and implementing the omnidirectional motion controller (a kinematics sketch follows this list).
SLAM: Merging the scans from the robot's two LiDAR sensors into a single source for Simultaneous Localization and Mapping.
Nav2: Configuring the Nav2 stack to enable autonomous navigation for the robot (see the launch sketch after this list).
Docker: Utilizing Docker to run ROS 2 Humble on the Jetson Nano, which does not natively support Ubuntu 22.04.
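To illustrate the motion control, below is a minimal sketch of omnidirectional (mecanum) inverse kinematics written as a standalone rclpy node; the topic names, wheel geometry, and the Float64MultiArray command interface are assumptions, and the project's actual controller lives inside ros2_control rather than in a separate node.

    # Minimal sketch: mecanum inverse kinematics as a standalone rclpy node.
    # Topic names, wheel radius R, and half-distances LX/LY are assumptions;
    # the actual project implements this inside a ros2_control controller.
    import rclpy
    from rclpy.node import Node
    from geometry_msgs.msg import Twist
    from std_msgs.msg import Float64MultiArray

    R, LX, LY = 0.05, 0.20, 0.15  # wheel radius, half wheelbase, half track (assumed)

    class OmniKinematics(Node):
        def __init__(self):
            super().__init__('omni_kinematics')
            self.pub = self.create_publisher(
                Float64MultiArray, '/velocity_controller/commands', 10)
            self.create_subscription(Twist, '/cmd_vel', self.on_cmd, 10)

        def on_cmd(self, msg: Twist):
            vx, vy, wz = msg.linear.x, msg.linear.y, msg.angular.z
            k = LX + LY
            # Wheel angular velocities in the order
            # [front-left, front-right, rear-left, rear-right].
            wheels = [(vx - vy - k * wz) / R,
                      (vx + vy + k * wz) / R,
                      (vx + vy - k * wz) / R,
                      (vx - vy + k * wz) / R]
            self.pub.publish(Float64MultiArray(data=wheels))

    def main():
        rclpy.init()
        rclpy.spin(OmniKinematics())

    if __name__ == '__main__':
        main()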
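The Nav2 configuration itself lives in a YAML parameter file; bringing the stack up is then a short launch file along the following lines, where the map and parameter paths are placeholders.

    # Sketch of a Nav2 bringup launch file; the map and params_file paths
    # are placeholders for the robot-specific files.
    import os
    from ament_index_python.packages import get_package_share_directory
    from launch import LaunchDescription
    from launch.actions import IncludeLaunchDescription
    from launch.launch_description_sources import PythonLaunchDescriptionSource

    def generate_launch_description():
        nav2_launch = os.path.join(
            get_package_share_directory('nav2_bringup'),
            'launch', 'bringup_launch.py')
        return LaunchDescription([
            IncludeLaunchDescription(
                PythonLaunchDescriptionSource(nav2_launch),
                launch_arguments={
                    'map': '/path/to/map.yaml',           # placeholder
                    'params_file': '/path/to/nav2.yaml',  # placeholder
                    'use_sim_time': 'false',
                }.items()),
        ])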
Digital Twins (DTs) play a crucial role in warehouse optimization, and Digital Shadows represent a significant milestone on the journey toward fully functional DTs. This project focuses on creating the Digital Shadow of the UR5e robotic arm. The initial step involved ensuring that the simulated robot in Gazebo accurately mimics the movements of the real UR5e using ROS 2.
My contributions to this project included:
Gazebo Simulation and MoveIt: Spawning the UR5e robot model on a table and configuring MoveIt to account for the table during motion planning, ensuring collision-free trajectories.
Joint Trajectory Controller: Developing a node that captures the real UR5e's joint movements and uses this data to control the simulated robot in Gazebo (sketched after this list).
Namespacing: Ensuring each robot (real and simulated) operates within its own namespace, avoiding conflicts in TF frames and controllers.
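A minimal sketch of the mirroring node is shown below: joint states published by the real UR5e are repackaged as short trajectories for the simulated robot's joint_trajectory_controller. The /real and /sim namespaces and the exact topic names are assumptions.

    # Sketch: mirror the real UR5e's joint states onto the simulated robot.
    # The /real and /sim namespaces and the topic names are assumptions.
    import rclpy
    from rclpy.duration import Duration
    from rclpy.node import Node
    from sensor_msgs.msg import JointState
    from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint

    class UR5eMirror(Node):
        def __init__(self):
            super().__init__('ur5e_mirror')
            self.pub = self.create_publisher(
                JointTrajectory,
                '/sim/joint_trajectory_controller/joint_trajectory', 10)
            self.create_subscription(
                JointState, '/real/joint_states', self.on_joints, 10)

        def on_joints(self, msg: JointState):
            traj = JointTrajectory()
            traj.joint_names = list(msg.name)
            point = JointTrajectoryPoint()
            point.positions = list(msg.position)
            # Short horizon so the simulated arm tracks the real one smoothly.
            point.time_from_start = Duration(seconds=0.1).to_msg()
            traj.points.append(point)
            self.pub.publish(traj)

    def main():
        rclpy.init()
        rclpy.spin(UR5eMirror())

    if __name__ == '__main__':
        main()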
Warehouses often utilize a variety of robots, such as mobile robots and robotic arms, to automate tasks. To achieve Digital Twins for such environments, it is essential to simulate a heterogeneous fleet of robots seamlessly in Gazebo. This project focuses on simulating different types of robots - specifically the MiR100, UR5e, and Techman - in Gazebo using ROS 1.
My contributions included:
Gazebo Simulations: Writing a launch file to spawn multiple robots (MiR100, UR5e, and Techman) into the same Gazebo world, leveraging existing robot-specific packages (see the spawning sketch after this list).
Namespacing: Preventing conflicts between controllers and TF frames by giving each robot its own namespace, thereby ensuring smooth operation of all robots in the same simulation environment.
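The real launch file declares this setup in roslaunch; purely as an illustration, the sketch below achieves the same effect programmatically through the gazebo_ros spawn service, placing each robot under its own namespace. The URDF paths and spawn poses are placeholders.

    # ROS 1 sketch: spawn several robots into one Gazebo world, each under
    # its own namespace. URDF paths and poses are placeholders; the project
    # itself did this declaratively in a launch file.
    import rospy
    from gazebo_msgs.srv import SpawnModel
    from geometry_msgs.msg import Pose

    ROBOTS = [  # (namespace, URDF file, x position) -- placeholders
        ('mir100',  '/path/to/mir100.urdf',  0.0),
        ('ur5e',    '/path/to/ur5e.urdf',    2.0),
        ('techman', '/path/to/techman.urdf', 4.0),
    ]

    if __name__ == '__main__':
        rospy.init_node('multi_robot_spawner')
        rospy.wait_for_service('/gazebo/spawn_urdf_model')
        spawn = rospy.ServiceProxy('/gazebo/spawn_urdf_model', SpawnModel)
        for ns, urdf, x in ROBOTS:
            pose = Pose()
            pose.position.x = x
            with open(urdf) as f:
                # robot_namespace keeps each robot's topics and TF separate.
                spawn(model_name=ns, model_xml=f.read(), robot_namespace=ns,
                      initial_pose=pose, reference_frame='world')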
This project was the focus of my internship, which I completed as part of my master's degree. It originated from the observation that operators often waste significant time fetching raw materials. The goal was to design a multi-robot system to assist operators by retrieving materials and delivering them on demand. The added value of this system was that operators could command the robots using vision-based gestures.
My contributions included:
Gesture Recognition: Utilized a Jetson Nano and the jetson-inference library to develop a vision-based gesture recognition system for intuitive robot control (sketched after this list).
Queueing System: Implemented a queueing system to store operator commands when all robots were busy and assign tasks to robots as soon as they became available (see the dispatch sketch below).
GUI Development: Designed a graphical user interface (GUI) to display the current mission status and identify which robot was executing it.
Multi-Robot System Setup: Configured and deployed a multi-robot system using two real TurtleBot3 robots.
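To give a flavor of the gesture pipeline, here is a minimal sketch using the jetson-inference Python bindings; the model and label files, the gesture labels, and the gesture-to-command mapping are placeholders standing in for the classifier actually trained for the project.

    # Sketch of the gesture-recognition loop on the Jetson Nano.
    # Model/label paths and the COMMANDS mapping are placeholders.
    import jetson.inference
    import jetson.utils

    COMMANDS = {'open_palm': 'come_here', 'fist': 'stop'}  # assumed labels

    net = jetson.inference.imageNet('googlenet', [
        '--model=/path/to/gestures.onnx',   # placeholder: retrained classifier
        '--labels=/path/to/labels.txt',
        '--input_blob=input_0',
        '--output_blob=output_0',
    ])
    camera = jetson.utils.videoSource('csi://0')  # Jetson Nano CSI camera

    while camera.IsStreaming():
        img = camera.Capture()
        class_id, confidence = net.Classify(img)
        label = net.GetClassDesc(class_id)
        if confidence > 0.8 and label in COMMANDS:
            # In the full system the command is handed to the task queue.
            print('operator command:', COMMANDS[label])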
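The queueing logic itself reduces to a small dispatcher, sketched below with illustrative robot names and a simplified Task type; in the real system, assignment sends a navigation goal to the chosen TurtleBot3 instead of printing.

    # Minimal sketch of the command queue and robot dispatch.
    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class Task:
        operator: str
        material: str

    class Dispatcher:
        def __init__(self, robots):
            self.idle = deque(robots)   # robots ready for a mission
            self.pending = deque()      # commands waiting for a free robot

        def submit(self, task):
            if self.idle:
                self._assign(self.idle.popleft(), task)
            else:
                self.pending.append(task)  # all robots busy: queue the command

        def on_robot_done(self, robot):
            if self.pending:
                self._assign(robot, self.pending.popleft())
            else:
                self.idle.append(robot)

        def _assign(self, robot, task):
            # Real system: send a navigation goal to the robot here.
            print(f'{robot} -> fetch {task.material} for {task.operator}')

    d = Dispatcher(['tb3_0', 'tb3_1'])
    d.submit(Task('operator_a', 'screws'))
    d.submit(Task('operator_b', 'panels'))
    d.submit(Task('operator_c', 'cables'))  # queued until a robot frees up
    d.on_robot_done('tb3_0')                # tb3_0 picks up the queued task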
NB: This project marked my first deep immersion into the ROS ecosystem, where I gained hands-on experience with core concepts such as topics, services, and actions. The success of this work led to a publication at the FAIM 2023 international conference in Portugal.
This internship was a pivotal experience, combining robotics, software development, and human-robot interaction to solve real-world challenges.