The 10th RGMC edition will include five competition tracks: one simulation-only track and four tracks that will be run in person at the conference venue. The four on-site competition tracks are the Manufacturing Track, the Human-to-Robot Handover Track, the In-Hand Manipulation Track, and the Picking in Clutter Track.
Are you a student or young researcher eager to explore the world of robotic manipulation?
The Simulation-Sorting Manipulation Track at ICRA offers an exciting opportunity to dive into this field. Designed for undergraduate students and junior researchers, this track builds on the RoboCup ARM challenge, allowing participants to develop perception and control algorithms in a shared simulation environment. The task involves using computer vision and a robotic arm equipped with a two-finger gripper to sort objects placed in increasingly complex positions and orientations, while overcoming challenges such as object occlusions and spatial limitations. Gain hands-on experience, tackle real-world robotics challenges, and connect with peers and industry experts through this track.
MATLAB/Simulink licenses are provided to all participants of IEEE-RAS Competitions.
Participants will receive:
A certificate from ICRA 2025 and MathWorks recognizing expertise in robot perception and manipulation
The opportunity to showcase solutions at ICRA and in the MATLAB Community
MathWorks giveaway packages for top teams
Primary contact: roboticsarena@mathworks.com
Further details: Simulation Sorting Track Webpage.
The In-Hand Manipulation track will challenge teams to design robust solutions that reconfigure small-scale objects while they are already grasped by a robot hand, for example moving the object through a sequence of predefined target positions or rotating it through a sequence of specified orientations. The goal is to advance such robust solutions, spanning hand design, planning, control, state estimation, and learning, through a standardized task.
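To give a concrete picture of what a reconfiguration sequence involves, the minimal Python sketch below compares achieved object poses against a commanded sequence of target poses. It is purely illustrative: the function names, data layout, and error measures are assumptions and do not reproduce the metrics of the official Auto-Evaluator linked below.

```python
# Illustrative sketch only: per-target pose errors for an in-hand reconfiguration
# sequence. Names and error measures are hypothetical, not the official metrics.
import numpy as np
from scipy.spatial.transform import Rotation as R

def pose_errors(achieved, targets):
    """achieved, targets: lists of (position_xyz, quaternion_xyzw) pairs."""
    errors = []
    for (p_a, q_a), (p_t, q_t) in zip(achieved, targets):
        pos_err = np.linalg.norm(np.asarray(p_a) - np.asarray(p_t))        # metres
        # Angle of the relative rotation between achieved and target orientation.
        rot_err = (R.from_quat(q_a) * R.from_quat(q_t).inv()).magnitude()  # radians
        errors.append((pos_err, rot_err))
    return errors

# Example: one 2 cm translation target and one 90-degree rotation target.
targets  = [((0.020, 0.0, 0.0), (0, 0, 0, 1)),
            ((0.000, 0.0, 0.0), R.from_euler('z', 90, degrees=True).as_quat())]
achieved = [((0.019, 0.001, 0.0), (0, 0, 0, 1)),
            ((0.001, 0.000, 0.0), R.from_euler('z', 85, degrees=True).as_quat())]
for i, (dp, dr) in enumerate(pose_errors(achieved, targets)):
    print(f"target {i}: position error {dp*1000:.1f} mm, orientation error {np.degrees(dr):.1f} deg")
```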
Primary contact: Kaiyu Hang (kaiyu.hang@rice.edu)
Useful links: Competition details, Auto-Evaluator GitHub Repository
Past edition: here
Reference benchmarking protocol:
Benchmarking In-Hand Manipulation
S. Cruciani, B. Sundaralingam, K. Hang, V. Kumar, T. Hermans, D. Kragic
IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 588-595, April 2020
https://doi.org/10.1109/LRA.2020.2964160 [arxiv]
The Picking in Clutter track will challenge teams to design solutions that, given a set of known and unknown objects placed randomly within a clear box, enable a robotic arm to pick each object and place it into a second clear box, fully clearing the first box within a maximum allowed time. Unlike the previous 9th RGMC edition, in which every team could choose the grasping sequence at its own convenience, this year the grasping order will be predefined: teams must follow the sequence of grasp commands issued through a ROS topic/service that mimics communication with a PLC, as in industrial environments. This added complexity means that extra strategies, such as pushing to declutter and searching for the target object within the clutter, will be required.
The goal is to advance the design of vision and manipulation systems that can handle heterogeneous and novel objects with different physical properties, even under the cluttered conditions of (sequential) bin-picking scenarios.
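As a rough illustration of that command interface, the minimal sketch below shows how a team's picking node might listen for the commanded grasp sequence over ROS. The topic name and message type are placeholders; the actual topic/service definition will be the one specified by the organizers.

```python
# Minimal sketch, for illustration only: a team's node receiving the commanded
# grasp sequence over ROS. The topic name and message type are hypothetical.
import rospy
from std_msgs.msg import String

def on_next_target(msg):
    target_object = msg.data  # identifier of the object that must be picked next
    rospy.loginfo("Commanded to pick: %s", target_object)
    # ... locate target_object in the clutter with the perception pipeline,
    # ... push/declutter if it is occluded, then plan and execute the grasp.

if __name__ == "__main__":
    rospy.init_node("picking_team_node")
    rospy.Subscriber("/rgmc/next_grasp_target", String, on_next_target)  # hypothetical topic
    rospy.spin()
```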
Primary contact: Salvatore D'Avella
Further details: This track is based on Stage 3 of the Cluttered Environment Picking Benchmark (CEPB), described in the paper referenced below. For more information on this track, please refer to the CEPB webpage at http://cepbbenchmark.eu/.
Reference benchmarking protocol:
The Cluttered Environment Picking Benchmark (CEPB) for Advanced Warehouse Automation: Evaluating the Perception, Planning, Control, and Grasping of Manipulation Systems
S. D'Avella, M. Bianchi, A. M. Sundaram, C. A. Avizzano, M. A. Roa, P. Tripicchio
IEEE Robotics and Automation Magazine, 2023
https://doi.org/10.1109/MRA.2023.3310861
The Human-to-Robot Handover track will challenge teams to design solutions that enable a robot to estimate the physical properties of an object while it is handed over by a person. The robot will then need to complete the handover by safely and stably receiving the object and delivering it to a predefined location on the table. The goal is to assess the generalization capabilities of the robotic control when receiving unknown objects that may or may not be filled with unknown content, and hence have a different and unknown mass and stiffness. Challenges include illumination variations, transparencies, reflective surfaces, occlusions caused by both the human and the robot, and achieving safe grasps, all while using an affordable experimental setup.
Primary contacts: Changjae Oh (c.oh@qmul.ac.uk) and Andrea Cavallaro
Further details: https://corsmal.github.io/rgmc2025-handover-track/
Past edition
Reference benchmarking protocol:
Benchmark for Human-to-Robot Handovers of Unseen Containers with Unknown Filling
R. Sanchez-Matilla, K. Chatzilygeroudis, A. Modas, N. Ferreira Duarte, A. Xompero, P. Frossard, A. Billard, A. Cavallaro
IEEE Robotics and Automation Letters, vol. 5, no. 2, April 2020
https://doi.org/10.1109/LRA.2020.2969200