The 11th RGMC edition will include four competition tracks, all of which will be run in person at the conference venue:
The In-hand Manipulation track will challenge teams to design robust solutions that reconfigure small-scale objects already grasped by a robot hand, for example moving the object through a sequence of predefined target positions or rotating it through a sequence of specified orientations. The goal is to advance such solutions, spanning hand design, planning, control, state estimation, and learning, through a standardized task.
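As a rough illustration of how a sequence of commanded targets of this kind can be scored, the minimal sketch below compares each measured object pose against the corresponding target and counts how many were reached within tolerance. The pose format, tolerance values, and scoring rule are assumptions for this example only, not the official Auto-Evaluator linked below.

```python
import numpy as np

def quat_angle_deg(q1, q2):
    """Smallest rotation angle (degrees) between two unit quaternions (x, y, z, w)."""
    d = abs(float(np.dot(q1 / np.linalg.norm(q1), q2 / np.linalg.norm(q2))))
    return np.degrees(2.0 * np.arccos(min(1.0, d)))

def score_sequence(measured_poses, target_poses, pos_tol=0.005, rot_tol_deg=5.0):
    """Count targets reached within tolerance along the commanded sequence.

    Each pose is a (position[3], quaternion[4]) pair; the tolerances are
    illustrative placeholders, not the official competition thresholds.
    """
    reached = 0
    for (p_meas, q_meas), (p_tgt, q_tgt) in zip(measured_poses, target_poses):
        pos_err = np.linalg.norm(np.asarray(p_meas) - np.asarray(p_tgt))
        rot_err = quat_angle_deg(np.asarray(q_meas), np.asarray(q_tgt))
        if pos_err <= pos_tol and rot_err <= rot_tol_deg:
            reached += 1
    return reached, len(target_poses)
```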
Primary contact: Kaiyu Hang (kaiyu.hang@rice.edu)
Useful links: Competition details, Auto-Evaluator GitHub Repository
Past edition: here
Reference benchmarking protocol:
Benchmarking In-Hand Manipulation
S. Cruciani, B. Sundaralingam, K. Hang, V. Kumar, T. Hermans, D. Kragic
IEEE Robotics and Automation Letters, vol. 5, no. 2, pp. 588-595, April 2020
https://doi.org/10.1109/LRA.2020.2964160 [arxiv]
The Picking in Clutter track will challenge teams to design solutions that, given a set of known and unknown objects placed randomly in a clear box, enable a robotic arm to pick each object and place it into a second clear box within a maximum allowed time, until the first box is fully cleared. Unlike the corresponding track of the 9th RGMC, where each team could choose its own grasp sequence, this year the grasping order will be predefined: teams must follow the sequence of grasp commands issued through a ROS topic/service that mimics communication with a PLC, as in industrial environments. This adds complexity, since strategies such as pushing to declutter and searching for the target object in the clutter will be required.
The goal is to advance the design of vision and manipulation systems that can handle heterogeneous and novel objects with different physical properties under the cluttered conditions of (sequential) bin-picking scenarios.
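A minimal sketch of what listening for such a commanded grasp sequence might look like is given below. The topic name and the use of a std_msgs/String message are assumptions made for illustration; the official interface will be specified by the organizers.

```python
#!/usr/bin/env python
# Minimal sketch: receive the commanded picking order over a ROS topic.
# The topic name "/rgmc/next_target" and std_msgs/String are illustrative
# assumptions, not the official competition interface.
import rospy
from std_msgs.msg import String

class GraspSequenceFollower(object):
    def __init__(self):
        self.pending = []  # object IDs to pick, in the commanded order
        rospy.Subscriber("/rgmc/next_target", String, self.on_command)

    def on_command(self, msg):
        # Each message names the next object that must be picked (PLC-like sequencing).
        rospy.loginfo("Commanded to pick: %s", msg.data)
        self.pending.append(msg.data)

if __name__ == "__main__":
    rospy.init_node("grasp_sequence_follower")
    GraspSequenceFollower()
    rospy.spin()  # the team's picking pipeline would consume self.pending in order
```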
Primary contact: Salvatore D'Avella
Further details: This track is designed as Stage 3 of the Cluttered Environment Picking Benchmark (CEPB) described in the paper referenced below. For more information, please refer to the CEPB webpage at http://cepbbenchmark.eu/.
Reference benchmarking protocol:
The cluttered environment picking benchmark (CEPB) for advanced warehouse automation: evaluating the perception, planning, control, and grasping of manipulation systems
S. D'Avella, M. Bianchi, A. M. Sundaram, C. A. Avizzano, M. A. Roa, P. Tripicchio
IEEE Robotics and Automation Magazine, 2023
https://doi.org/10.1109/MRA.2023.3310861
The Human-to-Robot Handovers track will challenge teams to design solutions that enable a robot to estimate the physical properties of an object while it is being handed over by a person. The robot will then need to complete the handover by safely and stably receiving the object and delivering it to a predefined location on the table. The goal is to assess how well the robot control generalizes when receiving unknown objects that may or may not be filled with unknown content, and therefore have an unknown mass and stiffness. Challenges include illumination variations, transparencies, reflective surfaces, occlusions caused by both the human and the robot, and achieving safe grasps, all while using an affordable experimental setup.
Primary contacts: Changjae Oh (c.oh@qmul.ac.uk) and Andrea Cavallaro
Further details: https://corsmal.github.io/events/rgmc/icra2026
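One of several possible ways to estimate a property such as the filled mass is to read a wrist-mounted force/torque sensor once the object is held still; the sketch below illustrates that idea under those assumptions and is not part of the official benchmark protocol.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def estimate_mass(fz_samples_grasped, fz_bias):
    """Estimate object mass (kg) from vertical wrist force readings (N).

    fz_samples_grasped: force-z samples taken while the object is held still.
    fz_bias: force-z reading of the empty gripper (sensor bias plus gripper weight).
    The median gives some robustness to contact transients during the handover.
    This is an illustrative assumption, not the benchmark's prescribed method.
    """
    fz = np.median(np.asarray(fz_samples_grasped))
    return max(0.0, (fz - fz_bias) / G)
```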
Reference benchmarking protocol:
Benchmark for Human-to-Robot Handovers of Unseen Containers with Unknown Filling
R. Sanchez-Matilla, K. Chatzilygeroudis, A. Modas, N. Ferreira Duarte, A. Xompero, P. Frossard, A. Billard, A. Cavallaro
IEEE Robotics and Automation Letters, vol. 5, no. 2, April 2020
https://doi.org/10.1109/LRA.2020.2969200
The Mobile Manipulation track will focus on an assembly task performed with a mobile platform. A version of the NIST task boards will be used, and the mobile robot will be expected to identify industrial parts, pick them up, and insert them into target locations. The goal is to assess the manipulation capabilities of the mobile robot while it copes with varying camera viewpoints and pose uncertainties caused by the moving platform. Some tasks will require base-arm coordination to retrieve and insert the objects.
This track will build on PAL Robotics' prior experience organizing robotics competitions (see https://pal-robotics.com/blog/tag/robotics-competition/), together with the benchmarking protocol and scoring methods referenced below.
Primary contact: Narcis Miguel (narcis.miguel@pal-robotics.com)
Further details: (coming soon)
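Because the camera moves with the base, detections are typically expressed in a camera frame and must be re-expressed in a frame suitable for manipulation planning. The short ROS (tf2) sketch below illustrates that step; the frame names are assumptions for this example and are not prescribed by the track.

```python
#!/usr/bin/env python
# Minimal sketch: re-express a detected part pose from the (moving) camera frame
# into the arm's planning frame using tf2. Frame names are illustrative assumptions.
import rospy
import tf2_ros
import tf2_geometry_msgs  # registers PoseStamped support for tf2 transforms

rospy.init_node("part_pose_in_planning_frame")
buf = tf2_ros.Buffer()
listener = tf2_ros.TransformListener(buf)  # keeps the buffer filled in the background

def to_planning_frame(detected_pose, planning_frame="base_footprint"):
    """detected_pose: geometry_msgs/PoseStamped expressed in the camera frame."""
    tf = buf.lookup_transform(planning_frame,
                              detected_pose.header.frame_id,
                              detected_pose.header.stamp,
                              rospy.Duration(0.5))
    return tf2_geometry_msgs.do_transform_pose(detected_pose, tf)
```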
Reference benchmarking protocol:
Performance measures to benchmark the grasping, manipulation, and assembly of deformable objects typical to manufacturing applications
K. Kimble, J. Albrecht, M. Zimmerman, J. Falco
Frontiers in Robotics and AI, vol. 9, 2022