Human manipulation relies on a combination of dexterous hands, tactile sensing, and visual feedback to maintain controlled interactions with objects, even without a firm grasp, as in the case of a waiter carrying a tray. Recent robotic advances aim to replicate these skills through vision, tactile, and force sensors for both in-hand and whole-body manipulation, including multi-arm coordination. Key control strategies include object tracking, force control, internal force estimation, and non-prehensile manipulation.
Robots must modulate grasping forces to handle both rigid and deformable objects without damage. While parallel grippers dominate industrial applications due to their simplicity, recent research demonstrates that even these devices can perform complex tasks using extrinsic dexterity, for example, exploiting gravity to pivot or slide objects within the gripper.
In cluttered environments, intrinsic dexterity becomes critical. Direct grasping may be infeasible, making non-prehensile actions such as pushing or sliding more effective for repositioning objects. Although more complex, such actions are part of our daily behavior and offer robots a broader range of manipulation strategies.
Visual feedback supports object tracking and grasp planning in both prehensile and non-prehensile scenarios. However, visual sensing can be hindered by occlusions or poor lighting conditions. Tactile sensing, while requiring contact, provides robust feedback. Combining both modalities can overcome their individual limitations, enabling more adaptive and resilient control.
This workshop will highlight cutting-edge research in dynamic manipulation, showcasing results from the DARC consortium and the broader robotics community, with a focus on integrating force/tactile and visual feedback for dexterous robotic tasks. It will cover a wide range of topics related to dynamic manipulation using artificial vision and touch. The key areas include:
In-hand manipulation strategies
Non-prehensile manipulation: pushing, sliding, and transporting
Grasping of deformable and rigid objects
Tactile sensing: contact and slip detection
Visual perception: object tracking, pose estimation, and challenges
Feedback control using tactile and visual data
Modeling, estimation, and control of multi-arm setups
Keywords: Dynamic and Non-Prehensile Manipulation; Extrinsic Dexterity; Robotic Vision; Perception for Manipulation; Manipulation of Deformable Objects; Multi-arm Manipulation.
Lorenzo Natale, Italian Institute of Technology. Title of the talk: "Domain adaptation for learning tactile object features with vision-based sensors"
Alessandro Marino, Department of Electrical and Information Engineering "Maurizio Scarano", Università degli Studi di Cassino e del Lazio Meridionale. Title of the talk: "Coordinated multi-arm control for manipulation in complex settings"
Alessio Caporali, Department of Electrical, Electronic, and Information Engineering "Guglielmo Marconi", Alma Mater Studiorum Università di Bologna. Title of the talk: "Model-based Tracking of Deformable Linear Objects in Challenging Environments"
Alessandro Saccon, Mechanical Engineering Department, Eindhoven University of Technology (TU/e). Title of the talk: "The Quest for Robust Contact-Rich and Impact-Aware Manipulation: On Control Theory, Physics Simulation, Robot Learning, and Suitable Hardware"
Maria Pozzi, Department of Information Engineering and Mathematics, Università di Siena. Title of the talk: "Soft manipulation: the role of mechanical intelligence in robotic grasping"
Department of Engineering, University of Campania "Luigi Vanvitelli", Italy
Department of Information Technology and Electrical Engineering, University of Naples Federico II, Italy
Department of Information Technology and Electrical Engineering, University of Naples Federico II, Italy
Department of Engineering, University of Campania "Luigi Vanvitelli", Italy
This workshop is supported by the European Union - NextGenerationEU - Piano Nazionale di Ripresa e Resilienza (PNRR), Mission 4 Component 2, Investment 1.1, Call PRIN 2022, under Project Code P2022MHR5C (DARC)