4th International Workshop on

Observing and Understanding Hands in Action

The fourth edition of this workshop aims to gather researchers working on hand detection and hand pose estimation and their applications. This edition emphasizes hand-object interaction and robot grasping/manipulation (*). The development of RGB-D sensors and the miniaturization of cameras (wearable cameras, smartphones, ubiquitous computing) have opened the door to a whole new range of technologies and applications that require detecting hands and recognizing hand poses in a variety of scenarios, including AR/VR and assistive car driving. Most hand tracking datasets and papers have focused on near-range, front-on scenarios, yet many challenges remain beyond that setting. Our goal is to push the boundaries of 3D hand articulation estimation/tracking and to evaluate a breadth of applications, including sign language recognition, desktop interaction, egocentric views, object manipulation, far-range capture, and over-the-shoulder driver footage.

Relevant topics of the workshop include:

  • Robot grasping and object manipulation
  • Imitation learning, reinforcement learning
  • Hand-object interaction
  • 3D/2D hand detection/segmentation
  • 3D/2D hand pose and gesture recognition
  • 3D articulated hand tracking
  • Hand modelling and rendering
  • Hand activity recognition
  • Gesture interfaces
  • Egocentric vision systems
  • Structured prediction, regression, and other relevant theories/algorithms
  • Applications of hand pose estimation in AR/VR
  • Applications of hand pose estimation in robotics and haptics
  • Driver hand activity analysis

(*) Those interested in the "object" and "manipulation" parts may also wish to attend the R6D workshop, which will take place in the same room as our workshop but during the morning session.

Previous workshop editions: HANDS 2015, HANDS 2016, HANDS 2017.

Workshop sponsored by: