Challenges


Abstract

We present the HANDS23 challenge for hand pose estimation, based on AssemblyHands and ARCTIC. In line with this year's theme, the challenge consists of two tasks and focuses on hand pose estimation in occlusion and interaction scenarios.

Our first task focuses on egocentric 3D hand pose estimation from a single-view image. This setting has been less explored due to the limited number of benchmarks, yet it has great potential for enabling AR/VR applications and is attracting growing attention. We use the recently introduced AssemblyHands dataset, which provides multi-view captured videos of hand-object interaction during the assembly and disassembly of toy vehicles. In particular, it offers both static and egocentric recordings, as well as auxiliary cues such as action, object, and context information for hand pose estimation.
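3D hand pose estimation results are commonly scored with the mean per-joint position error (MPJPE), the average Euclidean distance between predicted and ground-truth joints. The sketch below illustrates this standard metric; the 21-joint layout and millimeter units are common conventions assumed here, not specifics of the HANDS23 evaluation protocol.

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error between predicted and ground-truth joints.

    pred, gt: arrays of shape (num_joints, 3), e.g. 21 hand joints in mm.
    Returns the average Euclidean distance across joints.
    """
    return float(np.linalg.norm(pred - gt, axis=-1).mean())

# Example: a prediction uniformly offset by 5 mm along x from the ground truth.
gt = np.zeros((21, 3))
pred = gt + np.array([5.0, 0.0, 0.0])
print(mpjpe(pred, gt))  # -> 5.0
```

Challenge leaderboards often report this error after aligning the predicted root joint (e.g. the wrist) to the ground truth, which removes global translation from the score.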

Our second task focuses on consistent motion reconstruction based on ARCTIC, a dataset of hands dexterously manipulating articulated objects. It contains videos from eight third-person views and one egocentric view. Besides accurate ground-truth 3D hand and object meshes, it also provides detailed contact information between the hands and objects during manipulation.
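A simple way to derive per-vertex contact labels from hand and object meshes is to threshold the distance from each hand vertex to its nearest object vertex. The brute-force sketch below is an illustration only; the function name and the 5 mm threshold are assumptions, not the ARCTIC annotation procedure.

```python
import numpy as np

def min_vertex_distances(hand_verts, obj_verts):
    """For each hand vertex, distance to the nearest object vertex.

    hand_verts: (n_hand, 3), obj_verts: (n_obj, 3), both in meters.
    Brute-force pairwise distances; real pipelines would use a KD-tree.
    """
    d = np.linalg.norm(hand_verts[:, None, :] - obj_verts[None, :, :], axis=-1)
    return d.min(axis=1)

# Hypothetical vertices: the first hand vertex is 2 mm from the object,
# the second is roughly 1 m away.
hand = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
obj = np.array([[0.0, 0.0, 0.002], [1.0, 1.0, 1.0]])
contact = min_vertex_distances(hand, obj) < 0.005  # 5 mm threshold (assumption)
print(contact)  # -> [ True False]
```

Consistency metrics for motion reconstruction can then check that vertices labeled as in contact stay close to the object surface across the whole sequence.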

Winners and prizes will be announced and awarded during the workshop.

Please visit the challenge pages for more details.

Challenges Page

Participation 

We follow the rules of previous challenges; more details can be found on the challenge page.