The Future of AR/VR Experiences: Hand Surface Tracking with Elasticity

Credit: Meta Research (Facebook)

03/01/2023

Dhananjay Kumbhar

PhD research scholar

We use our hands to operate and interact with the environment around us, and we also communicate with them, adopting nonverbal signals to clarify and emphasize our ideas. According to the authors, our hands are suited to both of these functions due to their high degree of articulation, which provides dexterity for manipulation and high bandwidth for communicating information.

"Many of the actions that we take with our hands involve self-contact and occlusion: shaking hands, making a fist, or interlacing our fingers while thinking. This use of our hands illustrates the importance of tracking hands through self-contact and occlusion for many applications in computer vision and graphics, but existing methods for tracking hands and faces are not designed to treat the extreme amounts of self-contact and self-occlusion exhibited by common hand gestures."

By extending recent advances in vision-based tracking and physically based animation, Breannan Smith et al. present the first algorithm capable of tracking high-fidelity hand deformations through self-contacting and self-occluding hand gestures.
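To give a feel for how a physics prior can constrain tracking, here is a deliberately simplified toy sketch (my own illustration, not the paper's method): vertices of a small 2D "surface" are fit to observed positions, while a spring (elastic) energy preserves rest edge lengths. When a vertex is occluded and has no observation, the elastic term alone determines its position, which is the core idea behind constraining surface tracking with elasticity. The function name `solve` and all parameters are hypothetical.

```python
import math

def solve(targets, visible, edges, rest, iters=1000, step=0.02,
          w_data=1.0, w_elastic=5.0):
    """Toy elasticity-constrained tracker (illustrative only).

    Minimizes  w_data * sum_i ||p_i - t_i||^2  over visible vertices
             + w_elastic * sum_e (|edge length| - rest length)^2
    by plain gradient descent on 2D vertex positions.
    """
    pos = [list(t) for t in targets]  # initialize at the observations
    for _ in range(iters):
        grad = [[0.0, 0.0] for _ in pos]
        # Data term: pull each *visible* vertex toward its observation.
        # Occluded vertices get no data force at all.
        for i, t in enumerate(targets):
            if visible[i]:
                grad[i][0] += 2.0 * w_data * (pos[i][0] - t[0])
                grad[i][1] += 2.0 * w_data * (pos[i][1] - t[1])
        # Elastic term: each edge acts as a spring resisting deviation
        # from its rest length, propagating constraints to hidden parts.
        for (i, j), r in zip(edges, rest):
            dx = pos[i][0] - pos[j][0]
            dy = pos[i][1] - pos[j][1]
            length = math.hypot(dx, dy) or 1e-9
            c = 2.0 * w_elastic * (length - r) / length
            grad[i][0] += c * dx; grad[i][1] += c * dy
            grad[j][0] -= c * dx; grad[j][1] -= c * dy
        for g, p in zip(grad, pos):
            p[0] -= step * g[0]
            p[1] -= step * g[1]
    return pos

if __name__ == "__main__":
    # Three-vertex chain; the middle vertex is occluded and its
    # "observation" (5, 5) is garbage, but the springs place it
    # plausibly between its visible neighbors.
    targets = [(0.0, 0.0), (5.0, 5.0), (2.0, 0.0)]
    visible = [True, False, True]
    edges = [(0, 1), (1, 2)]
    rest = [1.2, 1.2]
    print(solve(targets, visible, edges, rest))
```

The real system operates on dense 3D hand meshes with a proper elastic energy and contact handling, but the division of labor is the same: image evidence drives the visible surface, and physics fills in what the camera cannot see.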

It's still early stages, but this is a significant step toward the future of AR/VR experiences.

You can find out more about this exciting work at Meta (link)

A PDF version of the research paper is available from @facebook (find paper)

#AR #VR #facebook #meta #first #algorithm #deeplearning #machinelearning #neuralnetworks #nlp #robotics #ai #datascience