OpenPose
About OpenPose
OpenPose is an open-source, realtime system for multi-person 2D pose detection, including body, foot, hand, and facial keypoints.
Hand pose estimation is of interest in our HUGS research. OpenPose identifies multiple points on the human skeleton, including hand landmarks. It was developed on adult data, however, and in the limited exploration we have done this year we have found that it fails to recognize infant hands in supination (palm up) or when only the fingertips are visible. The latter case occurs when the baby's hand is viewed from the front while grasping the HUGS toy bar.
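One practical consequence: OpenPose reports a confidence score with every keypoint, and when a hand is not recognized those confidences collapse toward zero. A rough frame-level check like the one below (not part of OpenPose itself; the thresholds are assumptions) could help us flag the frames where hand detection has failed:

```python
def hand_detection_ok(hand_keypoints, min_conf=0.2, min_points=10):
    """Return True if enough hand keypoints were detected with confidence.

    hand_keypoints: flat list [x0, y0, c0, x1, y1, c1, ...] as OpenPose
    writes it (21 keypoints per hand, each an x, y, confidence triple).
    min_conf and min_points are assumed thresholds to tune on our data.
    """
    confidences = hand_keypoints[2::3]  # every third value is a confidence
    detected = sum(1 for c in confidences if c >= min_conf)
    return detected >= min_points
```

Running this over a labeled HAI video would give us a quick per-frame map of where OpenPose loses the hands (e.g., during supination or toy-bar grasps).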
Below is the 2019 article by Cao et al. describing the system.
OpenPose Application to General Infant Pose Estimation
OpenPose has been used for General Movements Assessment (GMA) in infants. This application involves the infant's whole body, videoed from above with the infant supine. It does not address hand movements, since the GMA, unlike the HAI, does not focus specifically on them.
See the article by Reich et al. 2021 below for a description of how they used OpenPose to classify infants as typically developing or showing signs of cerebral palsy using the criteria of the GMA.
Basic Understanding of Infant Neurological Assessment
You should have a general familiarity with the GMA, as well as the HAI. This background will be part of the introduction to your research paper.
OpenPose Code Demo
You can find detailed information on how to install and run the OpenPose Demo Jupyter Notebook in Google Colab on GitHub here.
It's fairly straightforward to pull a demo notebook into your own Colab folder in your HUGS-LAB account and then to change the demo code to process videos of interest.
Here is a Jupyter notebook with additional comments (beyond the ones provided by the OpenPose team) relevant to infant video analysis.
It's set up to step you through modifying the demo code to process the videos you want to explore. All you have to do is copy it to your personal Colab folder and set up the required file paths.
This particular notebook creates an OpenPose-labeled video from an approximately 12-minute video designed for HAI (Hand Assessment for Infants) assessor training.
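Besides the labeled video, the demo can also write per-frame keypoints to JSON (OpenPose's --write_json option). A minimal sketch for collecting the hand keypoints from that output, assuming OpenPose's standard layout (one *_keypoints.json file per frame, hands as flat x, y, confidence triples) and a single infant per frame:

```python
import glob
import json
import os

def load_hand_keypoints(json_dir):
    """Collect per-frame left/right hand keypoints from OpenPose JSON output.

    Returns one dict per frame; each hand is a flat list of 21
    (x, y, confidence) triples, or [] when no person was detected.
    """
    frames = []
    for path in sorted(glob.glob(os.path.join(json_dir, "*_keypoints.json"))):
        with open(path) as f:
            data = json.load(f)
        people = data.get("people", [])
        if people:
            person = people[0]  # assumption: a single infant in frame
            frames.append({
                "left": person.get("hand_left_keypoints_2d", []),
                "right": person.get("hand_right_keypoints_2d", []),
            })
        else:
            frames.append({"left": [], "right": []})
    return frames
```

From there the keypoints can be plotted or exported for the hand-movement analyses we care about.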
Other Key OpenPose GitHub Links
Here is the original GitHub link for OpenPose.
https://github.com/CMU-Perceptual-Computing-Lab/openpose
Tan plans to use this version of OpenPose if possible: 1.7.0 (Nov 17, 2020), the latest release.
This is no longer part of the plan for the stroke functional movement study.
https://cmu-perceptual-computing-lab.github.io/openpose/web/html/doc/index.html
Hand Pose Estimation VERY Similar to What We Are Doing With HUGS Babies
https://docs.google.com/document/d/1N9XkJiQl5WWReke_uAPDl7Fz3miMVjsXgTC7FEhSjKY/edit?usp=sharing
Note the problems the study team encountered tracking hand supination/pronation and hand opening/closing. These issues are worth keeping in mind as we choose training data. See the annotation on page 12.
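One way to see the opening/closing problem in our own data: a crude hand-opening proxy, such as the thumb-tip to index-tip distance normalized by hand size, will be noisy or undefined exactly where the keypoints drop out. A sketch, assuming OpenPose's 21-point hand layout (0 = wrist, 4 = thumb tip, 8 = index tip, 9 = base of the middle finger) and an assumed confidence threshold:

```python
import math

# Indices in OpenPose's 21-point hand model (flat x, y, confidence triples).
WRIST, THUMB_TIP, INDEX_TIP, MIDDLE_BASE = 0, 4, 8, 9

def hand_opening(hand_keypoints, min_conf=0.2):
    """Thumb-tip to index-tip distance, normalized by the wrist-to-middle-base
    length as a rough hand-size scale. Returns None when any required
    keypoint is below the (assumed) confidence threshold."""
    def point(i):
        x, y, c = hand_keypoints[3 * i: 3 * i + 3]
        return (x, y) if c >= min_conf else None

    pts = [point(i) for i in (WRIST, THUMB_TIP, INDEX_TIP, MIDDLE_BASE)]
    if any(p is None for p in pts):
        return None  # hand not tracked well enough in this frame
    wrist, thumb, index, middle = pts
    scale = math.dist(wrist, middle)
    return math.dist(thumb, index) / scale if scale > 0 else None
```

Plotting this value over time, with gaps where it returns None, would show directly how often the tracking failures the study team describes occur in our HUGS videos.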