This project presents an image-based 3D human joint angle estimation system trained on accurate 3D joint angles of daily activities. We built a new 3D joint angle dataset that provides videos captured by 28 digital cameras, together with accurate annotations of 3D joint angles and 3D joint positions extracted from a motion capture system. Given only a single image from the frontal or side view, the system is asked to estimate the 3D angles of the lower-limb joints.
The human joint angle estimation system can serve many applications, such as assisting elderly people in daily walking, tracking patients' rehabilitation progress, improving athletes' performance, developing sports equipment, and detecting joint problems at an early stage.
In summary, each subject performed all actions within 22 minutes, so each camera captures 22 minutes of video at 60 frames per second per subject. In total, the dataset includes 7,180,800 frames and is stored on SSD disks with a storage size of 8.131 TB. The provided annotations include 3D joint positions in global and camera (monocular) coordinates, 3D joint angles, 2D joint locations on images, and human bounding boxes for every image.
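The per-camera frame count follows directly from the recording length and frame rate. A quick back-of-the-envelope sketch (the number of subjects is not stated in this excerpt, so only per-subject figures are computed; the variable names are illustrative):

```python
# Back-of-the-envelope check of the dataset's per-subject frame counts,
# using the figures stated in the text (22 minutes, 60 fps, 28 cameras).
MINUTES_PER_SUBJECT = 22   # recording time per subject
FPS = 60                   # frames per second per camera
NUM_CAMERAS = 28           # digital cameras in the capture setup

frames_per_camera_per_subject = MINUTES_PER_SUBJECT * 60 * FPS
frames_per_subject = frames_per_camera_per_subject * NUM_CAMERAS

print(frames_per_camera_per_subject)  # 79200 frames per camera per subject
print(frames_per_subject)             # 2217600 frames per subject, all cameras
```

Dividing the stated total of 7,180,800 frames by these per-subject figures would give the number of recorded subjects, which is not specified here.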