We prepared an offline Blender demo of our project. We set up five randomly placed cameras in a scene, along with a 3D human model that walks along a path, and calibrated the cameras using our multi-camera calibration module. We then captured 250 frames from the five cameras while the subject walked around the scene and fed these frames into our 2D human detection module to produce 2D keypoints. Finally, by passing the corresponding 2D keypoints from all views back through our multi-camera calibration module, we tracked the subject's 3D motion.
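
To illustrate the last step, the sketch below shows how corresponding 2D keypoints from calibrated views can be lifted to 3D with a standard Direct Linear Transform (DLT) triangulation. This is a minimal, self-contained example, not the actual module code: the function names, the assumption of five 3x4 projection matrices obtained from calibration, and the keypoint array layout are all illustrative.

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Triangulate one 3D point from its 2D observations in several
    calibrated views using the Direct Linear Transform (DLT).

    proj_mats : iterable of (3, 4) projection matrices P = K [R | t]
    points_2d : iterable of (u, v) pixel coordinates, one per view
    """
    A = []
    for P, (u, v) in zip(proj_mats, points_2d):
        # Each view adds two linear constraints on the homogeneous point X.
        A.append(u * P[2] - P[0])
        A.append(v * P[2] - P[1])
    A = np.asarray(A)
    # The least-squares solution is the right singular vector with the
    # smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize to (x, y, z)

def triangulate_skeleton(proj_mats, keypoints_per_view):
    """Triangulate every joint of a skeleton for one frame.

    keypoints_per_view : (num_views, num_joints, 2) array of 2D keypoints
    returns            : (num_joints, 3) array of 3D joint positions
    """
    kps = np.asarray(keypoints_per_view)
    num_joints = kps.shape[1]
    return np.stack([
        triangulate_point(proj_mats, kps[:, j, :]) for j in range(num_joints)
    ])
```

In the demo setting, this would be called once per frame with the five projection matrices from calibration and the per-view keypoints from the 2D detection module; stacking the per-frame results over the 250 frames yields the subject's 3D motion track.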