Head Tracker Demo
This is a demo of a head tracker based on a 3D cylindrical head model. The software is intended to serve as the input module for an automated head-animation system.
The implementation is built on the OpenCV computer vision library. The tracker uses the approach proposed by J. Xiao et al. for estimating head pose on each frame: a cylindrical head model combined with the forward compositional image alignment algorithm. On a Pentium IV 3.0 GHz CPU it achieves an average speed of 15 fps, which is close to acceptable for real-time applications.
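The forward compositional algorithm repeatedly warps the incoming frame with the current pose estimate, measures the residual against a template, solves a small least-squares problem for an incremental warp, and composes that increment onto the estimate. As a rough illustration (not the demo's actual code), here is the idea reduced to the simplest possible case, estimating a 1D translation, where composing warps is plain addition:

```python
import math

def sample(signal, x):
    """Sample a discrete signal at a real-valued position (linear interpolation)."""
    x = max(0.0, min(len(signal) - 1.0, x))
    i = int(x)
    f = x - i
    if i + 1 > len(signal) - 1:
        return signal[i]
    return (1.0 - f) * signal[i] + f * signal[i + 1]

def estimate_shift(template, image, iterations=50):
    """Find p such that image(x + p) best matches template(x), via Gauss-Newton."""
    p = 0.0
    for _ in range(iterations):
        num = den = 0.0
        for x in range(len(template)):
            warped = sample(image, x + p)
            # gradient of the warped image (central difference)
            g = 0.5 * (sample(image, x + p + 1) - sample(image, x + p - 1))
            err = template[x] - warped
            num += g * err
            den += g * g
        if den == 0.0:
            break
        dp = num / den
        p += dp  # for pure translation, composing warps is just addition
        if abs(dp) < 1e-7:
            break
    return p

# Synthetic check: a Gaussian bump shifted by 3 samples.
template = [math.exp(-(x - 20.0) ** 2 / 50.0) for x in range(60)]
image    = [math.exp(-(x - 23.0) ** 2 / 50.0) for x in range(60)]
print(estimate_shift(template, image))  # converges to approximately 3.0
```

In the 3D case the scalar shift becomes the six pose parameters, and the 1D gradient becomes image gradients mapped through the Jacobian of the cylinder's warp; the structure of the iteration is the same.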
When the user looks directly at the camera, his or her face is detected with the Viola-Jones face detector implemented in OpenCV. The head model is initialized from the detected face rectangle, and the head is then tracked automatically in subsequent frames.
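For illustration, here is one plausible way a cylinder model could be initialized from a face rectangle under a pinhole camera model. The constants (assumed head width, aspect-ratio scaling) and the formulas are assumptions of this sketch, not the demo's actual initialization; the principal-point offset is also omitted for brevity:

```python
HEAD_WIDTH_CM = 15.0  # assumed real-world head width (illustrative constant)

def init_cylinder(face_rect, focal_px):
    """face_rect = (x, y, w, h) in pixels.
    Returns (cx, cy, depth_cm, radius_cm, height_cm) of the cylinder."""
    x, y, w, h = face_rect
    # pinhole relation: w_px = focal_px * HEAD_WIDTH_CM / depth
    depth_cm = focal_px * HEAD_WIDTH_CM / w
    # back-project the rectangle center into camera coordinates
    cx = (x + w / 2.0) * depth_cm / focal_px
    cy = (y + h / 2.0) * depth_cm / focal_px
    radius_cm = HEAD_WIDTH_CM / 2.0
    height_cm = HEAD_WIDTH_CM * (h / w)  # scale height by the rect's aspect ratio
    return (cx, cy, depth_cm, radius_cm, height_cm)

# A 150x150 px face at 600 px focal length gives a head roughly 60 cm away.
print(init_cylinder((300, 200, 150, 150), focal_px=600.0))
```

The key point is that a single detected rectangle fixes all the degrees of freedom the tracker needs to start: position in depth via the pinhole relation, and the cylinder's extent from the rectangle's size.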
If the user moves the head too quickly or the lighting conditions change, tracking may fail. In that case, the user can click the video window to force model re-initialization.
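Conceptually, the tracker alternates between a detection state and a tracking state. A minimal sketch of such a control loop is below; the residual threshold and automatic fallback are illustrative assumptions (the demo itself relies on a manual click to recover):

```python
from enum import Enum

class State(Enum):
    DETECTING = 0  # waiting for a frontal face to (re)initialize the model
    TRACKING = 1   # cylinder model is aligned frame to frame

def step(state, face_found=False, residual=0.0, clicked=False, max_residual=0.5):
    """One control-loop iteration (illustrative sketch, not the demo's code)."""
    if clicked:
        return State.DETECTING  # user forced re-initialization
    if state is State.DETECTING:
        return State.TRACKING if face_found else State.DETECTING
    # tracking: fall back to detection when the alignment error grows too large
    return State.DETECTING if residual > max_residual else State.TRACKING
```

This also shows why re-initialization requires a frontal view: the detection state can only hand off to tracking once the face detector fires, and Viola-Jones cascades are trained on frontal faces.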
In the screenshot above you can see an OpenGL-based avatar. The avatar is a simple cylinder that mimics the movements of the head.
The estimated head pose on each video frame consists of the three coordinates of the head center and three rotation angles around the X, Y and Z axes, also called pitch, yaw and roll, respectively. Projection and model-view matrices are also generated; these matrices are used to render the avatar.
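As a sketch of how a model-view matrix can be assembled from these six parameters, here is one conventional way to do it. The rotation order (Z·Y·X) and the row-major layout below are assumptions of this sketch; the demo's actual conventions may differ:

```python
import math

def rot_x(a):  # pitch
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):  # yaw
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):  # roll
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def model_view(tx, ty, tz, pitch, yaw, roll):
    """Build a 4x4 model-view matrix (row-major) from the 6-DOF head pose."""
    r = matmul(rot_z(roll), matmul(rot_y(yaw), rot_x(pitch)))
    m = [[r[i][j] for j in range(3)] + [(tx, ty, tz)[i]] for i in range(3)]
    m.append([0.0, 0.0, 0.0, 1.0])
    return m
```

Note that OpenGL's fixed-function pipeline expects matrices in column-major order, so a matrix built this way would need to be transposed before being passed to `glLoadMatrixf`.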
Here is the list of people who participated in creating the demo:
Oleg A. Krivtsov - research and implementation
Vladimir S. Poponin - mathematics consultant
Anton A. Tushmintsev, Ilia V. Bezhodarnov - project overseers
Also thanks to Pavel D. Novodon for his constructive criticism.
System requirements: Pentium III or better, 256 MB of RAM, a web camera, Windows XP or later.
Note: This program was tested on Windows XP SP1 only.
Download and install the demo on your computer. After installation, open the Start Menu and choose All Programs | Tracker Demo. Then click either the Launch Demo (web-cam, manual init) shortcut or the Launch Demo (synthetic case, manual init) shortcut. The first uses your web camera, so you can evaluate the tracker on your own head; the second uses a synthetic video sequence.
In both cases, you can click the video window with your mouse to re-initialize the tracker. Note that you should look directly at the camera while re-initializing; otherwise the model may be initialized incorrectly.
Press ESC to exit the demo.
To uninstall the demo, open the Start Menu and choose All Programs | Tracker Demo | Uninstall.
If you don't have a web camera, you can make the demo track a head in an AVI video clip. For example, you can use one of the short clips available at ftp://csr.bu.edu/headtracking/. Here is an example video showing how the demo works on one of those clips.
Go to the C:\Program Files\Tracker Demo\bin directory and run the following from the command line: demo -avi <file_name>, where <file_name> is the name of your AVI clip.
For information on using the demo from the command line, type demo -?.
An example of software that can do similar (or even better) things is the WATSON head-tracking library by Louis-Philippe Morency. It supports several operating systems, provides a GLUT-based user interface, works with USB and stereo cameras, and has many configurable parameters. When I ran WATSON on my machine, it ran a little slower than our demo, but that is because it uses various techniques to increase tracking robustness. Louis-Philippe claims that WATSON reaches almost 25 fps when working with a stereo camera.
Nice software, I like it.
Click here to see the list of publications related to this software.
If you are looking for tutorials on object tracking algorithms and sample source code, you might also want to see these Tutorials.