Resources

Technical Talks

Slides from my talk at the Inaugural Multimedia Frontiers Workshop at ACM MM '15 are available here.

Databases

My research has contributed to the following databases:

1. The NUSEF eye-tracking database for saliency analysis in images (contains eye movement recordings for 758 images, with at least 13 user recordings per image; see the sketch after this list for one way to turn such recordings into a fixation density map).

2. The Dynamic Headpose (DPOSE) dataset for evaluating head pose estimation algorithms on moving subjects (over 50K images capturing static and moving subjects with four synchronized cameras).

3. An eye-tracking database compiled from 24 subjects (6 males, 18 females) viewing 30 affective scenes from 10 movies, as described in our Journal of Vision (JoV) paper. Systematic differences in eye movement patterns were observed between neutral and emotional scenes, and significant differences were also observed between eye movements for movie scenes and for static key-frames extracted from those scenes. Kindly cite the JoV paper if you use these data. A description of the movie scenes is provided in the paper; the key-frames used in the study can be obtained by emailing me.

4. The DECAF database, comprising physiological (magnetoencephalogram, ECG, EMG and EOG) responses of 30 subjects to affective video stimuli (coming soon).
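For those new to eye-tracking data, here is a minimal Python sketch of how fixation recordings (such as those in the NUSEF database) are typically converted into a fixation density map, the usual ground truth for saliency evaluation. The input format, function name and Gaussian blur width below are illustrative assumptions, not the actual NUSEF release format.

    # Minimal sketch: build a fixation density ("saliency") map from eye-tracking
    # fixations. The (x, y) pixel-coordinate input assumed here is hypothetical,
    # NOT the actual NUSEF file format.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fixation_density_map(fixations, height, width, sigma=30):
        """Accumulate fixation points into a blurred, normalised density map.

        fixations : iterable of (x, y) pixel coordinates
        sigma     : Gaussian blur in pixels (an assumed value, roughly one
                    degree of visual angle; tune to your viewing setup)
        """
        fmap = np.zeros((height, width), dtype=np.float64)
        for x, y in fixations:
            xi, yi = int(round(x)), int(round(y))
            if 0 <= yi < height and 0 <= xi < width:
                fmap[yi, xi] += 1.0          # count fixations per pixel
        fmap = gaussian_filter(fmap, sigma=sigma)  # smooth into a density map
        if fmap.max() > 0:
            fmap /= fmap.max()               # normalise to [0, 1]
        return fmap

    if __name__ == "__main__":
        # Toy example: a few fixations clustered near the image centre.
        fixations = [(320, 240), (330, 250), (310, 235), (600, 100)]
        smap = fixation_density_map(fixations, height=480, width=640)
        print(smap.shape, smap.max())

The resulting map can be compared against a computational saliency map with standard metrics (e.g. AUC or correlation); the choice of blur width mainly reflects the eye tracker's accuracy and the viewing distance.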

Code (coming soon)