Human-Robot Interaction

1) Personality Emulation

This video shows two versions of robot behavior: "conscientious" and "neurotic". In a simple dialogue, the robot's reaction is clearly different. Note in particular the variation of the synthesized vocal expression selected by the system.


2) Facial Expression Synthesis

Retro-projected Mask

This DEMO shows emotion synthesis running on our prototype robotic head with a retro-projected mask.



3) Facial Expression Analysis/Classification

This DEMO shows in detail the extraction of Action Unit features from the image and the Bayesian facial expression classification implemented and running.

Suggestion: Watch this movie in full-screen. Due to its size, this movie may take a few minutes to load.
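The classification step described above can be sketched as a small naive Bayes decision over Action Unit (AU) activations. This is only an illustrative toy model; the AU numbers, probabilities, and class set below are assumptions, not the values used by the actual system.

```python
import math

# Toy naive Bayes sketch: given binary Action Unit (AU) activations extracted
# from the image, pick the expression class with the highest posterior.
# All probabilities below are illustrative assumptions.

# P(AU active | expression) for a few FACS Action Units (assumed values)
LIKELIHOODS = {
    "happy":   {"AU6": 0.9, "AU12": 0.95, "AU4": 0.05},
    "angry":   {"AU6": 0.1, "AU12": 0.05, "AU4": 0.90},
    "neutral": {"AU6": 0.2, "AU12": 0.20, "AU4": 0.20},
}
PRIORS = {"happy": 1 / 3, "angry": 1 / 3, "neutral": 1 / 3}

def classify(active_aus):
    """Return the expression maximizing log P(expr) + sum log P(AU_i | expr)."""
    best, best_score = None, -math.inf
    for expr, likes in LIKELIHOODS.items():
        score = math.log(PRIORS[expr])
        for au, p in likes.items():
            # Factor in each AU as active or inactive evidence.
            score += math.log(p if au in active_aus else 1.0 - p)
        if score > best_score:
            best, best_score = expr, score
    return best
```

For example, observing AU6 (cheek raiser) and AU12 (lip corner puller) active yields "happy", while AU4 (brow lowerer) alone yields "angry".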


4) Lip Synchronization test with random phrases

This DEMO shows an avatar performing audio and visual synthesis at the same time with lip synchronization. Both audio and facial emotion are set to neutral.
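The basic idea behind lip synchronization can be sketched as mapping the synthesizer's timed phonemes to mouth shapes (visemes) scheduled at the same start times, so mouth motion lines up with the audio. The phoneme labels, viseme names, and timings below are illustrative assumptions, not the system's actual tables.

```python
# Hypothetical phoneme-to-viseme table (ARPAbet-style phoneme labels assumed).
PHONEME_TO_VISEME = {
    "AA": "open", "IY": "wide", "UW": "round",
    "M": "closed", "B": "closed", "P": "closed",
    "F": "lip-teeth", "V": "lip-teeth", "sil": "rest",
}

def viseme_track(phonemes):
    """phonemes: list of (phoneme, start_seconds) pairs from the synthesizer.

    Returns a list of (viseme, start_seconds) mouth-shape commands, keeping
    the original timing so the face animation stays aligned with the audio.
    """
    return [(PHONEME_TO_VISEME.get(p, "rest"), t) for p, t in phonemes]
```

For instance, the word "ma" might arrive as [("M", 0.0), ("AA", 0.1)] and produce a closed mouth at 0.0 s followed by an open mouth at 0.1 s.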


5) Facial Expression Analysis and Imitation

Using a simple on-screen avatar to synthesize the facial expressions, this DEMO shows the Bayesian facial expression classifier at work. The classifier's output is connected directly to the reverse Bayesian network, which performs the synthesis process, resulting in imitation.
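The analysis-to-synthesis coupling described above can be sketched as an inverse mapping from the recognized expression label back to Action Unit targets for the avatar, closing the imitation loop. The expression labels, AU targets, and intensities below are illustrative assumptions only.

```python
# Synthesis (reverse) direction: expression label -> target AU intensities
# for the avatar, in 0..1. Values are assumed for illustration.
SYNTHESIS_AUS = {
    "happy":   {"AU6": 1.0, "AU12": 1.0},
    "angry":   {"AU4": 1.0, "AU7": 0.8},
    "neutral": {},
}

def imitate(classified_expression):
    """Feed the classifier's output straight into the synthesis mapping,
    returning the AU commands the avatar should display (imitation)."""
    return SYNTHESIS_AUS.get(classified_expression, {})
```

Because the same expression vocabulary is shared by both directions, whatever the classifier reports on the observed face is immediately reproduced on the avatar.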



6) Vocal Expression Analysis

Making SoPHIE angry by speaking loudly to it (turn sound on).

Without lip synchronization, in this DEMO SoPHIE only synthesizes the facial expression at the end, when it concludes that it should become angry.
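One simple way to realize the "concludes it should become angry" behavior is to monitor speech loudness (frame RMS energy) and switch the expression only after loudness stays high for several consecutive frames. The threshold and frame count below are assumed values for illustration; they are not taken from the actual system.

```python
import math

LOUD_THRESHOLD = 0.5   # RMS level treated as "shouting" (assumed value)
FRAMES_REQUIRED = 3    # consecutive loud frames before concluding anger (assumed)

def rms(frame):
    """Root-mean-square energy of one audio frame (list of samples)."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def detect_anger(frames):
    """Return True once FRAMES_REQUIRED consecutive loud frames are seen,
    so a single brief peak does not trigger the expression change."""
    streak = 0
    for frame in frames:
        streak = streak + 1 if rms(frame) > LOUD_THRESHOLD else 0
        if streak >= FRAMES_REQUIRED:
            return True
    return False
```

Requiring a sustained streak rather than a single loud frame is why, in the DEMO, the facial expression only changes at the end of the loud speech.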


Other Videos:

Horopter Segmentation and Automatic Tracking of ROI (camera mounted on a pan-tilt unit)

HoropterAndTracker - Copy.mpg

IMPEP and Virtual Reality Helmet