Project

Understanding human behavior from gaze in daily human-computer interaction

  • Description: this project studies the relationship between eye gaze and user interactions, using an off-the-shelf webcam.
  • Responsibility: investigate an innovative, non-intrusive approach to collecting and validating gaze-estimation data from daily human-computer interaction, and study gaze-hand coordination and its association with affect.
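One common way to collect validated gaze data non-intrusively is to treat mouse clicks as implicit gaze labels, since users tend to look at the point they click. The following is a minimal sketch of that idea, not the project's actual pipeline; the event format and the function name `collect_click_gaze_samples` are hypothetical.

```python
def collect_click_gaze_samples(events, max_gap_ms=150):
    """Pair mouse clicks with webcam frames as implicit gaze labels.

    Assumes the user looks at the click target, so a click at (x, y)
    paired with the nearest-in-time face frame yields a weakly
    validated (frame, gaze-target) training sample. `events` is a
    list of dicts with 'type' ('click' or 'frame'), 'ts' (ms),
    and payload fields -- a hypothetical format for illustration.
    """
    frames = [e for e in events if e["type"] == "frame"]
    samples = []
    for click in (e for e in events if e["type"] == "click"):
        # Find the webcam frame closest in time to this click.
        nearest = min(frames, key=lambda f: abs(f["ts"] - click["ts"]), default=None)
        if nearest and abs(nearest["ts"] - click["ts"]) <= max_gap_ms:
            samples.append((nearest["image"], (click["x"], click["y"])))
    return samples
```

Clicks with no sufficiently close frame are discarded, trading sample quantity for label quality.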

MelodicBrush

  • Description: this project studies user creativity and enjoyment in a cross-modal digital art system that links calligraphy writing with music generation.
  • Responsibility: work with Will Tang on the system design, implement a vision system that senses the precise brush-touch trajectory and pressure from depth information, and reconstruct the graphical ink effect on the display in real time.

Modeling facial affect with minimal human annotation

  • Description: this project studies user-dependent modeling of facial affect, generating fine-grained annotations from coarse-grained ones for weakly supervised learning.
  • Responsibility: propose multiple-instance learning and ensemble learning techniques to support personalized learning with minimal human annotation effort, and conduct a human-subject study of facial responses to video stimuli.
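In the multiple-instance setting, a whole video segment (a "bag" of frames) carries one coarse label, and frame-level labels must be inferred. A minimal sketch of the standard MIL assumption follows; the scoring function and `top_k` heuristic are illustrative, not the project's specific method.

```python
import numpy as np

def mil_frame_labels(frame_scores, bag_label, top_k=3):
    """Derive fine-grained frame labels from a coarse bag-level label.

    frame_scores: per-frame affect scores from a weak detector
    (hypothetical). Under the standard MIL assumption, a positive bag
    contains at least one positive instance, so only the top_k
    highest-scoring frames inherit the positive label; a negative bag
    labels every frame negative.
    """
    labels = np.zeros(len(frame_scores), dtype=int)
    if bag_label == 1:
        top = np.argsort(frame_scores)[-top_k:]  # indices of strongest frames
        labels[top] = 1
    return labels
```

The resulting frame labels can then train a per-user classifier without any frame-level human annotation.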

A multimodal approach to high-level mental state detection

  • Description: this project investigates a multimodal approach to attention detection in the context of human-computer interaction.
  • Responsibility: work with Hugo Sun on attention detection based on signals from the webcam, mouse, and keyboard while subjects are reading, typing, and searching.

Emotar: communicating feeling through video sharing

  • Description: this project creates a social video-sharing system with emotion augmentation, and studies its impact on sharing experience, social bonding, and sense of togetherness among friends.
  • Responsibility: work with Tiffany Kwok on the design of an asynchronous sharing platform, and apply facial affect recognition to enhance emotion awareness.

Physiological mouse

  • Description: this project explores the feasibility of using a physiology-aware mouse to facilitate the detection of user affect.
  • Responsibility: work with Eugene Fu on a mouse that detects the holder’s heartbeat and respiration, processes the signals in the frequency domain, and visualizes them in real time.
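Extracting heart and respiration rates in the frequency domain usually means locating the dominant FFT peak inside a physiologically plausible band. The sketch below shows that idea under assumed band limits; the function name and defaults are illustrative rather than the mouse's actual signal chain.

```python
import numpy as np

def estimate_rate_bpm(signal, fs, lo_hz=0.7, hi_hz=3.0):
    """Estimate a periodic rate from a physiological signal via the FFT.

    fs: sampling rate in Hz. Removes the DC offset, takes the FFT
    magnitude, and picks the dominant frequency within a plausible
    band (0.7-3 Hz, roughly 42-180 bpm, for heartbeat; a band around
    0.1-0.5 Hz would suit respiration). Returns beats per minute.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                      # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0
```

Restricting the search to a band keeps motion artifacts and slow drift from being mistaken for the pulse.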

Mobile DJ: a tangible, mobile platform for active and collaborative music listening

  • Description: this project creates a tangible, mobile platform for active music listening to augment the effect of social interaction.
  • Responsibility: work with Kin Lau on the design of the hardware interface.

Response-based interactive motion generation

  • Description: this project explores a simplified joint model for simulating physical responses.
  • Responsibility: study a simplified dynamics model for motion synthesis of animated characters, and implement a prototype system for validation.