Cross-modality mmWave Sensing
This project studies mmWave radar sensing and its applications. It fuses mmWave point clouds, vision data, and motion sensor data with advanced machine learning techniques for person re-identification, human parsing, and egocentric pose estimation. The project aims to achieve ubiquitous perception and understanding of human activities for human-centred intelligence in surveillance, smart control, AR/VR, and fitness tracking.
Zero-shot Sensing
This project tackles the labour-intensity and privacy challenges of collecting millimeter-wave (mmWave) training data. We introduce a zero-shot mmWave data synthesis approach that substantially reduces the need for human intervention in data acquisition. Drawing inspiration from the rapidly growing field of AI-generated content (AIGC), our approach employs an AI model to create a diverse range of human motions for mmWave data synthesis. This eliminates the dependency on real-world video or motion capture (MoCap) data, offering a streamlined, privacy-conscious path to developing advanced mmWave applications.
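The pipeline above could be sketched roughly as follows. This is a minimal, hypothetical illustration, not the project's actual implementation: `synthesize_motion` is a toy stand-in for an AI motion generator (a real system would use a text-to-motion model), and `synthesize_mmwave` is a simplified simulator that scatters noisy reflection points around each body joint and attaches a finite-difference radial (Doppler) velocity. All function names, joint choices, and parameters here are illustrative assumptions.

```python
import math
import random

def synthesize_motion(num_frames=30, fps=10.0):
    """Toy stand-in for an AI motion generator: per-frame 3D positions
    for a few body joints of a walking figure (hypothetical)."""
    frames = []
    for t in range(num_frames):
        phase = 2 * math.pi * t / fps
        x = 0.5 * t / fps  # forward walking speed of 0.5 m/s (assumed)
        frames.append({
            "torso":      (x, 0.0, 1.0),
            "left_hand":  (x + 0.2 * math.sin(phase),  0.2, 1.0),
            "right_hand": (x - 0.2 * math.sin(phase), -0.2, 1.0),
            "left_foot":  (x + 0.3 * math.sin(phase),  0.1, 0.1),
            "right_foot": (x - 0.3 * math.sin(phase), -0.1, 0.1),
        })
    return frames

def synthesize_mmwave(frames, fps=10.0, points_per_joint=8, noise=0.03, seed=0):
    """Turn joint trajectories into mmWave-like point clouds: scatter
    Gaussian-perturbed points around each joint and attach a radial
    velocity toward a radar assumed to sit at the origin."""
    rng = random.Random(seed)
    clouds = []
    for i in range(1, len(frames)):
        cloud = []
        for joint, (x, y, z) in frames[i].items():
            px, py, pz = frames[i - 1][joint]
            r = math.sqrt(x * x + y * y + z * z) or 1e-9
            r_prev = math.sqrt(px * px + py * py + pz * pz)
            v_r = (r - r_prev) * fps  # finite-difference Doppler velocity
            for _ in range(points_per_joint):
                cloud.append((x + rng.gauss(0, noise),
                              y + rng.gauss(0, noise),
                              z + rng.gauss(0, noise),
                              v_r))
        clouds.append(cloud)
    return clouds

frames = synthesize_motion()
clouds = synthesize_mmwave(frames)
print(len(clouds), len(clouds[0]))
```

Because both the motion and the radar reflections are generated, no human subject, camera, or MoCap rig is involved, which is the privacy argument the project makes; a real synthesizer would replace the Gaussian scattering with a physics-based radar model.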