Human Activity Recognition (HAR) is increasingly being applied in industrial settings to enhance the efficiency of manual labour. Processes such as assembly and packaging in factories and logistics centres still largely rely on human workers, and this reliance is expected to persist as a way of meeting dynamic shifts in demand. Consequently, accurately quantifying manual labour is essential for streamlining these operations.
In logistics centres, workers perform sequential tasks during packaging, including reading item labels and assembling boxes. Gaining insights into each step—such as its duration and identifying any irregularities—is vital for optimizing the packaging workflow. However, variations in task details, such as differences in item size or the number of items, present significant challenges in accurately analyzing these operations.
Using large datasets to train complex models is often challenging in real-world scenarios. Therefore, this challenge focuses on achieving Human Activity Recognition with limited data and computational resources.
Key Objective: Develop a virtual data generative method to improve Human Activity Recognition (HAR) using the OpenPack Dataset.
We will provide you with the raw sensor data and the HAR network. Your task is to design an algorithm to generate virtual data from the raw data (OpenPack, Scenario 1, both wrists) to enhance the performance of the HAR model.
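As a starting point, virtual data can be generated by augmenting the raw wrist-sensor windows. The sketch below is a minimal, illustrative example of such a generator; the function name, window shape, and augmentation parameters are assumptions for illustration, not part of the official challenge API.

```python
# Hypothetical sketch of a virtual-data generator for wrist IMU windows.
# Assumes each window is an array of shape (T, C): T time steps, C channels
# (e.g., 3-axis accelerometer readings from both wrists). Names and
# parameters are illustrative, not the official challenge interface.
import numpy as np

def generate_virtual_samples(x, n_copies=3, jitter_std=0.03,
                             scale_range=(0.9, 1.1), seed=0):
    """Return n_copies augmented versions of window x, shape (n_copies, T, C)."""
    rng = np.random.default_rng(seed)
    out = []
    for _ in range(n_copies):
        # Additive Gaussian noise simulates sensor jitter.
        noise = rng.normal(0.0, jitter_std, size=x.shape)
        # Per-channel amplitude scaling simulates variation in motion intensity.
        scale = rng.uniform(*scale_range, size=(1, x.shape[1]))
        out.append(x * scale + noise)
    return np.stack(out)
```

Augmentations like jittering and scaling preserve the activity label of a window, so the generated copies can be added directly to the training set for the provided HAR network.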
Example code is provided on GitHub to help you implement and test your idea effectively.
For submission, participants should upload a zip file containing the virtual folder and custom_functions.py to the following link: TBD
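One way to bundle the required files is sketched below. The folder and script names follow the submission instructions above, but the helper function itself and the output filename are assumptions for illustration.

```python
# Hedged sketch: packaging the "virtual" folder and custom_functions.py
# into a single zip for submission. The output name "submission.zip" is
# an assumption; use whatever the challenge organisers specify.
import zipfile
from pathlib import Path

def make_submission(virtual_dir="virtual", script="custom_functions.py",
                    out="submission.zip"):
    with zipfile.ZipFile(out, "w", zipfile.ZIP_DEFLATED) as zf:
        # Add the custom functions script at the archive root.
        zf.write(script, arcname=script)
        # Recursively add every file under the virtual-data folder.
        for p in Path(virtual_dir).rglob("*"):
            if p.is_file():
                zf.write(p, arcname=p.as_posix())
    return out
```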