Human Activity Recognition

One of the underlying foundations of smart applications is understanding a significant portion of the subject's characteristics; the encompassing name for this task is human activity recognition (HAR). In general, HAR aims at automatically detecting and recognizing a person's ongoing activities from sensory observation data.

Generally, inertial data, streamed from inertial measurement units (IMUs), are currently more commonplace than other modalities for the following reasons:

  • IMUs are currently embedded in almost all commodity mobile and wearable devices, such as smartphones and smartwatches.

  • IMUs are low cost.

  • Inertial data raise fewer security and privacy concerns than, for example, streamed video.

  • Being embedded in wearables and mobile devices, this data modality is continually active and operational. Compare this with, for example, surveillance cameras, which have a limited spatial scope of operation and are contingent on proper lighting conditions.

The main disadvantage of inertial motion data is that such data have low semantic content in contrast to visual data.

Generally, recognition of human activities has been a long-running research domain that has received increasing attention over the past few years. HAR systems aim at determining the ongoing activities of a person, a group of persons, or even a crowd based on sensory observation data, as well as some knowledge about the context within which the observed activities take place. In many cases, an activity is required to be recognized regardless of the environment in which it is performed or the person performing it. HAR systems can be classified based on the type of sensory information used, as the kind of sensory data greatly affects the kinds of features, algorithms, and modeling frameworks and architectures used for analysis. Generally, we can identify the following streams of research and development in HAR systems: (1) HAR systems based on visual data, (2) HAR systems based on motion sensors such as IMUs, and (3) HAR systems based on the received signal strength (RSS) from commodity routers installed in the surrounding environment.

The overall objective of our lab's research in this area is to build robust and accurate human activity recognition systems based on the IMU sensors onboard wearable devices, such as smartwatches, smartphones, and wrist bands. We conjecture that the accelerometer and gyroscope (rotational velocity) signals are complete, in the sense that these signals can uniquely (up to a certain small error margin) identify the vast majority of activities that are of interest. One goal of our work is to verify this claim, or at least to identify the activities for which it holds. Any such HAR system must achieve the following characteristics:

  1. It is accurate enough to be effective and usable.
  2. It is computationally feasible, since an online version will run on a mobile personal device (such as a smartphone).
  3. It is robust, in the sense that its performance (predictive accuracy, etc.) remains fixed (up to a certain error margin) and invariant to changes in the hardware and software characteristics of the sensing wearable device, as well as to its spatial location and other variations; in other words, the system should be calibration-free.
  4. It can effectively and continuously recognize, in real time, the current activity of the user as well as the switching points between different activities (a minimal sketch of this step follows the list).
  5. It can perform an offline statistical analysis of the user's activity pattern(s) over an extended period of time.
  6. It can detect, offline, abnormal changes in the user's typical behavioral pattern(s), both overall and in specific activities.
  7. It can detect anomalies online, such as falling.
  8. It can detect and recognize joint activities between the user and one or more companions.
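As a rough illustration of what item 4 entails, the Python sketch below segments a six-channel IMU stream (three-axis accelerometer plus three-axis gyroscope) into overlapping windows, extracts simple time-domain features, and classifies each window with an off-the-shelf classifier. The sampling rate, window length, feature set, and classifier here are placeholder assumptions for illustration only, not our actual design.

```python
# Minimal sliding-window HAR sketch. The 50 Hz sampling rate, 2-second
# windows, mean/std/min/max features, and the random-forest classifier
# are illustrative assumptions, not our lab's actual design.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FS = 50            # assumed sampling rate (Hz)
WIN = 2 * FS       # 2-second window
STEP = WIN // 2    # 50% overlap

def windows(stream, win=WIN, step=STEP):
    """Yield fixed-length sliding windows over a (T, 6) IMU array
    (3-axis accelerometer + 3-axis gyroscope)."""
    for start in range(0, len(stream) - win + 1, step):
        yield stream[start:start + win]

def featurize(stream):
    """Per-window, per-channel time-domain features: mean, std, min, max."""
    return np.array([np.concatenate([w.mean(axis=0), w.std(axis=0),
                                     w.min(axis=0), w.max(axis=0)])
                     for w in windows(stream)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins for one minute of labelled IMU data per activity.
    walking = rng.normal(0.0, 1.0, size=(FS * 60, 6))
    sitting = rng.normal(0.0, 0.1, size=(FS * 60, 6))

    X_walk, X_sit = featurize(walking), featurize(sitting)
    X = np.vstack([X_walk, X_sit])
    y = np.array([0] * len(X_walk) + [1] * len(X_sit))   # 0 = walking, 1 = sitting

    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

    # Online use: featurize each incoming window and predict its activity label.
    new_stream = rng.normal(0.0, 0.1, size=(WIN * 3, 6))
    print(clf.predict(featurize(new_stream)))
```

A deployed system would additionally need change-point detection to locate the switching points between activities, as well as a model light enough for on-device inference; both are omitted from this sketch.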