
Modeling Users' Behavior

Projects:

1. SenseMe: A system for multi-sensory and multi-dimensional context and activity recognition


Abstract:

In order to make context-aware systems more effective and provide timely, personalized and relevant information to a user, the context or situation of the user must be clearly defined along several dimensions. To this end, the system needs to simultaneously recognize multiple dimensions of the user's situation, such as location and physical activity, in an automated and unobtrusive manner. In this paper, we present SenseMe - a system that leverages a user's smartphone and its multiple sensors to perform continuous, on-device, and multi-dimensional context and activity recognition. It recognizes five dimensions of a user's situation in a robust, automated, scalable, power-efficient and non-invasive manner to paint a context-rich picture of the user. We evaluate SenseMe against several metrics with the aid of two two-week-long live deployments involving 15 participants. We demonstrate improved or comparable accuracy with respect to existing systems without requiring any user calibration or input.
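
As a rough illustration of what multi-dimensional recognition means in practice, the toy Python sketch below fuses two dimensions - a location fix and a physical-activity label derived from accelerometer readings - into a single context record. It is only a minimal sketch: the thresholds, features, and class names are hypothetical and are not SenseMe's actual algorithms, which are described in the MobiQuitous 2014 paper below.

    # Illustrative sketch only: a toy multi-dimensional "context snapshot" built from
    # smartphone sensor readings. The thresholds, features, and class names are
    # hypothetical and do not reflect SenseMe's actual recognition pipeline.
    import math
    from dataclasses import dataclass
    from statistics import pstdev

    @dataclass
    class ContextSnapshot:
        location: tuple          # (latitude, longitude) from the phone's location fix
        physical_activity: str   # e.g. "stationary", "walking", "running"

    def accel_magnitudes(samples):
        """Convert raw (x, y, z) accelerometer samples to magnitudes."""
        return [math.sqrt(x * x + y * y + z * z) for (x, y, z) in samples]

    def classify_activity(samples):
        """Toy threshold classifier on the variability of accelerometer magnitude."""
        spread = pstdev(accel_magnitudes(samples))
        if spread < 0.5:
            return "stationary"
        elif spread < 3.0:
            return "walking"
        return "running"

    def build_snapshot(location_fix, accel_window):
        """Fuse two dimensions (location, physical activity) into one context record."""
        return ContextSnapshot(location=location_fix,
                               physical_activity=classify_activity(accel_window))

    # Example with a short window of made-up accelerometer samples and a location fix
    window = [(0.1, 0.2, 9.8), (0.3, 0.1, 9.7), (0.2, 0.2, 9.9)]
    print(build_snapshot((38.99, -76.94), window))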

Related publications:

a. Preeti Bhargava, Nick Gramsky, Ashok Agrawala, SenseMe: A System for Continuous, On-Device, and Multi-dimensional Context and Activity Recognition, Proceedings of the 11th International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (MobiQuitous 2014) [PDF]
b. Preeti Bhargava, Nick Gramsky, Ashok Agrawala, To Sense or not to Sense: An Exploratory Study of Privacy, Trust and other related concerns in Personal Sensing Applications, EAI Endorsed Transactions on Context-aware Systems and Applications, 2016 [PDF]

2. Modeling Users’ Behavior from Large Scale Smartphone Data Collection


Abstract:

A large volume of research in mobile and ubiquitous systems has been devoted to using data sensed from users' smartphones to infer their current high-level context and activities. However, mining users' diverse longitudinal behavioral patterns from this data, which can enable exciting new context-aware applications and services, has not received much attention. In this paper, we focus on learning diverse patterns from large-scale data collected from users' smartphones. We utilize these patterns to help identify a variety of users' behaviors, habits, and daily life places and activities. To this end, we develop a unified infrastructure and implement several novel approaches and algorithms that employ various contextual features and state-of-the-art machine learning techniques for building diverse behavioral models of users. Examples of generated models include classifying users' semantic places and mobility states, predicting their availability for accepting calls, and inferring their device charging behavior. We evaluate our work on large-scale real-world smartphone data of 200 users from the DeviceAnalyzer dataset, consisting of 365 million data points. We show that our algorithms and approaches can model user behavior with high accuracy and outperform existing approaches.
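
As a hedged illustration of the general shape of such behavioral models (not the paper's actual features or algorithms), the toy Python sketch below trains a small classifier for one of the behaviors mentioned above - device charging - from a few hypothetical contextual features.

    # Illustrative sketch only: a toy classifier for charging behavior. The features,
    # labels, and model choice are hypothetical and are not the approach evaluated in
    # the paper; they only show the general shape of such a modeling pipeline.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Hypothetical feature vectors: [hour_of_day, battery_level_pct, screen_on_minutes_last_hour]
    X = np.array([
        [22, 15, 40], [23, 10, 5], [9, 80, 20], [14, 60, 55],
        [21, 25, 30], [8, 90, 10], [13, 70, 45], [0, 12, 2],
    ])
    # Labels: 1 = the device was plugged in within the next hour, 0 = it was not
    y = np.array([1, 1, 0, 0, 1, 0, 0, 1])

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
    model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
    print("toy accuracy:", accuracy_score(y_test, model.predict(X_test)))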

Related publications:

a. Preeti Bhargava, Ashok Agrawala, Modeling Users' Behavior from Large Scale Smartphone Data Collection, EAI Endorsed Transactions on Context-aware Systems and Applications, 2016 [PDF]