EventSleep: Sleep Activity Recognition
with Event Cameras
Carlos Plou, Nerea Gallego, Alberto Sabater, Eduardo Montijano, Pablo Urcola, Luis Montesano, Rubén Martinez-Cantin, Ana Murillo
RoPert, DIIS - I3A, University of Zaragoza; Bitbrain Technologies, Zaragoza, Spain
Project Context
DeepSleep is a collaborative project between the RoPert research lab at the University of Zaragoza and Bitbrain, a neurotechnology company that combines neuroscience, artificial intelligence, and hardware to help research, tech, and health professionals leverage neuroscience in a practical and reliable way.
As part of a broader study centered on sleep, this project investigates the suitability of event cameras for analyzing, in a non-invasive manner, specific behaviors that occur during sleep and are linked to sleep disorders.
Introduction
Event cameras are a promising technology for activity recognition in dark environments due to their unique properties. However, real event camera datasets captured under low-lighting conditions are still scarce, which also limits the number of approaches for solving this kind of problem, hindering the potential of this technology in many applications. We present EventSleep, a new dataset and methodology to address this gap and study the suitability of event cameras for a very relevant medical application: sleep monitoring for sleep disorder analysis.
Dataset Content
The EventSleep dataset is composed of video sequences of common sleeping movements, recorded with an infrared camera and an event camera. The use of a realistic recording scenario makes EventSleep the first event-based activity recognition dataset that includes event sequences recorded in darkness. The EventSleep dataset is already available on Synapse; however, access to the infrared data requires a formal request by email. Further details can be found on the Synapse Wiki page (section: To request Infrared data access).
The proposed dataset includes video sequences of 14 different participants, 3 different setup configurations with varying lighting conditions and occlusions, and 10 different fine-grained activity classes.
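Given these dimensions, every trial can be indexed by a (participant, configuration) pair. The sketch below illustrates this; the identifier format (`P01_Config1`) is a hypothetical naming scheme for illustration only, and the actual file layout on Synapse may differ:

```python
from itertools import product

# Dataset dimensions stated above; the naming scheme is hypothetical.
PARTICIPANTS = range(1, 15)  # 14 participants
CONFIG_IDS = range(1, 4)     # 3 setup configurations
NUM_CLASSES = 10             # 10 fine-grained activity classes

def trial_id(participant: int, config: int) -> str:
    """Build an illustrative trial identifier like 'P03_Config1'."""
    return f"P{participant:02d}_Config{config}"

all_trials = [trial_id(p, c) for p, c in product(PARTICIPANTS, CONFIG_IDS)]
print(len(all_trials))  # 14 participants x 3 configs = 42 trials
```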
Cameras
Trials were recorded with two different cameras: an event camera (DVXplorer, 640 × 480 resolution) and an infrared camera (ELP HD Digital Camera, 6 fps). Both cameras were mounted together on a metal bar, facing down and located 2 meters above the center of the bed, ensuring full observation of the participants' movements.
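For quick reference, the recording rig can be summarized programmatically. The dictionary structure below is purely illustrative; only the model names, resolution, frame rate, and mounting height come from the description above:

```python
# Recording setup; field names are illustrative, values are from the text.
CAMERAS = {
    "event": {"model": "DVXplorer", "resolution": (640, 480)},
    "infrared": {"model": "ELP HD Digital Camera", "fps": 6},
}
RIG_HEIGHT_M = 2.0  # both cameras mounted 2 m above the bed center

width, height = CAMERAS["event"]["resolution"]
print(width * height)  # 307200 pixels in the event sensor
```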
Scene Configurations
EventSleep was recorded in a room that mimics a regular bedroom with a double bed. Each participant was recorded under 3 settings with varying conditions:
Config1: The subject is covered by an eiderdown under full darkness.
Config2: The subject is covered by an eiderdown under partial darkness.
Config3: The subject is uncovered under full darkness.
The full-darkness setup (≤ 0.1 lux) is a bedroom with all lights off and the door and window closed, with dense blinds covering the window. The partial-darkness setup (0.2 lux) adds a small night lamp, switched on and placed on the floor, far away from the subject's head.
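The three settings can be encoded as a small lookup table. The keys and field names below are an illustrative sketch; only the covering and illuminance values come from the description above:

```python
# Recording configurations; structure is illustrative, values from the text.
CONFIGS = {
    "Config1": {"covered": True,  "max_lux": 0.1},  # eiderdown, full darkness
    "Config2": {"covered": True,  "max_lux": 0.2},  # eiderdown, night lamp on
    "Config3": {"covered": False, "max_lux": 0.1},  # uncovered, full darkness
}

# Select the configurations recorded in full darkness.
dark = [name for name, c in CONFIGS.items() if c["max_lux"] <= 0.1]
print(dark)  # ['Config1', 'Config3']
```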
Activity Labels
All subjects were instructed to execute a certain set of movements and positions during each trial. These movements were carefully selected based on their potential relevance to medical research as part of a larger biomedical study project.
Within this context, experts in the field identified specific actions that appear in normal sleep (e.g., transitions between sleep positions) and in sleep disorders such as restless legs syndrome or periodic limb movement disorder (e.g., shaking the legs). Since many sleep studies require the recording of physiological information, we also considered positions that could impact the performance of other sensors, such as a headband for polysomnography. The final set of 10 activity labels considered is: