EPIC-Tent: An Egocentric Video Dataset for Camping Tent Assembly
Figure: Example timeline of activities and uncertainty ratings for a single participant
Sample Videos
Abstract
This paper presents an outdoor video dataset annotated with action labels, collected from 29 participants wearing two head-mounted cameras (GoPro and SMI eye tracker) while assembling a camping tent. In total, this is over 7 hours of recordings. Tent assembly includes manual interactions with non-rigid objects such as spreading the tent, securing guylines, reading instructions, and opening a tent bag. An interesting aspect of the dataset is that it reflects participants' proficiency in completing or understanding the task, which leads to differences in action sequences and action durations across participants. Our dataset, called EPIC-Tent, also has several new types of annotations for two synchronised egocentric videos. These include task errors, self-rated uncertainty and gaze position, in addition to the task action labels. We present baseline results on the EPIC-Tent dataset using a state-of-the-art method for offline and online action recognition and detection.
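As an illustration of how per-participant annotations like those described above (action segments with self-rated uncertainty) might be consumed, the following minimal Python sketch reads a segment file and looks up the action active at a given video frame. The file name, column names, and frame rate are assumptions made for illustration only, not the official EPIC-Tent schema; consult the dataset release for the actual format.

```python
# Minimal sketch (not the official loader) for reading hypothetical
# EPIC-Tent-style action segments and aligning them with video frames.
import csv
from collections import defaultdict

FPS = 30.0  # assumed frame rate of the head-mounted recordings


def load_action_segments(path):
    """Read action-label segments (start/stop in seconds) grouped by participant."""
    segments = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            segments[row["participant_id"]].append({
                "action": row["action_label"],           # e.g. "spread tent"
                "start_s": float(row["start_time"]),
                "stop_s": float(row["stop_time"]),
                "uncertainty": int(row["uncertainty"]),  # self-rated uncertainty score
            })
    return segments


def label_for_frame(participant_segments, frame_idx, fps=FPS):
    """Return the action label active at a given frame index, or None."""
    t = frame_idx / fps
    for seg in participant_segments:
        if seg["start_s"] <= t < seg["stop_s"]:
            return seg["action"]
    return None


if __name__ == "__main__":
    # "epic_tent_annotations.csv" and "P01" are placeholder names.
    segs = load_action_segments("epic_tent_annotations.csv")
    print(label_for_frame(segs.get("P01", []), frame_idx=4500))
```

A real loader would also need to handle the gaze and task-error annotation streams and the synchronisation between the two cameras, which are omitted here for brevity.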
Publication
Youngkyoon Jang, Brian Sullivan, Casimir Ludwig, Iain D. Gilchrist, Dima Damen, Walterio Mayol-Cuevas
EPIC-Tent: An Egocentric Video Dataset for Camping Tent Assembly
IEEE International Conference on Computer Vision Workshop (ICCVW), Seoul, South Korea, Oct. 27 - Nov. 2, 2019.
Workshop: The 5th International Workshop on Egocentric Perception, Interaction and Computing (EPIC@ICCV19)
Download: [pdf], [BibTeX]
Acknowledgement
This work is supported by UK EPSRC GLANCE (EP/N013964/1).
Link
Author URL: [Youngkyoon Jang], [Brian Sullivan], [Casimir Ludwig], [Iain D. Gilchrist], [Dima Damen], [Walterio Mayol-Cuevas]
Affiliation URL: [Visual Information Lab], [Computer Science], [Experimental Psychology], (University of Bristol)