The Unobtrusive Group Interaction (UGI) Corpus

Studying group dynamics requires fine-grained spatial and temporal understanding of human behavior. Social psychologists studying face-to-face group meetings often struggle with huge volumes of data that require many hours of tedious manual coding. Only a few publicly available multi-modal datasets of face-to-face group meetings enable the development of automated methods to study verbal and non-verbal human behavior. Here we present a new, publicly available multi-modal dataset for studying group dynamics that differs from previous datasets in its use of ceiling-mounted, unobtrusive depth sensors. These sensors support fine-grained analysis of head pose, body pose, and gestures, without the privacy concerns and inhibited behavior associated with conventional cameras. The dataset is complemented by synchronized, time-stamped meeting transcripts that allow analysis of spoken content. It comprises 22 group meetings in which participants perform a standard collaborative task designed to measure leadership and productivity. Participants' post-task questionnaires, including demographic information, are also provided. For details about the dataset, please refer to the following paper:

I. Bhattacharya, M. Foley, C. Ku, N. Zhang, T. Zhang, C. Mine, M. Li, H. Ji, C. Riedl, B. F. Welles, and R. J. Radke, "The Unobtrusive Group Interaction (UGI) Corpus," in Proceedings of the 10th ACM Multimedia Systems Conference (MMSys '19), Amherst, MA, June 2019.

Sensing Infrastructure

The data was collected in an 11' x 28' conference room instrumented with two ceiling-mounted Microsoft Kinects and per-participant lapel microphones. We only collected the time-of-flight (ToF) distance measurements from the Kinects. The figure alongside shows (a) the instrumented meeting room, (b) the two ceiling-mounted Kinects and lapel microphones, and (c) the layout of a typical meeting.
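
For readers who want to inspect the overhead data, the following is a minimal sketch of loading and visualizing a single distance frame in Python. It assumes the frames can be read as 16-bit single-channel images (e.g., exported PNGs); the filename and frame format below are placeholders, not the corpus's actual layout.

    import cv2
    import numpy as np

    # Hypothetical path; the naming scheme of the released sequences may differ.
    frame = cv2.imread("meeting01_kinect1_frame0000.png", cv2.IMREAD_UNCHANGED)

    # Kinect ToF output is typically 16-bit distance in millimeters; rescale
    # the observed range to 8 bits so the overhead view can be inspected.
    depth = frame.astype(np.float32)
    vis = cv2.normalize(depth, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    cv2.imwrite("frame0000_vis.png", vis)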

Task-based Interaction

We recorded 86 participants across 22 groups who completed the Lunar Survival Task in the conference room using the sensing infrastructure described above. In small groups of 3-5, participants discuss a hypothetical survival scenario on the moon and rank the value of 15 supplies that may aid in their survival. In the first stage of the task, each participant individually ranks the 15 items in order of importance, without communicating with the other members of the group; this stage lasts 10 minutes. In the second stage, the participants rank the items as a group, and must reach their decision by consensus within a maximum of 15 minutes.
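
The Lunar Survival Task is conventionally scored against NASA's expert ranking: a ranking's error is the sum of absolute rank differences across the 15 items, with lower scores indicating better performance. A minimal sketch of that computation is below; the function name is ours, and the expert ranking itself (distributed with the task materials) is not reproduced here.

    def survival_score(ranking, expert):
        """Sum of absolute rank differences between a submitted ranking and
        the expert ranking. Both arguments map each of the 15 item names to a
        rank in 1..15; lower scores indicate better performance."""
        assert set(ranking) == set(expert), "rankings must cover the same items"
        return sum(abs(ranking[item] - expert[item]) for item in expert)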

After the group discussion, each participant completed a post-task questionnaire. In addition to questions about participants' age, race, and gender and whether they had ever completed the task before, the questionnaire asked participants to answer the following questions on a 5-point scale (not at all, a little, somewhat, a lot, a great deal):

  • How well did you know each of your group members before today?
  • Before today, how familiar were you personally with the topic of survival in space?
  • To what extent did the following group members contribute to the discussion?
  • To what extent did the following group members act as a group leader?
  • For each of the following pairs of words, please indicate the point on the scale that best represents your feelings about the group conversation: engaging-boring, warm-cold, comfortable-awkward, interesting-dull, friendly-detached.

The Dataset

The overhead distance measurements from the Kinects are synchronized with the individual participants' lapel microphone recordings. The distance maps are pre-processed to maximize dynamic range, and the speech signals are pre-processed to reduce noise. The speech is then transcribed to text using IBM Watson's Speech-to-Text API and manually corrected to ensure an accurate transcript (a minimal sketch of this step appears after the list below). The UGI corpus includes the following:

  • Two overhead distance video sequences for each meeting.
  • Anonymized, time-stamped meeting transcripts.
  • The Lunar Survival Task information sheet that was handed to each participant.
  • The Lunar Survival Task post-task questionnaire handed to each participant.
  • Answers to the post-task questionnaire, along with the individual and group rankings.
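
As an illustration of the transcription step described above, here is a minimal sketch using the ibm-watson Python SDK. The API key, service URL, and audio filename are placeholders, and the exact recognition parameters used to build the corpus are not specified here; the raw output is then manually corrected, as noted above.

    from ibm_watson import SpeechToTextV1
    from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

    # Placeholder credentials and service URL.
    stt = SpeechToTextV1(authenticator=IAMAuthenticator("YOUR_API_KEY"))
    stt.set_service_url("https://api.us-east.speech-to-text.watson.cloud.ibm.com")

    # Placeholder filename; request word-level timestamps so the transcript
    # can be aligned with the overhead video.
    with open("participant1.wav", "rb") as audio:
        result = stt.recognize(audio=audio,
                               content_type="audio/wav",
                               timestamps=True).get_result()

    for res in result["results"]:
        alt = res["alternatives"][0]
        print(alt["transcript"], alt.get("timestamps", []))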

The figure alongside shows the two Kinect distance maps, with associated person tags, for one frame of a meeting.

An example transcript snippet is shown alongside.

Download

GitHub repository: https://github.com/Indrani02/UGI_Corpus

The dataset can be downloaded at: https://zenodo.org/record/2644617

Please cite the following paper when using this dataset in your research:

I. Bhattacharya, M. Foley, C. Ku, N. Zhang, T. Zhang, C. Mine, M. Li, H. Ji, C. Riedl, B. F. Welles, and R. J. Radke, "The Unobtrusive Group Interaction (UGI) Corpus," in Proceedings of the 10th ACM Multimedia Systems Conference (MMSys '19), Amherst, MA, June 2019.