The Physiological Dataset for Multimodal Emotion Recognition with Personality as a Context (PhyMER) consists of multimodal physiological data for emotion recognition.
Access to the dataset is granted through two pathways:
1. Participation in the KERC2021 Challenge
Participating in the KERC2021 challenge automatically grants permission to use the data within the competition, provided that participants comply with the usage, distribution, and publication policy stated during the competition.
2. Access Request
To access the dataset and download the files, please send an email to 218354@jnu.ac.kr stating your intent to use the dataset for non-commercial academic research purposes. The download link will be provided via email.
The PhyMER dataset consists of the recorded physiological signals, participants' personality information, and emotion annotations in terms of Arousal, Valence, and seven basic emotions.
Modalities: Electroencephalogram (EEG), Electrodermal Activity (EDA), Blood Volume Pulse (BVP), Heart Rate (HR), Peripheral Skin Temperature (TEMP), and Personality Traits

Labels:
Arousal and valence (1-9)
Seven basic emotions (Anger, Disgust, Fear, Happy, Neutral, Sad, Surprise)
The dataset folder contains the following:
E4: multimodal data recorded with the Empatica E4 wristband, grouped by subject ID
Emotiv: EEG data recorded with the Emotiv EpocX headset
Personality: the personality questionnaire, Big-5 personality trait scores, and the score interpretation according to the Newcastle Personality Assessor (NPA) questionnaire method
The dataset folders are organized as described below:
E4
The data is stored in CSV format, and filenames follow this structure:
experimentID_signalType_samplingFrequency.csv
where experimentID is formed by combining the subject ID and the stimulus video ID (e.g., SUB10VID09).
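A minimal sketch of parsing such a filename, assuming subject IDs look like SUB10 and video IDs like VID09 (as in the missing-data note below); the exact signal-type spelling and sampling-frequency convention in real filenames may differ:

```python
import re

# Assumed pattern: <SUBnn><VIDnn>_<SIGNAL>_<frequency>.csv
# The example filename below is illustrative, not an actual dataset listing.
FILENAME_RE = re.compile(
    r"^(?P<subject>SUB\d+)(?P<video>VID\d+)_(?P<signal>[A-Za-z]+)_(?P<freq>\d+)\.csv$"
)

def parse_e4_filename(name):
    """Split an E4 filename into subject ID, video ID, signal type, and sampling frequency."""
    m = FILENAME_RE.match(name)
    if m is None:
        raise ValueError(f"unexpected filename: {name}")
    return m.group("subject"), m.group("video"), m.group("signal"), int(m.group("freq"))

print(parse_e4_filename("SUB10VID09_EDA_4.csv"))  # → ('SUB10', 'VID09', 'EDA', 4)
```

Grouping files by the parsed subject ID makes it straightforward to iterate over all recordings of one participant.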
Emotiv
The data is stored in two formats:
FIF format: as used by MNE-Python
SET format: as used by EEGLAB
Personality
Personality information was recorded using a Newcastle Personality Assessor (NPA) questionnaire method. The files are in CSV format:
big_5_personality_questionnaire_responses.csv: contains original questionnaire responses of the participants.
big_5_personality_traits.csv: contains the Big-5 personality trait interpretation of the questionnaire responses
personality_scores_NPA.csv: contains the NPA interpretation of the Big-5 personality traits in terms of low, medium-low, medium-high, and high.
Labels
Labels include self annotations of the participants for arousal, valence, and basic emotions.
labels.csv: (experiment_code, arousal, valence, emotion) contains the annotations by the participants for each stimulus video.
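A short sketch of reading labels.csv with the standard library, using the column names documented above; the sample rows are made-up illustrations, not actual annotations from the dataset:

```python
import csv
import io

# Columns per the README: experiment_code, arousal, valence, emotion.
# These two rows are hypothetical examples for demonstration only.
sample = """experiment_code,arousal,valence,emotion
SUB01VID01,7,8,Happy
SUB01VID02,3,2,Sad
"""

def load_labels(fp):
    """Read annotation rows, converting the 1-9 arousal/valence scores to integers."""
    rows = []
    for row in csv.DictReader(fp):
        row["arousal"] = int(row["arousal"])
        row["valence"] = int(row["valence"])
        rows.append(row)
    return rows

labels = load_labels(io.StringIO(sample))
print(labels[0]["emotion"])  # → Happy
```

In practice the file object would come from `open("labels.csv")` instead of the in-memory sample.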
Missing Data
Due to a device malfunction during the experiment, eight samples for subject SUB10 (SUB10VID09-SUB10VID16) are missing and were excluded.
Sudarshan Pant, Hyung-Jeong Yang, Eunchae Lim, Soo-Hyung Kim and Seok-Bong Yoo, "PhyMER: Physiological Dataset for Multimodal Emotion Recognition With Personality as a Context," in IEEE Access, vol. 11, pp. 107638-107656, 2023, doi: 10.1109/ACCESS.2023.3320053.
https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10265252
This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT).