Emotion plays a vital role in human health and many aspects of life, including relationships, behaviors and decision-making. An intelligent emotion recognition system may provide a flexible method to monitor emotion changes in daily life and send warning information when unusual/unhealthy emotional states occur. Here, we proposed a novel unsupervised learning-based emotion recognition system in an attempt to decode emotional states from electroencephalography (EEG) signals. Four dimensions of human emotions were examined: arousal, valence, dominance and liking. To better characterize the trials in terms of EEG features, we used hypergraph theory. Emotion recognition was realized through hypergraph partitioning, which divided the EEG-based hypergraph into a specific number of clusters, with each cluster indicating one of the emotion classes and vertices (trials) in the same cluster sharing similar emotion properties. Comparison of the proposed unsupervised learning-based emotion recognition system with other recognition systems using a well-known public emotion database clearly demonstrated the validity of the proposed system.
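The hypergraph-partitioning step described above can be sketched as spectral clustering on a normalized hypergraph Laplacian (in the style of Zhou et al.). This is a generic illustration, not the paper's exact algorithm; the incidence matrix, edge weights, and cluster count are placeholders.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def hypergraph_spectral_partition(H, w, k, seed=0):
    """Partition hypergraph vertices (trials) into k clusters.

    H : (n_vertices, n_edges) binary incidence matrix.
    w : (n_edges,) hyperedge weights.
    A generic spectral sketch, not the paper's exact method.
    """
    dv = H @ w                      # vertex degrees
    de = H.sum(axis=0)              # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    De_inv = np.diag(1.0 / de)
    W = np.diag(w)
    # Normalized adjacency-like operator and hypergraph Laplacian
    Theta = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    L = np.eye(H.shape[0]) - Theta
    # Embed vertices with the k eigenvectors of the smallest eigenvalues
    _, vecs = np.linalg.eigh(L)
    emb = vecs[:, :k]
    # Cluster the spectral embedding; each cluster = one emotion class
    _, labels = kmeans2(emb, k, minit="++", seed=seed)
    return labels
```

Trials that share many weighted hyperedges end up close in the spectral embedding and thus in the same cluster.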
Our paper:
Z. Liang, S. Oba, and S. Ishii, An Unsupervised EEG Decoding System for Human Emotion Recognition, Neural Networks, 116, pp. 257-268, 2019.
DEAP Database:
The emotion database used in this paper is available here.
The EEG electrode locations for all the data were arranged in the Geneva order as shown below:
The IAF values were calculated for each subject across the 40 trials in the DEAP database. These results support Klimesch's observation that the center of the alpha frequency band changes over time and across subjects. Almost all IAF values fluctuated around 10 Hz, and the per-subject variances across all trials ranged from 0.0034 to 0.0493.
Regarding the calculated IAF values, the edges of frequency bands can be dynamically defined as:
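The paper's exact band-edge definitions are not reproduced here; as an illustration, a common Klimesch-style parameterization anchors each band boundary at a fixed offset from the IAF (the offsets below are assumed, not taken from the paper):

```python
def iaf_bands(iaf):
    """Dynamic frequency band edges (Hz) anchored at the individual
    alpha frequency. Offsets are illustrative Klimesch-style values,
    not necessarily those used in the paper."""
    return {
        "theta":       (iaf - 6.0, iaf - 4.0),
        "lower_alpha": (iaf - 4.0, iaf),
        "upper_alpha": (iaf, iaf + 2.0),
    }
```

With this scheme, a subject whose IAF is 10.5 Hz gets an upper-alpha band of 10.5 to 12.5 Hz rather than a fixed 10 to 12 Hz.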
EEG signals were decomposed using the mother wavelet 'Coif1' up to level 7. At each level, the decomposition yields approximation (A) and detail (D) coefficients. The energy and Shannon entropy features were then extracted from the detail coefficients at decomposition levels 4 to 7.
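This feature extraction can be sketched with PyWavelets; the entropy normalization below (relative squared-coefficient energies) is one common choice and is an assumption, not necessarily the paper's exact formulation:

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_features(signal, wavelet="coif1", level=7):
    """Energy and Shannon entropy of detail coefficients D4-D7.

    A sketch of the described pipeline; the entropy definition
    (over normalized squared coefficients) is an assumption.
    """
    # wavedec returns [A7, D7, D6, ..., D1]
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    features = {}
    for lvl in range(4, level + 1):
        d = coeffs[level - lvl + 1]          # D7 -> coeffs[1], D4 -> coeffs[4]
        energy = float(np.sum(d ** 2))
        p = d ** 2 / (energy + 1e-12)        # normalized coefficient energies
        entropy = float(-np.sum(p * np.log2(p + 1e-12)))
        features[f"D{lvl}"] = (energy, entropy)
    return features
```

For DEAP-length trials (63 s at 128 Hz, i.e. 8064 samples per channel), a level-7 decomposition with Coif1 is well within the admissible depth.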
To allow cross-comparison of the results obtained in this study with those of other existing studies using the DEAP database, the 9-point subjective feedback in the four emotion dimensions was first discretized into two classes using a fixed threshold of 5.
The ratios of low and high classes (low/high) in each dimension of emotion were as follows: arousal (0.41/0.59), valence (0.43/0.57), dominance (0.38/0.62), and liking (0.33/0.67). The details of the ratios of low and high classes for each subject and each dimension of emotion are as follows:
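The discretization and class-ratio computation can be sketched as below. Note that the treatment of ratings exactly equal to the threshold is an assumption; the text only states a fixed threshold of 5.

```python
import numpy as np

def discretize(ratings, thr=5.0):
    """Map 9-point feedback to low (0) / high (1) classes and
    return the (low, high) class ratios. Ratings strictly above
    the threshold count as 'high' (assumed tie-breaking rule)."""
    labels = (np.asarray(ratings, dtype=float) > thr).astype(int)
    low = round(float(np.mean(labels == 0)), 2)
    high = round(float(np.mean(labels == 1)), 2)
    return labels, (low, high)
```

Applying this per dimension over all trials yields ratio pairs such as the (0.41/0.59) reported for arousal.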
We also verified the emotion recognition results with and without the use of UDFS and/or KPCA. The information below implies that, compared with the recognition of dominance and liking, the recognition of arousal and valence was more sensitive to the feature selection and extraction stage, with an obvious decrease in decoding performance observed when it was omitted. This observation suggests that feature selection preferences may vary with the nature of the emotion, which should be considered when designing future emotion decoding systems.
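UDFS has no standard scikit-learn implementation, but the KPCA stage of such an ablation can be sketched as follows; the feature matrix shape, kernel, and component count are all placeholders, not the paper's settings:

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Hypothetical feature matrix: 40 trials x 128 wavelet-domain features
rng = np.random.RandomState(0)
X = rng.randn(40, 128)

# Nonlinear dimensionality reduction before clustering;
# kernel and n_components are illustrative choices
kpca = KernelPCA(n_components=20, kernel="rbf")
X_reduced = kpca.fit_transform(X)
```

Running the downstream clustering on `X` directly versus on `X_reduced` is the kind of with/without comparison described above.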