Decoding Models of Human Emotion Using Brain Signals
Affective intelligence is becoming an increasingly important component of artificial intelligence, playing a critical role in automatic and interactive processes such as human-computer interaction and human-robot interaction. The need for technologies capable of automatically, dynamically, and reliably decoding human emotions is growing dramatically. However, due to the complexity and diversity of human emotions, it remains difficult to estimate emotions accurately. From a neurophysiological perspective, brain signals provide a more natural, direct, and objective way to decode individuals' real emotions. In recent years, emotion decoding from brain signals has become an attractive and purposeful research topic. Various brain recording techniques, such as electroencephalography (EEG), functional magnetic resonance imaging (fMRI), functional near-infrared spectroscopy (fNIRS), and magnetoencephalography (MEG), have been introduced to differentiate the underlying emotions from recorded ongoing brain signals, and their tremendous potential for emotion decoding has been widely evidenced. Considering the noise contaminating brain signals and the existence of individual differences, great effort is still needed to develop more valid, reliable, and practical emotion decoding models, and more comprehensive and dedicated model explorations and comparisons should be conducted. This workshop seeks new feature extraction and modeling methods that could help improve the performance of emotion decoding from brain signals.
This workshop welcomes contributions including, but not limited to:
· Affective computing using brain recording methods
· Feature extraction and selection from brain signals for emotion recognition
· New machine learning and deep learning methods for brain decoding
· Studies of the underlying neural mechanisms of emotion
· Visualization and explanation of emotion-wise brain patterns
· Brain-computer interfaces
· Meta-analyses and research reviews on affective intelligence
Date & Location:
10th November 2022, Room S: Keynote Hall, Juss Hengshan Hotel (久事衡山酒店)
Website: http://jiushihengshanhotel.com.cn/index1.html
Address: 516 Hengshan Road, Shanghai, China
Workshop Schedule (1:00pm - 4:30pm):
Time / Participants / Affiliation / Presentation Type / Presentation Title
1:00 pm – 1:10 pm: Opening
1:10 pm – 1:40 pm: Baoliang Lu, Shanghai Jiao Tong University, invited talk,
New emotional stimuli for affective brain-computer interface: oil paintings
1:40 pm – 2:10 pm: Dongrui Wu, Huazhong University of Science and Technology, invited talk,
Affective Brain-Computer Interface: A Tutorial
2:10 pm – 2:40 pm: Dan Zhang, Tsinghua University, invited talk,
Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition
2:40 pm - 2:50 pm: Break
2:50 pm - 3:10 pm: Jiadong Zhou, Hiroshi Higashi, and Shin Ishii, Kyoto University, paper presentation,
Data Generation for Missing Frequencies in SSVEP-based Brain Computer Interfaces
3:10 pm - 3:30 pm: Yangsong Zhang, Li Nie, and Yudong Pan, Southwest University of Science and Technology, paper presentation,
Two Deep Learning-based Methods for Brain-Computer Interface
3:30 pm - 3:50 pm: Shiang Hu, Zhao Lv, Di Xiao, and Juan Hou, Anhui University, paper presentation,
Thoughts from the Systematic Research Paradigm: A Progressive Summary of EEG based Emotion Recognition
3:50 pm - 4:10 pm: Xiaopeng Si, Sicheng Li, Yulin Sun, Dong Huang, Yongpei Jian, Jiayue Yu, and Dong Ming, Tianjin University, paper presentation,
Multimodal Neuroimaging for Understanding Human Emotion Network and Intelligent Emotion Decoding
4:10 pm - 4:30 pm: Zhen Liang, Weishan Ye, and Zhiguo Zhang, Shenzhen University, paper presentation,
Pairwise learning-based emotion recognition using EEG signals
Zoom link:
https://utas.zoom.us/j/81727659366
Paper Submission:
Please submit your papers to Zhen Liang (janezliang@szu.edu.cn) or Zhiguo Zhang (zhiguozhang@hit.edu.cn). All papers should be submitted electronically in PDF format and formatted using the Springer LNAI template. For further instructions on paper preparation, please see https://www.pricai.org/2022/submission .
Paper Submission Deadline: August 1, 2022
Notification: August 15, 2022
Camera Ready: August 31, 2022
Conference: November 10-13, 2022