Computer science
Electroencephalography
Mutual information
Artificial intelligence
Pattern recognition (psychology)
Maximization
Invariant (physics)
Adversarial system
Machine learning
Speech recognition
Mathematics
Psychology
Mathematical physics
Psychiatry
Mathematical optimization
Authors
Yingdong Wang, Qingfeng Wu, Shuocheng Wang, Xiqiao Fang, Qungsheng Ruan
Identifier
DOI:10.1016/j.eswa.2023.122777
Abstract
EEG-based emotion classification is a vital aspect of human-machine interfaces. However, inter-subject variability poses a challenge for accurate domain-agnostic EEG emotion recognition, often requiring individual model calibration with a robust base model for fine-tuning. To overcome this limitation and develop a generalized model, we propose a Generalized Model based on Mutual Information for EEG Emotion Recognition without Adversarial Training (MI-EEG). The MI-EEG model leverages disentanglement to extract shared features, separating EEG features into domain-invariant, class-relevant features and other features. To avoid adversarial training, mutual information minimization is applied during the decoupling process. Additionally, mutual information maximization is used to enrich the features by strengthening the relationship between the domain-invariant, class-relevant features and the emotion labels. Furthermore, a transformer-based feature extractor, which utilizes a multi-headed attention mechanism and pooling operations, enhances feature quality in the time dimension. Experimental evaluation on two emotional EEG datasets demonstrates the superior performance of the proposed MI-EEG model compared to existing state-of-the-art methods.
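To make the abstract's pipeline concrete, below is a minimal PyTorch sketch of the described ideas: a transformer encoder with pooling over the time dimension, two feature heads (domain-invariant class-relevant vs. other features), mutual information minimization between the two feature groups without adversarial training, and a cross-entropy classifier as a proxy for maximizing mutual information between features and labels. All names (`MIEEGSketch`, `mi_upper_bound`, `training_step`) and hyperparameters are illustrative assumptions; the CLUB-style upper bound is a stand-in for whatever MI estimator the paper actually uses, not the authors' implementation.

```python
# Illustrative sketch only; dimensions, estimators, and hyperparameters are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MIEEGSketch(nn.Module):
    def __init__(self, n_channels=62, d_model=64, n_heads=4, n_classes=3, feat_dim=32):
        super().__init__()
        # Project per-time-step EEG channel vectors into the model dimension.
        self.embed = nn.Linear(n_channels, d_model)
        # Transformer encoder with multi-headed attention over the time dimension.
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Two projection heads: domain-invariant class-relevant features vs. other features.
        self.class_head = nn.Linear(d_model, feat_dim)
        self.other_head = nn.Linear(d_model, feat_dim)
        # Emotion classifier on the class-relevant features; cross-entropy serves as a
        # lower-bound proxy for maximizing MI between features and labels.
        self.classifier = nn.Linear(feat_dim, n_classes)
        # Variational network q(z_other | z_class) for a CLUB-style MI upper bound
        # (assumed stand-in for the paper's MI-minimization estimator).
        self.q_mu = nn.Linear(feat_dim, feat_dim)
        self.q_logvar = nn.Linear(feat_dim, feat_dim)

    def forward(self, x):
        # x: (batch, time, channels)
        h = self.encoder(self.embed(x))   # (batch, time, d_model)
        h = h.mean(dim=1)                 # pooling over the time dimension
        z_c = self.class_head(h)          # domain-invariant class-relevant features
        z_o = self.other_head(h)          # other (e.g., subject-specific) features
        logits = self.classifier(z_c)
        return logits, z_c, z_o

    def mi_upper_bound(self, z_c, z_o):
        # CLUB-style upper bound on I(z_c; z_o): positive-pair log-likelihood minus
        # the log-likelihood of shuffled (negative) pairs under q(z_o | z_c).
        mu, logvar = self.q_mu(z_c), self.q_logvar(z_c)
        pos = (-(z_o - mu) ** 2 / logvar.exp()).sum(dim=1)
        neg = (-(z_o[torch.randperm(z_o.size(0))] - mu) ** 2 / logvar.exp()).sum(dim=1)
        return (pos - neg).mean() / 2.0


def training_step(model, x, y, lam=0.1):
    """One illustrative step: classification loss plus an MI penalty that
    discourages shared information between the two feature groups."""
    logits, z_c, z_o = model(x)
    return F.cross_entropy(logits, y) + lam * model.mi_upper_bound(z_c, z_o)


if __name__ == "__main__":
    model = MIEEGSketch()
    x = torch.randn(8, 128, 62)           # (batch, time steps, EEG channels)
    y = torch.randint(0, 3, (8,))
    print(training_step(model, x, y).item())
```

In this reading, disentanglement is obtained without a domain discriminator: the MI penalty between the two feature groups replaces adversarial training, consistent with the abstract's stated motivation.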