Keywords
End-to-end principle
Electroencephalography (EEG)
Emotion recognition
Computer science
Psychology
Speech recognition
Artificial intelligence
Emotion classification
Affective science
Cognitive psychology
Neuroscience
Authors
Jun Xiao, Feifei Qi, Wang Li, Yanbin He, Jin-Gang Yu, Wei Wu, Zhuliang Yu, Yuanqing Li, Zhenghui Gu, Tianyou Yu
Identifier
DOI: 10.1109/TAFFC.2025.3581388
Abstract
Emotion recognition from EEG signals offers significant advantages in affective computing, as EEG reflects internal emotional states more accurately than other modalities such as facial expressions or peripheral physiological signals. Modeling and capturing subtle affective changes over time is crucial for real-world applications to achieve better human-computer interaction. However, training such models usually requires segment-level emotion labels, which are costly and may not be feasible to collect. Because emotions evolve continuously, assigning the overall label to all EEG segments within a trial can lead to inaccurate model training and degraded performance. Since trial-wise post-stimulus labels are far more accessible, models are needed that can learn from trial-wise emotion labels while still capturing the temporal dynamics of emotional responses within each trial. To this end, we propose EmotionMIL, an end-to-end EEG-based emotion recognition framework that leverages recent advances in deep multiple instance learning (MIL). The framework enables robust emotion recognition from weakly labeled EEG signals and identifies the most prominent emotional responses. EmotionMIL captures the temporal dynamics of emotions using a retentive self-attention mechanism, which adaptively weights EEG segments according to their relevance for predicting the overall emotion label. A pseudo-bag augmentation strategy is also introduced to improve the model's generalization by generating additional pseudo-bags from the original ones. Evaluated on three benchmark datasets (DEAP, DREAMER, and SEED), EmotionMIL outperforms state-of-the-art non-MIL and MIL models in both subject-dependent and subject-independent tasks, achieving superior accuracy and F1 scores. An ablation study further validates the model design, while visualization results demonstrate that EmotionMIL effectively identifies both spatial EEG patterns and temporal emotional dynamics. These findings underscore EmotionMIL's potential for robust, interpretable emotion recognition, paving the way for real-world applications in emotion-aware systems. The code is available at https://github.com/yuty2009/emotionmil.
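To make the MIL setup in the abstract concrete: each trial is treated as a "bag" of EEG-segment "instances" that all share the single trial-level label, an attention mechanism scores how relevant each segment is to that label, and pseudo-bags are split off from real bags for augmentation. The sketch below is not the authors' implementation (see https://github.com/yuty2009/emotionmil for that); it uses a plain additive attention pooling as a stand-in for the paper's retentive self-attention, assumes PyTorch, and all class names, feature shapes, and the pseudo-bag splitting strategy are illustrative assumptions.

```python
# Minimal attention-based MIL sketch for trial-labeled EEG segments.
# NOT the EmotionMIL implementation; a hypothetical illustration only.
import torch
import torch.nn as nn

class AttentionMILClassifier(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int = 128, n_classes: int = 2):
        super().__init__()
        # Instance encoder: embeds each EEG-segment feature vector.
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        # Attention scorer: one relevance score per segment (instance).
        self.attn = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )
        self.head = nn.Linear(hidden_dim, n_classes)

    def forward(self, bag: torch.Tensor):
        # bag: (n_segments, in_dim) -- all segments of one trial, one shared label.
        h = self.encoder(bag)                   # (n_segments, hidden_dim)
        w = torch.softmax(self.attn(h), dim=0)  # (n_segments, 1) segment weights
        z = (w * h).sum(dim=0)                  # attention-pooled bag embedding
        return self.head(z), w.squeeze(-1)      # trial-level logits + weights

def make_pseudo_bags(bag: torch.Tensor, n_splits: int = 2):
    # Hypothetical pseudo-bag augmentation: randomly partition one trial's
    # segments into smaller bags that inherit the trial label (the paper's
    # actual splitting strategy may differ).
    perm = torch.randperm(bag.size(0))
    return [bag[idx] for idx in perm.chunk(n_splits)]

if __name__ == "__main__":
    trial = torch.randn(30, 160)  # e.g. 30 segments, 160 features per segment
    model = AttentionMILClassifier(in_dim=160)
    logits, weights = model(trial)
    print(logits.shape, weights.shape)  # torch.Size([2]) torch.Size([30])
    for pbag in make_pseudo_bags(trial):
        print(pbag.shape)  # two pseudo-bags of 15 segments each
```

Training such a model would compute a standard classification loss on the trial-level logits against the trial label, optionally adding losses on pseudo-bags that inherit their parent trial's label; the learned segment weights then provide the kind of temporal interpretability the abstract describes.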