Keywords: Electroencephalography; Decoding methods; Functional connectivity; Amplitude; Computer science; Fusion; Artificial intelligence; Phase (matter); Pattern recognition (psychology); Speech recognition; Neuroscience; Psychology; Algorithm; Physics; Quantum mechanics; Linguistics; Philosophy
Authors
Liangliang Hu, Congming Tan, Jiayang Xu, Rui Qiao, Yilin Hu, Yin Tian
Source
Journal: Neural Networks [Elsevier BV]
Date: 2024-02-01
Volume/Article: 172, 106148
Citations: 8
Identifier
DOI: 10.1016/j.neunet.2024.106148
Abstract
Decoding emotional neural representations from the electroencephalographic (EEG)-based functional connectivity network (FCN) is of great scientific importance for uncovering emotional cognition mechanisms and developing harmonious human–computer interactions. However, existing methods mainly rely on phase-based FCN measures (e.g., phase locking value [PLV]) to capture dynamic interactions between brain oscillations in emotional states, which fail to reflect the energy fluctuation of cortical oscillations over time. In this study, we initially examined the efficacy of amplitude-based functional networks (e.g., amplitude envelope correlation [AEC]) in representing emotional states. Subsequently, we proposed an efficient phase–amplitude fusion framework (PAF) to fuse PLV and AEC and used common spatial pattern (CSP) to extract fused spatial topological features from PAF for multi-class emotion recognition. We conducted extensive experiments on the DEAP and MAHNOB-HCI datasets. The results showed that: (1) AEC-derived discriminative spatial network topological features possess the ability to characterize emotional states, and the differential network patterns of AEC reflect dynamic interactions in brain regions associated with emotional cognition. (2) The proposed fusion features outperformed other state-of-the-art methods in terms of classification accuracy for both datasets. Moreover, the spatial filter learned from PAF is separable and interpretable, enabling a description of affective activation patterns from both phase and amplitude perspectives.