Authors
Huachao Yan,Kailing Guo,Xiaofen Xing,Xiangmin Xu
Identifier
DOI:10.1109/taffc.2024.3394873
Abstract
In multichannel electroencephalograph (EEG) emotion recognition, most graph-based studies employ shallow graph models to learn spatial characteristics, because node over-smoothing arises as network depth increases. To address over-smoothing, we propose the bridge graph attention-based graph convolution network (BGAGCN). It bridges earlier graph convolution layers to the attention coefficients of the final layer, adaptively combining each graph convolution output via a graph attention network and thereby enhancing feature distinctiveness. Since graph-based networks primarily capture local relationships among EEG channels, we introduce a transformer to model global dependencies. Inspired by the neuroscience finding that neural activities at different timescales reflect distinct spatial connectivities, we modify the transformer into a multi-scale transformer (MT) that applies multi-head attention to multichannel EEG signals after 1D convolutions at different scales. MT learns spatial features in finer detail, enhancing the representation ability of the model. Combining BGAGCN and MT, our model BGAGCN-MT achieves state-of-the-art accuracy under both subject-dependent and subject-independent protocols on three benchmark EEG emotion datasets (SEED, SEED-IV, and DREAMER). Notably, our model effectively mitigates over-smoothing in graph neural networks and provides an efficient way to learn spatial relationships of EEG features at different scales. Our code is available at https://github.com/LogzZ.
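The abstract names two mechanisms: a bridge attention that fuses the outputs of earlier graph-convolution layers through attention coefficients at the final layer, and a multi-scale transformer that applies multi-head attention after 1D convolutions at several kernel sizes. The following is a minimal PyTorch sketch of both ideas. All layer sizes, the simplified graph-convolution form, and the fusion rule are illustrative assumptions, not the authors' released implementation (see the linked repository for that).

import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGraphConv(nn.Module):
    """One graph-convolution layer, X' = ReLU(A X W), with a fixed
    normalized adjacency matrix A over EEG channels (an assumed form)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):               # x: (B, N, in_dim), adj: (N, N)
        return F.relu(adj @ self.lin(x))

class BridgeAttentionGCN(nn.Module):
    """Stacks several graph-conv layers and, instead of keeping only the
    last output, scores every intermediate output with a shared attention
    vector and fuses them. Bridging earlier layers this way is the
    abstract's remedy for over-smoothing; the exact scoring here is an
    assumption."""
    def __init__(self, in_dim, hid_dim, depth=4):
        super().__init__()
        dims = [in_dim] + [hid_dim] * depth
        self.layers = nn.ModuleList(
            SimpleGraphConv(dims[i], dims[i + 1]) for i in range(depth))
        self.att = nn.Linear(hid_dim, 1)     # scores each layer's output

    def forward(self, x, adj):               # x: (B, N, in_dim)
        outs, h = [], x
        for layer in self.layers:
            h = layer(h, adj)
            outs.append(h)
        stack = torch.stack(outs, dim=1)     # (B, L, N, hid_dim)
        scores = self.att(stack).mean(dim=2) # (B, L, 1): one score per layer
        alpha = torch.softmax(scores, dim=1) # attention over network depth
        return (alpha.unsqueeze(-1) * stack).sum(dim=1)  # (B, N, hid_dim)

class MultiScaleAttention(nn.Module):
    """1D convolutions at different kernel sizes extract features at
    several timescales; multi-head attention over the channel axis then
    models spatial (inter-channel) dependencies at each scale."""
    def __init__(self, d_model=64, kernel_sizes=(3, 7, 15), heads=4):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv1d(1, d_model, k, padding=k // 2) for k in kernel_sizes)
        self.attn = nn.MultiheadAttention(d_model, heads, batch_first=True)

    def forward(self, x):                    # x: (B, N, T) raw EEG
        B, N, T = x.shape
        feats = []
        for conv in self.convs:
            f = conv(x.reshape(B * N, 1, T))        # (B*N, d_model, T)
            f = f.mean(dim=-1).reshape(B, N, -1)    # pool time: (B, N, d)
            out, _ = self.attn(f, f, f)             # attend across channels
            feats.append(out)
        return torch.cat(feats, dim=-1)      # (B, N, d * num_scales)

if __name__ == "__main__":
    B, N, T = 2, 62, 200                     # e.g. 62 SEED channels
    adj = torch.softmax(torch.randn(N, N), dim=-1)  # toy normalized adjacency
    spatial = MultiScaleAttention()(torch.randn(B, N, T))   # (2, 62, 192)
    fused = BridgeAttentionGCN(spatial.size(-1), 64)(spatial, adj)
    print(fused.shape)                       # torch.Size([2, 62, 64])

In this sketch the bridge attention weights are shared across channels and computed per layer, so the fusion stays cheap relative to the graph convolutions themselves; how the paper combines the two modules into BGAGCN-MT (serial, parallel, or otherwise) is not specified in the abstract.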