Computer science
Electroencephalography
Brain-computer interface
Transformer
Speech recognition
Convolutional neural network
Artificial intelligence
Neuroscience
Electrical engineering
Voltage
Engineering
Psychology
Authors
Yi Ding, Yong Li, Hao Sun, Rui Liu, Chengxuan Tong, Chenyu Liu, Xinliang Zhou, Cuntai Guan
Identifier
DOI:10.1109/jbhi.2024.3504604
Abstract
Effectively learning the temporal dynamics in electroencephalogram (EEG) signals is challenging yet essential for decoding brain activities using brain-computer interfaces (BCIs). Although Transformers are popular for their long-term sequential learning ability in the BCI field, most methods combining Transformers with convolutional neural networks (CNNs) fail to capture the coarse-to-fine temporal dynamics of EEG signals. To overcome this limitation, we introduce EEG-Deformer, which incorporates two main novel components into a CNN-Transformer: (1) a Hierarchical Coarse-to-Fine Transformer (HCT) block that integrates a Fine-grained Temporal Learning (FTL) branch into Transformers, effectively discerning coarse-to-fine temporal patterns; and (2) a Dense Information Purification (DIP) module, which utilizes multi-level, purified temporal information to enhance decoding accuracy. Comprehensive experiments on three representative cognitive tasks (cognitive attention, driving fatigue, and mental workload detection) consistently confirm the generalizability of our proposed EEG-Deformer, demonstrating that it either outperforms or performs comparably to existing state-of-the-art methods. Visualization results show that EEG-Deformer learns from neurophysiologically meaningful brain regions for the corresponding cognitive tasks.
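To illustrate the coarse-to-fine idea described in the abstract, the sketch below extracts EEG features at two temporal resolutions and fuses them, in the spirit of combining a coarse branch with a fine-grained temporal branch and then concatenating multi-level features. This is a minimal NumPy illustration, not the authors' HCT/DIP implementation; the window sizes and the choice of mean/log-variance statistics are assumptions for demonstration only.

```python
import numpy as np

def coarse_to_fine_features(eeg, coarse_win=64, fine_win=8):
    """Toy coarse-to-fine temporal feature extraction for one EEG trial.

    eeg: array of shape (channels, time_samples).
    coarse_win / fine_win: hypothetical window lengths, not values
    taken from the EEG-Deformer paper.
    """
    c, t = eeg.shape
    # Coarse branch: mean amplitude over long, non-overlapping windows
    # (captures slow temporal trends)
    n_coarse = t // coarse_win
    coarse = eeg[:, : n_coarse * coarse_win]
    coarse = coarse.reshape(c, n_coarse, coarse_win).mean(axis=2)
    # Fine branch: log-variance over short windows
    # (captures fast, fine-grained dynamics)
    n_fine = t // fine_win
    fine = eeg[:, : n_fine * fine_win]
    fine = np.log(fine.reshape(c, n_fine, fine_win).var(axis=2) + 1e-8)
    # Fuse the multi-level features by concatenation, loosely analogous
    # to aggregating purified information from multiple levels
    return np.concatenate([coarse.ravel(), fine.ravel()])

# Example: 32 channels, 512 samples -> 32*8 coarse + 32*64 fine = 2304 features
feats = coarse_to_fine_features(np.random.randn(32, 512))
```

In the actual model, the fine-grained branch sits inside each Transformer block and the multi-level outputs feed a purification module; the sketch only conveys why fusing features from multiple temporal scales can help decoding.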