Cross-dataset transfer learning for motor imagery signal classification via multi-task learning and pre-training

Computer science, Transfer learning, Artificial intelligence, Machine learning, Robustness (evolution), Motor imagery, Task (project management), Multi-task learning, Pattern recognition (psychology), Electroencephalography, Brain-computer interface, Psychology, Biochemistry, Chemistry, Management, Psychiatry, Economics, Gene
Authors
Yuting Xie, Kun Wang, Jiayuan Meng, Yue Jin, Lin Meng, Weibo Yi, Tzyy‐Ping Jung, Minpeng Xu, Ming Dong
Source
Journal: Journal of Neural Engineering [IOP Publishing]
Volume/Issue: 20(5): 056037-056037    Cited by: 32
Identifier
DOI: 10.1088/1741-2552/acfe9c
Abstract

Objective. Deep learning (DL) models have proven effective for decoding motor imagery (MI) signals from electroencephalogram (EEG) data. However, the success of DL models relies heavily on large amounts of training data, whereas EEG data collection is laborious and time-consuming. Recently, cross-dataset transfer learning has emerged as a promising approach to meet the data requirements of DL models. Nevertheless, transferring knowledge across datasets that involve different MI tasks remains a significant challenge, limiting the full utilization of valuable data resources.

This study proposes a pre-training-based cross-dataset transfer learning method inspired by hard parameter sharing in multi-task learning. Datasets with distinct MI paradigms are treated as different tasks and classified with shared feature extraction layers plus individual task-specific layers, allowing cross-dataset classification with one unified model. Pre-training and fine-tuning are then employed to transfer knowledge across datasets. We also designed four fine-tuning schemes and conducted extensive experiments on them.

The results showed that, compared with models without pre-training, pre-trained models achieved an accuracy increase of up to 7.76%. Moreover, when only limited training data were available, the pre-training method significantly improved the DL model's accuracy, by up to 27.34%. The experiments also revealed that pre-trained models converge faster and are markedly more robust: training time per subject was reduced by up to 102.83 s, and the variance of classification accuracy decreased by up to 75.22%.

This study represents the first comprehensive investigation of cross-dataset transfer learning between two datasets with different MI tasks. The proposed pre-training method requires only minimal fine-tuning data when applying DL models to new MI paradigms, making MI brain-computer interfaces more practical and user-friendly.
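As a rough illustration of the hard-parameter-sharing idea described in the abstract, the PyTorch sketch below builds one unified model with a shared convolutional feature extractor and one task-specific classification head per dataset, pre-trains it on a source dataset, and then fine-tunes it on a target dataset with the shared layers frozen. The backbone design, layer sizes, channel and sample counts, and the dataset names "source" and "target" are assumptions made only for illustration; the paper's exact architecture and its four fine-tuning schemes are not detailed in the abstract.

```python
# Minimal sketch of hard parameter sharing across two MI datasets (illustrative only).
import torch
import torch.nn as nn


class SharedBackbone(nn.Module):
    """Shared feature-extraction layers applied to EEG trials from every dataset."""

    def __init__(self, n_channels: int, n_samples: int, n_filters: int = 16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, n_filters, kernel_size=(1, 25), padding=(0, 12)),  # temporal convolution
            nn.Conv2d(n_filters, n_filters, kernel_size=(n_channels, 1)),   # spatial convolution
            nn.BatchNorm2d(n_filters),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 8)),
            nn.Flatten(),
        )
        # Infer the flattened feature size from a dummy forward pass.
        with torch.no_grad():
            self.out_dim = self.features(torch.zeros(1, 1, n_channels, n_samples)).shape[1]

    def forward(self, x):
        # x: (batch, 1, n_channels, n_samples)
        return self.features(x)


class MultiTaskMINet(nn.Module):
    """One unified model: a shared backbone plus one task-specific head per dataset."""

    def __init__(self, n_channels: int, n_samples: int, n_classes_per_task: dict):
        super().__init__()
        self.backbone = SharedBackbone(n_channels, n_samples)
        self.heads = nn.ModuleDict({
            task: nn.Linear(self.backbone.out_dim, n_cls)
            for task, n_cls in n_classes_per_task.items()
        })

    def forward(self, x, task: str):
        return self.heads[task](self.backbone(x))


# Hypothetical dimensions: a 4-class source dataset and a 2-class target dataset,
# both with 22 channels and 1000 samples per trial.
model = MultiTaskMINet(n_channels=22, n_samples=1000,
                       n_classes_per_task={"source": 4, "target": 2})
criterion = nn.CrossEntropyLoss()

# --- Pre-training on the source dataset ---
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
# for x, y in source_loader:
#     loss = criterion(model(x, "source"), y); loss.backward(); optimizer.step(); optimizer.zero_grad()

# --- Fine-tuning on the target dataset (one possible scheme: freeze the shared layers) ---
for p in model.backbone.parameters():
    p.requires_grad = False
optimizer = torch.optim.Adam(model.heads["target"].parameters(), lr=1e-3)
# for x, y in target_loader:
#     loss = criterion(model(x, "target"), y); loss.backward(); optimizer.step(); optimizer.zero_grad()
```

Freezing the shared backbone is only one possible fine-tuning scheme; alternatives include updating all parameters or only selected shared layers, which is the kind of variation the paper's comparison of four fine-tuning schemes explores.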