Cross-dataset transfer learning for motor imagery signal classification via multi-task learning and pre-training

Keywords: computer science, transfer learning, artificial intelligence, machine learning, robustness, motor imagery, multi-task learning, pattern recognition, electroencephalography, brain-computer interface
Authors
Yuting Xie,Kun Wang,Jiayuan Meng,Yue Jin,Lin Meng,Weibo Yi,Tzyy‐Ping Jung,Minpeng Xu,Ming Dong
Source
Journal: Journal of Neural Engineering [IOP Publishing]
Volume/Issue: 20(5): 056037. Cited by: 23
Identifier
DOI: 10.1088/1741-2552/acfe9c
Abstract

Objective. Deep learning (DL) models have proven effective at decoding motor imagery (MI) signals from electroencephalogram (EEG) data. However, their success relies heavily on large amounts of training data, whereas EEG data collection is laborious and time-consuming. Recently, cross-dataset transfer learning has emerged as a promising approach to meet the data requirements of DL models. Nevertheless, transferring knowledge across datasets involving different MI tasks remains a significant challenge, limiting the full utilization of valuable data resources. Approach. This study proposes a pre-training-based cross-dataset transfer learning method inspired by hard parameter sharing in multi-task learning. Different datasets with distinct MI paradigms are treated as different tasks and classified with shared feature-extraction layers and individual task-specific layers, allowing cross-dataset classification with one unified model. Pre-training and fine-tuning are then employed to transfer knowledge across datasets. We also designed four fine-tuning schemes and conducted extensive experiments on them. Main results. Compared with models without pre-training, pre-trained models achieved a maximum accuracy increase of 7.76%. Moreover, when limited training data were available, pre-training improved the DL model's accuracy by up to 27.34%. The experiments also revealed that pre-trained models exhibit faster convergence and remarkable robustness: training time per subject was reduced by up to 102.83 s, and the variance of classification accuracy decreased by up to 75.22%. Significance. This study represents the first comprehensive investigation of cross-dataset transfer learning between two datasets with different MI tasks. The proposed pre-training method requires only minimal fine-tuning data when applying DL models to new MI paradigms, making MI-based brain-computer interfaces more practical and user-friendly.
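The abstract describes the core mechanism, hard parameter sharing, in prose only: one shared feature extractor serves every dataset, each dataset (treated as a separate task with its own MI paradigm and class count) gets its own classification head, and knowledge is transferred by pre-training on one dataset and fine-tuning on another. The sketch below illustrates that structure under stated assumptions; the paper's actual layer configuration, feature dimensions, and four fine-tuning schemes are not given in the abstract, so the encoder design, the names SharedMIEncoder and MultiTaskMIModel, and the frozen-encoder fine-tuning step are illustrative rather than the authors' implementation.

```python
# Minimal hard-parameter-sharing sketch for cross-dataset MI-EEG classification.
# Assumptions: PyTorch, 22-channel / 500-sample epochs, a small conv backbone;
# none of these specifics come from the paper itself.
import torch
import torch.nn as nn


class SharedMIEncoder(nn.Module):
    """Feature-extraction layers shared by all MI datasets (hypothetical design)."""

    def __init__(self, n_channels=22, n_samples=500, n_features=64):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),  # temporal conv
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),          # spatial conv
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),
            nn.Flatten(),
        )
        with torch.no_grad():  # infer flattened size for the projection layer
            flat = self.backbone(torch.zeros(1, 1, n_channels, n_samples)).shape[1]
        self.proj = nn.Linear(flat, n_features)

    def forward(self, x):  # x: (batch, 1, channels, samples)
        return self.proj(self.backbone(x))


class MultiTaskMIModel(nn.Module):
    """One unified model: shared encoder plus an individual head per dataset/task."""

    def __init__(self, n_classes_per_task, n_features=64, **enc_kwargs):
        super().__init__()
        self.encoder = SharedMIEncoder(n_features=n_features, **enc_kwargs)
        self.heads = nn.ModuleDict(
            {task: nn.Linear(n_features, n_cls)
             for task, n_cls in n_classes_per_task.items()}
        )

    def forward(self, x, task):
        return self.heads[task](self.encoder(x))


# e.g. a 4-class MI dataset used for pre-training and a 2-class target dataset
model = MultiTaskMIModel({"source_4class": 4, "target_2class": 2})
logits = model(torch.randn(8, 1, 22, 500), task="source_4class")
print(logits.shape)  # torch.Size([8, 4])

# One conceivable fine-tuning scheme (the abstract does not name its four schemes):
# after pre-training on the source task, freeze the shared encoder and train only
# the target-specific head on the limited target-dataset data.
for p in model.encoder.parameters():
    p.requires_grad = False
```

In this arrangement, pre-training fits the shared encoder and the source head on the source dataset; fine-tuning then adapts the model to the target dataset, and the four schemes presumably differ in which parameter groups are updated during that stage.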