
Multi-modal deep learning from imaging genomic data for schizophrenia classification

Keywords: artificial intelligence, functional magnetic resonance imaging, neuroimaging, convolutional neural network, classifier (UML), computer science, deep learning, pattern recognition (psychology), machine learning, psychology, neuroscience
Authors
Ayush Kanyal, Badhan Mazumder, Vince D. Calhoun, Adrian Preda, Jessica Turner, Judith M. Ford, Dong Hye Ye
Source
Journal: Frontiers in Psychiatry [Frontiers Media]
Volume/Issue: 15; Cited by: 1
Identifier
DOI: 10.3389/fpsyt.2024.1384842
Abstract

Background: Schizophrenia (SZ) is a psychiatric condition that adversely affects an individual's cognitive, emotional, and behavioral functioning. The etiology of SZ, although extensively studied, remains unclear, as multiple factors contribute to its development. A consistent body of evidence documents structural and functional deviations in the brains of individuals with SZ. Moreover, the hereditary aspect of SZ is supported by the significant involvement of genomic markers. This motivates investigating SZ from a multi-modal perspective and developing approaches for improved detection.

Methods: Our proposed method employed a deep learning framework combining features from structural magnetic resonance imaging (sMRI), functional magnetic resonance imaging (fMRI), and genetic markers such as single nucleotide polymorphisms (SNPs). For sMRI, we used a pre-trained DenseNet to extract morphological features. To identify the functional connections in fMRI and the SNPs most relevant to SZ, we applied a 1-dimensional convolutional neural network (CNN) followed by layer-wise relevance propagation (LRP). Finally, we concatenated the features obtained across modalities and fed them to an extreme gradient boosting (XGBoost) tree-based classifier to classify SZ against healthy controls (HC).

Results: Experimental evaluation on a clinical dataset demonstrated that, compared to the outcomes obtained from each modality individually, our proposed multi-modal approach classified SZ individuals from HC with an improved accuracy of 79.01%.

Conclusion: We proposed a deep learning-based framework that efficiently selects multi-modal (sMRI, fMRI, and genetic) features and fuses them to obtain improved classification scores. Additionally, by using explainable AI (XAI), we pinpointed and validated the functional network connections and SNPs that contributed most to SZ classification, providing the necessary interpretation behind our findings.
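The late-fusion step described in the Methods — concatenating per-modality feature vectors and passing them to a gradient-boosted tree classifier — can be sketched as follows. This is a minimal illustration, not the authors' implementation: all feature matrices and labels are synthetic stand-ins (real ones would come from a pre-trained DenseNet on sMRI and 1-D CNNs with LRP on fMRI/SNP data), and scikit-learn's `GradientBoostingClassifier` substitutes for XGBoost so the sketch runs without the `xgboost` package.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200  # hypothetical number of subjects

# Synthetic per-modality feature matrices standing in for the
# DenseNet (sMRI) and 1-D CNN + LRP (fMRI, SNP) feature extractors.
smri_feats = rng.normal(size=(n, 64))
fmri_feats = rng.normal(size=(n, 32))
snp_feats = rng.normal(size=(n, 16))
labels = rng.integers(0, 2, size=n)  # 0 = HC, 1 = SZ (synthetic)

# Late fusion: concatenate modality features into one vector per subject.
fused = np.concatenate([smri_feats, fmri_feats, snp_feats], axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(fused, labels, random_state=0)

# Tree-based classifier on the fused features (the paper uses XGBoost).
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
preds = clf.predict(X_te)
print(fused.shape)  # (200, 112): 64 sMRI + 32 fMRI + 16 SNP features
```

With real features, each modality's matrix would simply replace its synthetic counterpart; the concatenation and classification steps are unchanged.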