
Model-level attention and batch-instance style normalization for federated learning on medical image segmentation

Authors
Fubao Zhu, Yanhui Tian, Chuang Han, Yanting Li, Jiaofen Nan, Yao Ni, Weihua Zhou
Source
Journal: Information Fusion [Elsevier BV]
Volume: 107, Article 102348. Citations: 12
Identifier
DOI: 10.1016/j.inffus.2024.102348
Abstract

Federated learning (FL) offers an effective privacy-protection mechanism for cross-center medical collaboration and data sharing. In multi-site medical image segmentation, FL allows each medical site to act as a client, forming its own data domain, and has the potential to enhance model performance on known domains. However, practical deployment faces the challenge of domain generalization (DG) because data from different domains are non-independent and identically distributed (non-IID), which degrades model performance in unseen domains. Current DG solutions handle style differences in overly complex ways and pay little attention to the inter-domain image features that cause model differences. Furthermore, these solutions are not suited to the FL paradigm, which requires that data storage remain separated. Hence, a lightweight model-level attention and batch-instance style normalization (MLA-BIN) method is proposed in this study to address DG in FL. The MLA module represents the unseen domain as a linear combination of seen-domain models. It does not require access to raw data; instead, it learns from a thorough exploration of data features in known domains, identifying differences in inter-domain data features and enabling the global model to generalize from seen to unseen domains. In the BIN block, batch normalization (BN) and instance normalization (IN) are combined in the shallow layers of the segmentation network to perform style normalization. Integrating the BIN block into the segmentation backbone network (BIN-Net) ensures effective learning of intra-domain features and mitigates the impact of inter-domain image style differences on domain generalization, all without accessing data from other centers.
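The BIN idea described above, mixing batch normalization (which preserves batch-level content statistics) with instance normalization (which removes per-image style), can be sketched as a convex combination of the two normalized feature maps. This is an illustrative NumPy sketch under stated assumptions, not the authors' implementation; the gate `rho` (a learnable per-channel parameter in a real network) and the function names are assumptions:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each channel over the whole batch (N, H, W); x has shape (N, C, H, W).
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def instance_norm(x, eps=1e-5):
    # Normalize each (sample, channel) pair over its own spatial extent (H, W),
    # which strips per-image style statistics.
    mean = x.mean(axis=(2, 3), keepdims=True)
    var = x.var(axis=(2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def bin_layer(x, rho):
    # Convex combination of BN and IN outputs; rho in [0, 1] trades off
    # content preservation (BN) against style invariance (IN).
    return rho * batch_norm(x) + (1.0 - rho) * instance_norm(x)
```

With `rho = 1` the layer reduces to plain BN and with `rho = 0` to plain IN, so the gate lets shallow layers learn how much style information to discard per channel.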
Extensive experimental results demonstrate that the proposed method achieved Dice similarity coefficients of 88.27, 88.25, and 64.94 on the prostate, optic disc and cup, and COVID-19 lesion segmentation datasets, respectively, outperforming state-of-the-art methods.
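The MLA module's core operation, representing the model for an unseen domain as a weighted linear combination of the seen-domain client models, can be sketched as attention-weighted parameter averaging. This is a minimal sketch, assuming a softmax over per-model attention scores; the function names and the exact weighting scheme are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D array of attention logits.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def aggregate_models(client_weights, scores):
    # client_weights: one dict {param_name: ndarray} per seen-domain model.
    # scores: one raw attention logit per client model.
    # Returns parameters combined with softmax weights, plus the weights themselves.
    alphas = softmax(np.asarray(scores, dtype=float))
    agg = {name: sum(a * w[name] for a, w in zip(alphas, client_weights))
           for name in client_weights[0]}
    return agg, alphas
```

Because only model parameters and scalar scores are exchanged, the combination respects FL's data-storage separation: no raw images ever leave a client site.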