DeceFL: A Principled Decentralized Federated Learning Framework

Authors
Y. Yuan, Jun Li, Dou Jin, Zuogong Yue, Ruijuan Chen, Maolin Wang, Chen Sun, Lei Xu, Hao Feng, Xin He, Xinlei Yi, Tao Yang, Haitao Zhang, Shaochun Sui, Dawei Han
Source
Journal: Cornell University - arXiv
Identifier
DOI: 10.48550/arxiv.2107.07171
Abstract

Traditional machine learning relies on a centralized data pipeline: data are provided to a central server for model training. In many applications, however, data are inherently fragmented, and this decentralized nature of the databases is the main obstacle to collaboration, since sending all of the decentralized datasets to a central server raises serious privacy concerns. Although privacy-preserving machine learning frameworks such as federated learning have been proposed to tackle this critical issue, most state-of-the-art frameworks are still built in a centralized way: a central client is needed to collect model information (instead of the data itself) from every other client and distribute it back, which leads to high communication pressure and high vulnerability to a failure at, or an attack on, the central client. Here we propose a principled decentralized federated learning algorithm (DeceFL), which requires no central client and relies only on local information transmission between clients and their neighbors, representing a fully decentralized learning framework. It is further proven that, when the loss function is smooth and strongly convex, every client reaches the global minimum with zero performance gap and achieves the same convergence rate $O(1/T)$ (where $T$ is the number of gradient-descent iterations) as centralized federated learning. Finally, the proposed algorithm is applied to a number of tasks, with both convex and nonconvex loss functions, demonstrating its effectiveness and its applicability to a wide range of real-world medical and industrial applications.
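The abstract describes the standard decentralized-gradient pattern: at each round, client $i$ averages the model parameters of its graph neighbors with mixing weights $w_{ij}$ and then takes a local gradient step, $x_i^{t+1} = \sum_j w_{ij} x_j^t - \eta \nabla f_i(x_i^t)$, so no central client ever sees all models. The sketch below is an illustrative reading of that pattern (plain decentralized gradient descent on a toy ring graph), not the authors' reference implementation; the function name, mixing matrix, learning rate, and toy quadratic losses are assumptions for demonstration only.

```python
import numpy as np

def decefl_round(params, grads, W, lr=0.1):
    """One synchronous round: neighbor averaging followed by a local gradient step.

    params: (n_clients, dim) current model parameters, one row per client
    grads:  (n_clients, dim) gradients of each client's local loss at its own params
    W:      (n_clients, n_clients) doubly stochastic mixing matrix; W[i, j] > 0
            only if clients i and j are neighbors (or i == j)
    """
    mixed = W @ params            # consensus step: each client mixes only neighbors' models
    return mixed - lr * grads     # local gradient-descent step on the client's own data

# Toy usage: 4 clients on a ring graph, each with local loss 0.5 * ||x - target_i||^2.
rng = np.random.default_rng(0)
n_clients, dim = 4, 3
targets = rng.normal(size=(n_clients, dim))   # each client's local optimum
params = np.zeros((n_clients, dim))

# Doubly stochastic mixing matrix for a ring: nonzero entries only between neighbors.
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

for _ in range(300):
    grads = params - targets                   # gradient of each client's quadratic loss
    params = decefl_round(params, grads, W, lr=0.1)

# Each row approaches a neighborhood of the global minimizer targets.mean(axis=0).
print(params.round(3))
print(targets.mean(axis=0).round(3))
```

Note that with a constant step size this vanilla scheme only reaches an $O(\eta)$ neighborhood of the global minimizer; the exact zero-gap, $O(1/T)$ convergence claimed in the abstract holds under the paper's own step-size and weight conditions, which are not reproduced in this toy sketch.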