Authors
Yingchun Cui,Jinghua Zhu
Identifier
DOI:10.1007/978-3-031-30637-2_46
Abstract
Federated Learning is a promising machine learning paradigm for collaborative learning while preserving data privacy. In server-based Federated Learning, however, an attacker who compromises the central server can recover the original sensitive data from the leaked model parameters. To address this server-attack challenge, in this paper we propose a novel server-free Federated Learning framework named MChain-SFFL, which updates the model through multi-chain parallel communication in a fully distributed way to achieve stronger privacy protection. Specifically, MChain-SFFL first randomly selects multiple participants as chain heads to initiate the model parameter aggregation process. MChain-SFFL then leverages single-masking and chained-communication mechanisms to transfer masked information between participants within each serial chain, so the masked local model parameters are gradually aggregated along the chain nodes. Finally, each chain head broadcasts the aggregated local model to the other nodes, and this process repeats until convergence. The experimental results demonstrate that for Non-IID data, MChain-SFFL outperforms the compared methods in model accuracy and convergence speed; for IID data, the accuracy and convergence speed of MChain-SFFL are close to those of Chain-PPFL and FedAVG.
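The single-masking, chained-communication idea described in the abstract can be sketched as follows. This is a minimal illustrative simulation, not the paper's implementation: the chain head adds a random mask to its own update before passing a running sum down the chain, so no intermediate node ever sees another node's raw parameters, and the head removes the mask at the end to recover the exact aggregate. All names (`chained_masked_sum`, `local_updates`) are hypothetical.

```python
import random

def chained_masked_sum(local_updates, seed=None):
    """Aggregate scalar local updates along a serial chain using a single
    random mask known only to the chain head (illustrative sketch)."""
    rng = random.Random(seed)
    mask = rng.uniform(-1e6, 1e6)        # secret mask held by the chain head
    running = local_updates[0] + mask    # chain head masks its own update
    for update in local_updates[1:]:     # each node only sees the masked sum
        running += update                # ...and adds its own local update
    return running - mask                # head removes the mask: exact sum

# Example: three participants on one chain; the aggregate equals the plain
# sum of their updates, yet intermediate values reveal no individual update.
total = chained_masked_sum([1.0, 2.0, 3.0], seed=42)
```

In the multi-chain setting described above, several such chains would run in parallel, and each chain head would broadcast its aggregate to the remaining nodes.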