Computer science
Independent and identically distributed random variables
Enhanced Data Rates for GSM Evolution (EDGE)
Divergence (linguistics)
Federated learning
Artificial neural network
Protocol (science)
Artificial intelligence
Machine learning
Degradation (telecommunications)
Selection (genetic algorithm)
Data mining
Mathematics
Pathology
Alternative medicine
Medicine
Statistics
Random variable
Linguistics
Telecommunications
Philosophy
Authors
Wenyu Zhang, Xiumin Wang, Pan Zhou, Weiwei Wu, Xinglin Zhang
Source
Journal: IEEE Access
[Institute of Electrical and Electronics Engineers]
Date: 2021-01-01
Volume/pages: 9: 24462-24474
Citations: 208
Identifier
DOI: 10.1109/access.2021.3056919
Abstract
Federated Learning (FL) has recently attracted considerable attention in the Internet of Things, due to its capability of enabling mobile clients to collaboratively learn a global prediction model without sharing their privacy-sensitive data with the server. Despite its great potential, a main challenge of FL is that the training data are usually non-Independent, Identically Distributed (non-IID) across the clients, which may introduce bias into the model training and cause accuracy degradation. To address this issue, this paper proposes a novel FL algorithm to alleviate the accuracy degradation caused by non-IID data at clients. Firstly, we observe that clients with different degrees of non-IID data exhibit different weight divergence relative to clients with IID data. Inspired by this, we use weight divergence to recognize the non-IID degree of each client. Then, we propose an efficient FL algorithm, named CSFedAvg, in which clients with a lower degree of non-IID data are chosen to train the model at a higher frequency. Finally, we conduct simulations using publicly available datasets to train deep neural networks. Simulation results show that the proposed FL algorithm improves training performance compared with existing FL protocols.
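The abstract gives no pseudocode, but the core selection idea — measure each client's weight divergence from the global model and favor low-divergence (closer-to-IID) clients during client selection — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual CSFedAvg implementation; the function names `weight_divergence` and `select_clients`, the inverse-divergence weighting, and the flattened-weight representation are all hypothetical choices for illustration.

```python
import numpy as np

def weight_divergence(w_client, w_global):
    # Relative L2 distance between a client's model weights and the
    # global weights (flattened into vectors). A larger value is taken
    # as a sign of more strongly non-IID local data.
    return np.linalg.norm(w_client - w_global) / np.linalg.norm(w_global)

def select_clients(divergences, num_select, rng=None):
    # Sample clients without replacement, with probability inversely
    # related to divergence, so lower-divergence clients are picked
    # at a higher frequency (hypothetical weighting scheme).
    rng = rng or np.random.default_rng(0)
    d = np.asarray(divergences, dtype=float)
    scores = 1.0 / (d + 1e-8)          # favor low divergence
    probs = scores / scores.sum()
    return rng.choice(len(d), size=num_select, replace=False, p=probs)

# Example: three clients whose weights drift differently from the
# global model; client 0 is closest and should be sampled most often.
w_global = np.ones(4)
client_weights = [w_global + eps for eps in (0.01, 0.5, 1.0)]
divs = [weight_divergence(w, w_global) for w in client_weights]
chosen = select_clients(divs, num_select=2)
```

In the paper, the server would recompute these divergences each round from the uploaded client updates and feed the selection back into a FedAvg-style aggregation loop; the sketch above only shows the one-round selection step.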