Computer science
Speedup
Upload
Selection (genetic algorithm)
Convergence (economics)
Rate of convergence
Compression (physics)
Matching (statistics)
Distributed computing
Machine learning
Artificial intelligence
Parallel computing
Computer network
Channel (broadcasting)
Statistics
Materials science
Mathematics
Economics
Composite material
Economic growth
Operating system
Authors
Zhida Jiang, Yang Xu, Hongli Xu, Zhiyuan Wang, Qian Chen
Identifier
DOI: 10.1109/infocom53939.2023.10229029
Abstract
Federated learning (FL) allows multiple clients to cooperatively train models without disclosing local data. However, existing works fail to jointly address several practical concerns in FL: limited communication resources, dynamic network conditions, and heterogeneous client properties, all of which slow down the convergence of FL. To tackle these challenges, we propose a heterogeneity-aware FL framework, called FedCG, with adaptive client selection and gradient compression. Specifically, the parameter server (PS) selects a representative client subset considering statistical heterogeneity and sends the global model to them. After local training, the selected clients upload compressed model updates matching their capabilities to the PS for aggregation, which significantly alleviates the communication load and mitigates the straggler effect. We theoretically analyze the impact of both client selection and gradient compression on convergence performance. Guided by the derived convergence rate, we develop an iteration-based algorithm that jointly optimizes client selection and per-client compression ratios using submodular maximization and linear programming. Extensive experiments on both real-world prototypes and simulations show that FedCG provides up to 5.3× speedup over other methods.
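The abstract describes one communication round in terms of three mechanisms: representative client selection on the PS, capability-matched gradient compression on each selected client, and aggregation of the compressed updates. The sketch below is a minimal, hypothetical illustration of that round, not the authors' released code: the greedy coverage objective, the top-k sparsifier, and all variable names (greedy_select, top_k_compress, capacity, similarity) are assumptions standing in for the paper's submodular-maximization and compression-ratio decisions.

```python
# Hypothetical sketch of one FedCG-style round (illustrative assumptions,
# not the paper's implementation): the server greedily picks a
# representative client subset, each selected client uploads a top-k
# compressed update sized to its own uplink capability, and the server
# averages the sparse updates.
import numpy as np

DIM = 1000                       # toy model dimension
rng = np.random.default_rng(0)


def greedy_select(similarity, k):
    """Greedy maximization of a coverage-style (submodular) objective:
    choose k clients so every client is well covered by some selected one."""
    n = similarity.shape[0]
    selected, cover = [], np.zeros(n)
    for _ in range(k):
        gains = [np.maximum(cover, similarity[c]).sum() - cover.sum()
                 for c in range(n)]
        c_star = int(np.argmax(gains))
        selected.append(c_star)
        cover = np.maximum(cover, similarity[c_star])
    return selected


def top_k_compress(update, ratio):
    """Keep only the largest-magnitude coordinates; 'ratio' plays the role
    of the per-client compression ratio chosen from its capability."""
    k = max(1, int(ratio * update.size))
    idx = np.argpartition(np.abs(update), -k)[-k:]
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    return sparse


# --- one communication round over 10 toy clients -----------------------
n_clients = 10
client_updates = [rng.normal(size=DIM) for _ in range(n_clients)]  # local updates
capacity = rng.uniform(0.05, 0.5, size=n_clients)                  # uplink budgets
similarity = np.exp(-rng.uniform(size=(n_clients, n_clients)))     # statistical-similarity proxy

chosen = greedy_select(similarity, k=4)
aggregate = np.mean(
    [top_k_compress(client_updates[c], capacity[c]) for c in chosen], axis=0
)
print("selected clients:", chosen,
      "| nonzeros in aggregate:", np.count_nonzero(aggregate))
```

In the paper the compression ratios are chosen by solving a linear program guided by the derived convergence bound; the fixed `capacity` values above simply stand in for that decision to keep the example self-contained.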