Computer science
Federated learning
Exploitation
Computation
Symmetric multiprocessing
Metro train timetable
Artificial intelligence
Machine learning
Distributed computing
Computer security
Algorithm
Operating system
Authors
Kongyang Chen, Xiaoxue Zhang, Xiuhua Zhou, Bing Mi, Yatie Xiao, Lei Zhou, Zhen Wu, Lin Wu, Xiaoying Wang
Identifier
DOI: 10.1016/j.isatra.2023.04.020
Abstract
Federated learning is a novel distributed machine learning paradigm that supports cooperative model training among multiple participant clients, where each client keeps its private data locally to protect data privacy. In practical application domains, however, federated learning still faces several heterogeneity challenges, such as data heterogeneity, model heterogeneity, and computation heterogeneity, which significantly degrade global model performance. To the best of our knowledge, existing solutions focus on only one or two of these challenges in their heterogeneous settings. In this paper, to address the above challenges simultaneously, we present a novel solution called Full Heterogeneous Federated Learning (FHFL). First, we propose a synthetic data generation approach to mitigate the non-IID data heterogeneity problem. Second, we use knowledge distillation to learn from the heterogeneous models of participant clients for model aggregation at the central server. Finally, we devise an opportunistic computation scheduling strategy to exploit the idle computation resources of fast-computing clients. Experimental results on different datasets show that our FHFL method achieves excellent model training performance. We believe it will serve as pioneering work for distributed model training among heterogeneous clients in federated learning.
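The abstract names FHFL's three components but does not specify how they are implemented. As one illustrative reading of the second component (knowledge distillation for aggregating heterogeneous client models at the server), the sketch below shows a common ensemble-distillation scheme: the server averages the clients' softened predictions on a shared proxy or synthetic dataset and distills that ensemble into the global model. All names here (`distill_aggregate`, `proxy_loader`, `temperature`) are hypothetical and not taken from the paper; this is a minimal sketch under those assumptions, not the authors' actual method.

```python
# Illustrative sketch only: the paper's exact FHFL procedure is not given in
# the abstract. This shows generic server-side ensemble distillation, where
# heterogeneous client models only need to share the same output dimension.
import torch
import torch.nn.functional as F

def distill_aggregate(global_model, client_models, proxy_loader,
                      temperature=2.0, epochs=1, lr=1e-3):
    """Distill the averaged soft predictions of heterogeneous client models
    on a shared proxy/synthetic dataset into the server's global model."""
    optimizer = torch.optim.Adam(global_model.parameters(), lr=lr)
    for m in client_models:
        m.eval()
    global_model.train()
    for _ in range(epochs):
        for x, _ in proxy_loader:  # labels unused: distillation is unsupervised
            with torch.no_grad():
                # Teacher signal: mean of the clients' temperature-softened outputs.
                teacher = torch.stack(
                    [F.softmax(m(x) / temperature, dim=1) for m in client_models]
                ).mean(dim=0)
            student = F.log_softmax(global_model(x) / temperature, dim=1)
            # KL divergence between student log-probs and teacher probs,
            # rescaled by T^2 as is standard in distillation.
            loss = F.kl_div(student, teacher, reduction="batchmean") * temperature ** 2
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return global_model
```

In such a scheme, only predictions on the proxy data cross the aggregation boundary, so client architectures may differ freely; the temperature softens the teacher distribution so that the global model can learn from the clients' relative class scores rather than hard labels.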