Keywords
Computer science
Asynchronous communication
Federated learning
Convergence
Asynchronous learning
Distributed computing
Artificial intelligence
Computer network
Authors
Renhao Lu, Weizhe Zhang, Qiong Li, Hui He, Xiaoxiong Zhong, Hongwei Yang, Desheng Wang, Zenglin Xu, Mamoun Alazab
Identifier
DOI:10.1016/j.future.2023.11.001
Abstract
Federated learning enables data owners to train an artificial intelligence model collaboratively while keeping all training data local, reducing the risk of personal data breaches. However, the heterogeneity of local resources and the dynamic nature of federated learning systems pose new challenges that hinder the development of federated learning techniques. To this end, we propose an Adaptive Asynchronous Federated Learning scheme with Momentum, called FedAAM, comprising an adaptive weight allocation algorithm and a novel asynchronous federated learning framework. First, we dynamically allocate weights for the global model update using an adaptive weight allocation strategy that improves the convergence rate of models in asynchronous federated learning systems. Second, to address the challenges above, we propose two new asynchronous global update rules based on a differentiated strategy, an essential component of the proposed framework. Furthermore, the framework introduces the historical global update direction (i.e., global momentum) into the global update operation to improve training efficiency. We also prove that models trained under the FedAAM scheme achieve a sublinear convergence rate. Extensive experiments on real-world datasets demonstrate that FedAAM outperforms representative synchronous and asynchronous federated learning schemes (i.e., FedAvg and FedAsync) in convergence rate and in the ability to handle dynamic systems.
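To make the server-side mechanics described in the abstract concrete, the Python sketch below shows one plausible form of an asynchronous aggregation step: a staleness-dependent adaptive weight scales each arriving client update, and a global momentum buffer carries the historical global update direction into the new global model. This is a minimal illustration under assumed forms, not the FedAAM algorithm itself: the function and parameter names (`server_update`, `base_lr`, `beta`) and the `1/sqrt(1 + staleness)` decay are our own choices for concreteness, while the paper's actual weight allocation and differentiated update rules are defined in the full text.

```python
import numpy as np

def server_update(global_model, client_update, client_version,
                  server_version, momentum_buffer,
                  base_lr=1.0, beta=0.9):
    """One asynchronous aggregation step (illustrative sketch only).

    Combines a staleness-adaptive weight with a global momentum term;
    the specific decay and momentum rule are assumptions, not FedAAM's.
    """
    # Staleness: how many global versions the client's update lags behind.
    staleness = server_version - client_version

    # Adaptive weight: staler updates get a smaller say in the global
    # model (assumed 1/sqrt decay, chosen only for illustration).
    alpha = base_lr / np.sqrt(1.0 + staleness)

    # Direction proposed by the client, relative to the current global model.
    delta = client_update - global_model

    # Fold the fresh client direction into the historical global update
    # direction (global momentum).
    momentum_buffer = beta * momentum_buffer + (1.0 - beta) * delta

    # Apply the weighted, momentum-smoothed step to the global model.
    new_global = global_model + alpha * momentum_buffer
    return new_global, momentum_buffer

# Example: a client trained from global version 3 while the server has
# already advanced to version 5, so its update arrives with staleness 2.
dim = 4
w = np.zeros(dim)         # current global model
m = np.zeros(dim)         # global momentum buffer
client_w = np.ones(dim)   # hypothetical locally trained weights
w, m = server_update(w, client_w, client_version=3, server_version=5,
                     momentum_buffer=m)
```

Decaying the weight with staleness is what lets an asynchronous server accept updates whenever they arrive without letting badly outdated clients drag the global model backwards, while the momentum buffer smooths the update direction across clients.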