Computer science
Asynchronous communication
Distributed computing
Synchronization (alternating current)
MNIST database
The Internet
Federated learning
Computer network
Data aggregator
Bandwidth (computing)
Artificial intelligence
Wireless sensor network
Deep learning
Channel (broadcasting)
World Wide Web
Authors
Zhigang Yang,Xuhua Zhang,Dapeng Wang,Ruyan Wang,Puning Zhang,Yu Wu
Source
Journal: IEEE Internet of Things Journal
[Institute of Electrical and Electronics Engineers]
Date: 2023-05-01
Volume/Issue: 10 (9): 7737-7748
Cited by: 1
Identifier
DOI: 10.1109/jiot.2022.3230412
Abstract
Federated learning (FL) is a distributed machine learning paradigm that ensures data never leave local devices. FL can therefore address data-sharing problems in untrusted environments such as the Internet of Vehicles (IoV). However, FL must frequently exchange massive numbers of parameters to reach preset model goals. In addition, bandwidth fluctuations and communication delays caused by user mobility make it difficult to keep model parameters synchronized. In this article, an efficient hierarchical asynchronous FL (EHAFL) algorithm is proposed that dynamically adjusts the encoding length according to the available bandwidth, substantially reducing the communication cost. A dynamic hierarchical asynchronous aggregation mechanism is further proposed, leveraging gradient sparsification and asynchronous aggregation to cut communication costs and improve the aggregation efficiency of the global model. Simulation results on MNIST and real-world data sets show that the proposed solution reduces communication costs by 98% while compromising model accuracy by only 1%.
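The gradient sparsification mentioned in the abstract is commonly implemented as top-k selection: each client transmits only the largest-magnitude gradient components and their indices. The sketch below is an illustration of that general technique, not the paper's actual EHAFL algorithm; the function name, the 2% keep ratio (chosen to echo the ~98% communication saving reported above), and the use of NumPy are all assumptions.

```python
import numpy as np

def topk_sparsify(grad, ratio=0.02):
    """Keep only the largest-magnitude `ratio` fraction of gradient
    entries and zero out the rest. Returns the sparsified gradient and
    the indices of the retained entries (what a client would send).
    Illustrative sketch only; not the paper's exact mechanism."""
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    # Indices of the k largest-magnitude components.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(grad.shape), idx

# With ratio=0.02, a client transmits 2% of the entries (values +
# indices) instead of the dense gradient.
g = np.random.randn(1000)
sparse_g, kept = topk_sparsify(g, ratio=0.02)
```

In an asynchronous setting like the one the abstract describes, the server would apply such sparse updates as they arrive, typically weighting stale updates down, rather than waiting for every client each round.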