Computer science
Pruning
Benchmark (surveying)
Edge device
Artificial intelligence
Computation
Enhanced Data Rates for GSM Evolution (EDGE)
Machine learning
Deep learning
Overhead (engineering)
Cloud computing
Algorithm
Agronomy
Biology
Geodesy
Geography
Operating system
Authors
Jiahao Du,Na Qin,Deqing Huang,Xinming Jia,Yiming Zhang
Identifier
DOI:10.1109/tim.2023.3328073
Abstract
Due to data security concerns, federated learning (FL) incurs significant computation and communication costs, which lower overall training efficiency. This research proposes a new federated learning framework, Lightweight FL, that addresses this problem by enhancing the fundamental FL processes. First, a local network combining several lightweight training techniques is designed to lower the cost of local model training through small-scale convolution computation. Second, on this basis, unstructured pruning and fine-tuning of the local model reduce computation costs by decreasing network complexity. Third, an optimal selection strategy is proposed for the model pruning and model aggregation stages: the best-performing model is chosen as the benchmark model for the next iteration of learning. This strategy reduces communication costs and improves the learning efficiency of the framework. Experiments on bearing, gearbox, and bogie datasets verify that the framework effectively decreases learning costs while still assuring good model performance, offering a workable option for future federated learning deployment on low-performance edge devices.
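Two of the ideas in the abstract, unstructured magnitude pruning and selecting the best-performing model as the next round's benchmark, can be sketched as follows. This is a minimal illustration under assumed details, not the paper's actual implementation; all function names and the magnitude-threshold criterion are illustrative.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Unstructured pruning: zero out the smallest-magnitude weights.

    `sparsity` is the fraction of weights to remove (0.0-1.0).
    """
    k = int(weights.size * sparsity)
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= threshold] = 0.0
    return pruned

def select_benchmark(models, scores):
    """Pick the best-scoring candidate as the benchmark model
    for the next learning iteration."""
    return models[int(np.argmax(scores))]

# Usage: prune a random weight matrix to 50% sparsity, then pick
# the benchmark among three hypothetical client models by score.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
w_pruned = magnitude_prune(w, sparsity=0.5)
best = select_benchmark(["client_a", "client_b", "client_c"],
                        [0.82, 0.91, 0.77])
```

After pruning, the zeroed weights would be fine-tuned away or kept sparse; in an FL setting, only the benchmark model's parameters need to be broadcast, which is one way communication cost can drop.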