Adaptive Block-Wise Regularization and Knowledge Distillation for Enhancing Federated Learning

Keywords: Computer Science · Block-Wise Partitioning · Regularization · Artificial Intelligence · Heuristics · Machine Learning · Artificial Neural Networks · Theoretical Computer Science · Mathematics · Geometry
Authors
Jianchun Liu, Qingmin Zeng, Hongli Xu, Zhiyuan Wang, He Huang
Source
Journal: IEEE/ACM Transactions on Networking (Institute of Electrical and Electronics Engineers)
Volume/issue: pp. 1-15
Identifier
DOI: 10.1109/tnet.2023.3301972
Abstract

Federated Learning (FL) is a distributed model training framework that allows multiple clients to collaborate on training a global model without disclosing their local data in edge computing (EC) environments. However, FL usually faces statistical heterogeneity (e.g., non-IID data) and system heterogeneity (e.g., varied computing and communication capabilities), resulting in poor model training performance. To address these two challenges, we propose an efficient FL framework, named FedBR, which integrates block-wise regularization and knowledge distillation (KD) into the pioneering FL algorithm FedAvg, for resource-constrained edge computing. Specifically, we first divide the model into multiple blocks according to the layer order of the deep neural network (DNN). The server sends only some consecutive model blocks, instead of the entire model, to clients for communication efficiency. The clients then use knowledge distillation to absorb the knowledge of the global model blocks, alleviating statistical heterogeneity during local training. We provide a theoretical convergence guarantee for FedBR and show that the convergence bound decreases as the number of model blocks sent by the server increases. Moreover, since sending more model blocks incurs higher computing and communication costs, we design a heuristic algorithm (GMBS) to determine the appropriate number of model blocks for each client according to its data distribution, computing capability, and communication capability. Extensive experimental results show that FedBR can reduce bandwidth consumption by about 31% and achieve an average accuracy improvement of around 5.6% compared with the baselines under heterogeneous settings.
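The two core mechanisms the abstract describes — splitting a DNN into contiguous layer blocks of which the server transmits only a prefix, and a client objective that adds a KD term pulling local training toward the received global blocks — can be sketched as follows. This is a minimal illustration assuming the paper's setup; the function names (`partition_into_blocks`, `blocks_to_send`, `local_loss`) and the equal-split policy are illustrative, not taken from the paper, and the GMBS block-selection heuristic is not reproduced here.

```python
def partition_into_blocks(layers, num_blocks):
    """Split an ordered list of DNN layers into contiguous blocks,
    following layer order as described in the paper."""
    base, extra = divmod(len(layers), num_blocks)
    blocks, start = [], 0
    for i in range(num_blocks):
        end = start + base + (1 if i < extra else 0)
        blocks.append(layers[start:end])
        start = end
    return blocks

def blocks_to_send(blocks, n):
    """Server-side: send only the first n consecutive blocks to a
    client instead of the entire model (communication efficiency)."""
    return blocks[:n]

def local_loss(task_loss, kd_loss, mu):
    """Client-side objective: local task loss plus a distillation term
    that regularizes local blocks toward the received global blocks.
    mu is an assumed trade-off weight."""
    return task_loss + mu * kd_loss

# Example: a 6-layer model split into 3 blocks; the server ships 2 of them.
layers = ["conv1", "conv2", "conv3", "fc1", "fc2", "out"]
blocks = partition_into_blocks(layers, 3)
sent = blocks_to_send(blocks, 2)   # [["conv1", "conv2"], ["conv3", "fc1"]]
```

In this sketch, sending more blocks (larger `n`) tightens the regularization toward the global model — consistent with the stated convergence bound — at the cost of more communication, which is the trade-off GMBS is designed to balance per client.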