FedICT: Federated Multi-Task Distillation for Multi-Access Edge Computing

Authors
Zhiyuan Wu, Sheng Sun, Yuwei Wang, Min Liu, Quyang Pan, Xuefeng Jiang, Bo Gao
Source
Journal: IEEE Transactions on Parallel and Distributed Systems [Institute of Electrical and Electronics Engineers]
Volume/Issue: 35 (6): 1107-1121 · Cited by: 24
Identifier
DOI: 10.1109/tpds.2023.3289444
Abstract

The growing interest in intelligent services and privacy protection for mobile devices has given rise to the widespread application of federated learning in Multi-access Edge Computing (MEC). Diverse user behaviors call for personalized services with heterogeneous Machine Learning (ML) models on different devices. Federated Multi-task Learning (FMTL) is proposed to train related but personalized ML models for different devices, whereas previous works suffer from excessive communication overhead during training and neglect the model heterogeneity among devices in MEC. Introducing knowledge distillation into FMTL can simultaneously enable efficient communication and model heterogeneity among clients, whereas existing methods rely on a public dataset, which is impractical in reality. To tackle this dilemma, Federated MultI-task Distillation for Multi-access Edge CompuTing (FedICT) is proposed. FedICT directs local-global knowledge aloof during bi-directional distillation processes between clients and the server, aiming to enable multi-task clients while alleviating client drift derived from divergent optimization directions of client-side local models. Specifically, FedICT includes Federated Prior Knowledge Distillation (FPKD) and Local Knowledge Adjustment (LKA). FPKD is proposed to reinforce the clients' fitting of local data by introducing prior knowledge of local data distributions. Moreover, LKA is proposed to correct the distillation loss of the server, making the transferred local knowledge better match the generalized representation. Experiments on three datasets show that FedICT significantly outperforms all compared benchmarks under various data heterogeneity and model architecture settings, achieving improved accuracy with less than 1.2% of the training communication overhead of FedAvg and no more than 75% of the training communication rounds of FedGKT.
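The abstract describes two loss-level components: FPKD, which injects prior knowledge of the local label distribution into client-side distillation, and LKA, which corrects the server-side distillation loss. The full method is defined in the paper itself; the sketch below is only a minimal, hypothetical numpy illustration of how such losses could be shaped — the bias-by-log-prior form of `fpkd_loss` and the agreement filter in `lka_server_loss` are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-softened softmax over the last axis."""
    z = z / t
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, t=2.0):
    """Standard soft-label distillation: KL(teacher || student) on softened logits."""
    p = softmax(teacher_logits, t)
    q = softmax(student_logits, t)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * t * t)

def fpkd_loss(student_logits, teacher_logits, labels, class_prior, t=2.0, alpha=0.5):
    """Hypothetical FPKD-style client loss: bias the teacher's logits by the log
    of the client's local label distribution before distilling, so the soft
    targets reflect local prior knowledge, then mix with a hard-label term."""
    biased_teacher = teacher_logits + np.log(class_prior + 1e-12)
    soft = kd_loss(student_logits, biased_teacher, t)
    p_student = softmax(student_logits)
    hard = float(-np.log(p_student[np.arange(len(labels)), labels] + 1e-12).mean())
    return alpha * hard + (1 - alpha) * soft

def lka_server_loss(server_logits, client_logits, labels, t=2.0):
    """Hypothetical LKA-style correction: distill into the server only from
    client predictions that agree with the ground truth, so misleading local
    knowledge does not distort the generalized representation."""
    agree = client_logits.argmax(axis=-1) == labels
    if not agree.any():
        return 0.0
    return kd_loss(server_logits[agree], client_logits[agree], t)
```

In a bi-directional round, a client would minimize something like `fpkd_loss` against the server's soft labels, while the server aggregates client logits under a filter like `lka_server_loss`; both directions exchange only logits, which is what keeps communication far below parameter-exchanging baselines such as FedAvg.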