FedViT: Federated continual learning of vision transformer at edge

Authors
Xiaojiang Zuo, Yaxin Luopan, Rui Han, Qinglong Zhang, Chi Harold Liu, Guoren Wang, Lydia Y. Chen
Source
Journal: Future Generation Computer Systems [Elsevier BV]
Volume/Issue: 154: 1-15, cited by: 4
Identifier
DOI: 10.1016/j.future.2023.11.038
Abstract

Deep Neural Networks (DNNs) have been ubiquitously adopted in the Internet of Things and are becoming an integral part of our daily life. When tackling evolving learning tasks in the real world, such as classifying different types of objects, DNNs face the challenge of continually retraining themselves according to the tasks arriving on different edge devices. Federated continual learning (FCL) is a promising technique that offers partial solutions but has yet to overcome the following difficulties: the significant accuracy loss due to limited on-device processing, the negative knowledge transfer caused by the limited communication of non-IID (non-Independent and Identically Distributed) data, and the limited scalability in the number of tasks and edge devices. Moreover, existing FCL techniques are designed for convolutional neural networks (CNNs) and have not exploited the full potential of the newly emerged, powerful vision transformers (ViTs). Considering that ViTs depend heavily on the diversity and volume of training data, we hypothesize that ViTs are well suited to FCL, where data arrives continually. In this paper, we propose FedViT, an accurate and scalable federated continual learning framework for ViT models, built on the novel concept of signature task knowledge. FedViT is a client-side solution that continuously extracts and integrates the knowledge of signature tasks, i.e., tasks that are highly influenced by the current task. Each client of FedViT is composed of a knowledge extractor, a gradient restorer, and, most importantly, a gradient integrator. When training on a new task, the gradient integrator prevents catastrophic forgetting and mitigates negative knowledge transfer by effectively combining the gradients of signature tasks identified from past local tasks with those of other clients' current tasks obtained through the global model. We implement FedViT in PyTorch and extensively evaluate it against state-of-the-art techniques on popular federated continual learning benchmarks. Evaluation results on heterogeneous edge devices show that FedViT improves model accuracy by 88.61% without increasing model training time, reduces communication cost by 61.55%, and achieves larger improvements in difficult scenarios such as large numbers of tasks or clients and training more complex ViT models.
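The abstract describes the client-side pipeline (knowledge extractor, gradient restorer, gradient integrator) only at a high level. The sketch below illustrates one plausible reading of that loop in PyTorch, the framework the paper uses. The function names (flat_grad, client_update) and the A-GEM-style gradient projection used as the integration step are assumptions made for illustration; the paper's actual signature-task selection and gradient-integration rules are not specified in the abstract and may differ.

```python
# Minimal illustrative sketch, NOT the authors' implementation.
# Assumes: stored mini-batches from "signature" tasks are available on the
# client, and integration is approximated by an A-GEM-style projection.
import torch
import torch.nn.functional as F


def flat_grad(model):
    """Flatten the current parameter gradients into a single vector."""
    return torch.cat([p.grad.reshape(-1) for p in model.parameters()
                      if p.grad is not None])


def client_update(model, current_batch, signature_batches, optimizer):
    """One local step: compute the new-task gradient, restore signature-task
    gradients, integrate them, then apply the update."""
    x, y = current_batch

    # Knowledge extraction: gradient of the loss on the current task.
    optimizer.zero_grad()
    F.cross_entropy(model(x), y).backward()
    g_cur = flat_grad(model)

    # Gradient restoration: gradients replayed from stored signature-task data.
    g_sigs = []
    for xs, ys in signature_batches:
        optimizer.zero_grad()
        F.cross_entropy(model(xs), ys).backward()
        g_sigs.append(flat_grad(model))

    # Gradient integration (assumed A-GEM-style rule): if the current gradient
    # conflicts with the averaged signature-task gradient, project it onto the
    # non-conflicting half-space to limit forgetting.
    g = g_cur
    if g_sigs:
        g_ref = torch.stack(g_sigs).mean(dim=0)
        dot = torch.dot(g, g_ref)
        if dot < 0:
            g = g - dot / (g_ref.norm() ** 2 + 1e-12) * g_ref

    # Write the integrated gradient back into the model and take one step.
    offset = 0
    for p in model.parameters():
        if p.grad is None:
            continue
        n = p.grad.numel()
        p.grad.copy_(g[offset:offset + n].view_as(p.grad))
        offset += n
    optimizer.step()
```

Under this reading, the stored signature-task batches serve as the compact memory produced by the knowledge extractor, and the projection step is what keeps the new-task update from undoing previously learned tasks or importing conflicting knowledge from other clients.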