FedTune: A Deep Dive into Efficient Federated Fine-Tuning with Pre-trained Transformers

Authors
Jinyu Chen, Wenchao Xu, Song Guo, Junxiao Wang, Jie Zhang, Haozhao Wang
Source
Journal: Cornell University - arXiv · Cited by: 7
Identifier
DOI: 10.48550/arxiv.2211.08025
Abstract

Federated Learning (FL) is an emerging paradigm that enables distributed users to collaboratively and iteratively train machine learning models without sharing their private data. Motivated by the effectiveness and robustness of self-attention-based architectures, researchers are turning to pre-trained Transformers (i.e., foundation models) instead of traditional convolutional neural networks in FL to leverage their excellent transfer learning capabilities. Despite recent progress, it remains unclear what role pre-trained Transformer models play in FL, that is, how to efficiently fine-tune these pre-trained models in FL and how FL users could benefit from this new paradigm. In this paper, we explore this issue and demonstrate that fine-tuned Transformers achieve excellent performance in FL, and that lightweight fine-tuning methods facilitate fast convergence and low communication costs. Concretely, we conduct a rigorous empirical study of three tuning methods (i.e., modifying the input, adding extra modules, and adjusting the backbone) using two types of pre-trained models (i.e., vision-language models and vision models) for FL. Our experiments show that 1) Fine-tuning the bias terms of the backbone performs best when relying on a strong pre-trained model; 2) The vision-language model (e.g., CLIP) outperforms the pure vision model (e.g., ViT) and is more robust in few-shot settings; 3) Compared to pure local training, FL with pre-trained models achieves higher accuracy because it alleviates over-fitting. We will release our code and encourage further exploration of pre-trained Transformers and FL.
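To make the bias-tuning finding concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released code) of the third tuning method combined with a FedAvg-style loop: each client freezes everything except the bias terms of a shared pre-trained backbone, trains locally, and uploads only those bias vectors, which the server averages and loads back into the global model. The function names (e.g., freeze_all_but_bias, fedavg_bias) and the toy backbone are illustrative assumptions.

```python
import torch
import torch.nn as nn


def freeze_all_but_bias(model):
    """Freeze every parameter except the bias terms; return the tuned names."""
    tuned = []
    for name, param in model.named_parameters():
        param.requires_grad = name.endswith("bias")
        if param.requires_grad:
            tuned.append(name)
    return tuned


def local_bias_update(model, loader, lr=1e-3, epochs=1):
    """One client's local training pass; returns only the updated bias tensors."""
    opt = torch.optim.SGD([p for p in model.parameters() if p.requires_grad], lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return {n: p.detach().clone()
            for n, p in model.named_parameters() if p.requires_grad}


def fedavg_bias(client_states):
    """Server-side FedAvg over the uploaded bias tensors (uniform client weights)."""
    return {k: torch.stack([s[k] for s in client_states]).mean(dim=0)
            for k in client_states[0]}


def apply_bias(model, bias_state):
    """Load the averaged bias vectors back into the frozen global backbone."""
    model.load_state_dict(bias_state, strict=False)


if __name__ == "__main__":
    # Toy stand-in for a pre-trained backbone; a real run would load CLIP or a ViT.
    model = nn.Sequential(nn.Flatten(), nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 4))
    print(freeze_all_but_bias(model))  # only '*.bias' entries stay trainable
```

Under this kind of setup, only the bias vectors travel between clients and server each round, a small fraction of the full backbone, which is consistent with the abstract's claim that lightweight fine-tuning yields fast convergence and low communication costs.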