Personalization
Computer Science
Convolutional Neural Network
Plugin
Transformer
Layer (neural networks)
Federated Learning
Data Modeling
Machine Learning
Artificial Intelligence
Transfer Learning
Authors
Guangyu Sun, Matías Mendieta, Jun Luo, Shandong Wu, Chen Chen
DOI
10.1109/iccv51070.2023.00460
Abstract
Personalized Federated Learning (PFL) represents a promising solution for decentralized learning in heterogeneous data environments. Partial model personalization has been proposed to improve the efficiency of PFL by selectively updating local model parameters instead of aggregating all of them. However, previous work on partial model personalization has mainly focused on Convolutional Neural Networks (CNNs), leaving a gap in understanding how it can be applied to other popular models such as Vision Transformers (ViTs). In this work, we investigate where and how to partially personalize a ViT model. Specifically, we empirically evaluate the sensitivity to data distribution of each type of layer. Based on the insights that the self-attention layer and the classification head are the most sensitive parts of a ViT, we propose a novel approach called FedPerfix, which leverages plugins to transfer information from the aggregated model to the local client as a personalization. Finally, we evaluate the proposed approach on CIFAR-100, OrganAMNIST, and Office-Home datasets and demonstrate its effectiveness in improving the model's performance compared to several advanced PFL methods. Code is available at https://github.com/imguangyu/FedPerfix.
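To make the core idea concrete, below is a minimal sketch (not the authors' released code; see the GitHub link above for that) of the partial-personalization split the abstract describes: self-attention parameters and the classification head stay local to each client, while the remaining parameters are averaged FedAvg-style. The parameter-name filters ("self_attention", "heads") are assumptions that match torchvision's ViT naming; FedPerfix's actual prefix-style plugin modules are not reproduced here.

```python
# Sketch of partial model personalization for a ViT (assumed names, not FedPerfix itself):
# self-attention and classification-head parameters are kept local,
# everything else is aggregated across clients.
import copy
import torch
from torchvision.models import vit_b_16

PERSONAL_KEYS = ("self_attention", "heads")  # personalized parameter groups (torchvision ViT naming)

def is_personal(name: str) -> bool:
    """True if a parameter should stay local (never aggregated)."""
    return any(key in name for key in PERSONAL_KEYS)

def fedavg_shared(client_states, weights):
    """Weighted-average only the shared (non-personal) parameters."""
    avg = copy.deepcopy(client_states[0])  # personal entries here are unused
    for name in avg:
        if is_personal(name):
            continue  # each client keeps its own copy of these
        avg[name] = torch.stack(
            [w * s[name].float() for s, w in zip(client_states, weights)]
        ).sum(dim=0)
    return avg

def load_shared(model, global_state):
    """Overwrite a client's shared parameters with the aggregated ones."""
    local = model.state_dict()
    for name, tensor in global_state.items():
        if not is_personal(name):
            local[name] = tensor
    model.load_state_dict(local)

if __name__ == "__main__":
    # Two toy clients, e.g. for CIFAR-100 (100 classes).
    clients = [vit_b_16(num_classes=100) for _ in range(2)]
    states = [c.state_dict() for c in clients]
    global_state = fedavg_shared(states, weights=[0.5, 0.5])
    for c in clients:
        load_shared(c, global_state)
```

This split reflects the paper's empirical finding that self-attention layers and the classification head are the most data-distribution-sensitive parts of a ViT; the proposed FedPerfix additionally inserts plugin modules to carry information from the aggregated model into each local model rather than simply freezing the personalized parts.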