Keywords
Computer science, Personalization, Field (mathematics), Focus (optics), Server-side, Server, Federated learning, Database, World Wide Web, Distributed computing, Physics, Mathematics, Pure mathematics, Optics
Authors
Jaehun Song, Min-hwan Oh, Hyung-Sin Kim
Source
Journal: IEEE Access (Institute of Electrical and Electronics Engineers)
Date: 2022-01-01
Volume/Pages: 10, 120245-120255
Citations: 3
Identifier
DOI: 10.1109/access.2022.3221401
Abstract
Personalized Federated Learning (FL) is an emerging research field in FL that learns an easily adaptable global model in the presence of data heterogeneity among clients. However, one of the main challenges for personalized FL is the heavy reliance on clients’ computing resources to calculate higher-order gradients, since client data is segregated from the server to ensure privacy. To resolve this, we focus on a problem setting where the server may possess data independent of clients’ data, a setting prevalent in various applications yet relatively unexplored in the existing literature. Specifically, we propose FedSIM, a new method for personalized FL that actively utilizes such server data to improve meta-gradient calculation in the server for increased personalization performance. Experimentally, we demonstrate through various benchmarks and ablations that FedSIM is superior to existing methods in terms of accuracy, is more computationally efficient because it calculates the full meta-gradients in the server, and converges up to 34.2% faster.
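The abstract describes the core idea at a high level: offloading the expensive higher-order (meta-gradient) part of a MAML-style personalized-FL update from clients to a server that holds its own data. The following is a minimal, single-process sketch of that idea under those assumptions; the toy model, synthetic data, learning rates, and names (w, X_server, inner_lr, ...) are illustrative, not the authors' FedSIM implementation.

```python
# Minimal single-process sketch of a MAML-style round in which the
# meta-gradient is computed against server-held data (the idea the
# abstract attributes to FedSIM). All names and values are illustrative
# assumptions, not the paper's actual algorithm or code.
import torch

torch.manual_seed(0)

# Toy linear model: a single weight matrix, kept simple for brevity.
w = torch.randn(5, 1, requires_grad=True)
inner_lr, outer_lr = 0.1, 0.05

def mse(params, X, y):
    return ((X @ params - y) ** 2).mean()

# Hypothetical datasets: one client's local data and independent server data.
X_client, y_client = torch.randn(32, 5), torch.randn(32, 1)
X_server, y_server = torch.randn(32, 5), torch.randn(32, 1)

# Inner adaptation (in FL this step runs on the client): one SGD step on
# local data, keeping the graph so it can be differentiated through later.
inner_grad = torch.autograd.grad(mse(w, X_client, y_client), w, create_graph=True)[0]
w_adapted = w - inner_lr * inner_grad  # personalized parameters

# Server side: evaluate the adapted model on the server's own data and
# backpropagate through the inner step, so the second-order (meta) term
# is computed here rather than on the resource-constrained client.
meta_loss = mse(w_adapted, X_server, y_server)
meta_grad = torch.autograd.grad(meta_loss, w)[0]

# Meta-update of the global model.
with torch.no_grad():
    w -= outer_lr * meta_grad

print("meta-gradient norm:", meta_grad.norm().item())
```

In an actual federated deployment the inner step would be computed by each client on its private data, and only the adapted parameters would reach the server, which would then use its own data to form the meta-loss and backpropagate through the inner step; the sketch collapses both roles into one process purely to show where the higher-order gradient is taken.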