Generalization
Computer science
Semantics (computer science)
Personalization
Class (philosophy)
Artificial intelligence
Machine learning
Key (lock)
Space (punctuation)
Stability (learning theory)
World Wide Web
Mathematics
Programming language
Computer security
Operating system
Mathematical analysis
Authors
Yutong Dai, Zeyuan Chen, Junnan Li, Shelby Heinecke, Lichao Sun, Ran Xu
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2023-06-26
Volume/Issue: 37 (6): 7314-7322
Cited by: 24
Identifiers
DOI:10.1609/aaai.v37i6.25891
Abstract
Data heterogeneity across clients in federated learning (FL) settings is a widely acknowledged challenge. In response, personalized federated learning (PFL) emerged as a framework to curate local models for clients' tasks. In PFL, a common strategy is to develop local and global models jointly - the global model (for generalization) informs the local models, and the local models (for personalization) are aggregated to update the global model. A key observation is that if we can improve the generalization ability of local models, then we can improve the generalization of global models, which in turn builds better personalized models. In this work, we consider class imbalance, an overlooked type of data heterogeneity, in the classification setting. We propose FedNH, a novel method that improves the local models' performance for both personalization and generalization by combining the uniformity and semantics of class prototypes. FedNH initially distributes class prototypes uniformly in the latent space and smoothly infuses the class semantics into class prototypes. We show that imposing uniformity helps to combat prototype collapse while infusing class semantics improves local models. Extensive experiments were conducted on popular classification datasets under the cross-device setting. Our results demonstrate the effectiveness and stability of our method over recent works.
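As a rough illustration of the two ideas the abstract names (uniformity and semantic infusion of class prototypes), the sketch below initializes prototypes by repelling them on the unit hypersphere and then blends in class semantics (e.g., per-class mean features) via a smooth moving-average update. This is not the authors' implementation of FedNH; the repulsion heuristic, function names, and the smoothing parameter `rho` are all assumptions for illustration.

```python
import numpy as np

def init_uniform_prototypes(num_classes, dim, iters=200, lr=0.1, seed=0):
    """Spread class prototypes over the unit hypersphere by iteratively
    pushing them away from each other (a simple repulsion heuristic,
    standing in for FedNH's uniform initialization)."""
    rng = np.random.default_rng(seed)
    P = rng.normal(size=(num_classes, dim))
    P /= np.linalg.norm(P, axis=1, keepdims=True)
    for _ in range(iters):
        sim = P @ P.T                      # pairwise cosine similarities
        np.fill_diagonal(sim, 0.0)         # ignore self-similarity
        grad = sim @ P                     # similarity-weighted repulsion
        P -= lr * grad
        P /= np.linalg.norm(P, axis=1, keepdims=True)  # stay on sphere
    return P

def infuse_semantics(prototypes, class_means, rho=0.9):
    """Smoothly blend class semantics (per-class mean features) into the
    prototypes with a moving average, then renormalize. rho close to 1
    keeps the uniform structure and infuses semantics gradually."""
    P = rho * prototypes + (1.0 - rho) * class_means
    return P / np.linalg.norm(P, axis=1, keepdims=True)
```

In a federated round, a server following this idea would keep the prototypes fixed in structure while clients contribute class means; the `rho`-weighted update then nudges each prototype toward its class semantics without collapsing the uniform layout.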