Computer science
Federated learning
Diversity (cybernetics)
Cluster analysis
Transfer of learning
Range (aeronautics)
Transmission (computing)
Machine learning
Artificial intelligence
Data mining
Distribution (mathematics)
Engineering
Mathematics
Mathematical analysis
Aerospace engineering
Parallel computing
Authors
Wenxuan Bao, Haohan Wang, Jun Wu, Jingrui He
Source
Journal: Cornell University - arXiv
Date: 2023-06-10
Citations: 2
Identifier
DOI: 10.48550/arxiv.2306.06508
Abstract
In federated learning (FL), multiple clients collaborate to train machine learning models while keeping their data decentralized. Despite utilizing more training data, FL can suffer from the negative transfer problem: the global FL model may even perform worse than models trained on local data only. In this paper, we propose FedCollab, a novel FL framework that alleviates negative transfer by clustering clients into non-overlapping coalitions based on their distribution distances and data quantities. As a result, each client collaborates only with clients that have similar data distributions, and tends to collaborate with more clients when it has less data. We evaluate our framework with a variety of datasets, models, and types of non-IIDness. Our results demonstrate that FedCollab effectively mitigates negative transfer across a wide range of FL algorithms and consistently outperforms other clustered FL algorithms.
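The coalition-formation idea described in the abstract can be illustrated with a small sketch. Note that the greedy merge heuristic, the cost proxy (intra-coalition distribution distance plus a C / sqrt(total data) estimation term), and the constant C below are assumptions made for illustration only; they are not FedCollab's actual solver or error bound. The sketch only shows how pairwise distribution distances and per-client data quantities can jointly drive the clustering, so that a data-poor client prefers a larger coalition while a data-rich client stays with close distributions.

```python
# Illustrative sketch of distance- and quantity-driven coalition formation.
# NOTE: the greedy heuristic, the cost proxy, and C are assumptions for
# illustration; this is not FedCollab's actual optimization procedure.
import math
from itertools import combinations

def coalition_cost(coalition, dist, n, C=1.0):
    """Proxy for a coalition's client error: average intra-coalition
    distribution distance plus a C / sqrt(total data) estimation term,
    so clients with little data benefit from joining larger coalitions."""
    total_n = sum(n[i] for i in coalition)
    if len(coalition) == 1:
        avg_dist = 0.0
    else:
        pairs = list(combinations(coalition, 2))
        avg_dist = sum(dist[i][j] for i, j in pairs) / len(pairs)
    return avg_dist + C / math.sqrt(total_n)

def greedy_coalitions(dist, n, C=1.0):
    """Start from singleton coalitions; repeatedly merge the pair of
    coalitions whose merge most reduces the summed cost proxy."""
    coalitions = [[i] for i in range(len(n))]
    while True:
        best_gain, best_pair = 0.0, None
        for a in range(len(coalitions)):
            for b in range(a + 1, len(coalitions)):
                merged = coalitions[a] + coalitions[b]
                gain = (coalition_cost(coalitions[a], dist, n, C)
                        + coalition_cost(coalitions[b], dist, n, C)
                        - coalition_cost(merged, dist, n, C))
                if gain > best_gain:
                    best_gain, best_pair = gain, (a, b)
        if best_pair is None:
            return coalitions  # no merge improves the proxy; stop
        a, b = best_pair
        merged = coalitions[a] + coalitions[b]
        coalitions = [c for k, c in enumerate(coalitions) if k not in (a, b)]
        coalitions.append(merged)

# Toy example: clients 0 and 1 share one distribution, clients 2 and 3 another.
dist = [[0.0, 0.1, 0.9, 0.9],
        [0.1, 0.0, 0.9, 0.9],
        [0.9, 0.9, 0.0, 0.1],
        [0.9, 0.9, 0.1, 0.0]]
n = [50, 500, 50, 500]
print(greedy_coalitions(dist, n))  # expect coalitions {0, 1} and {2, 3}
```

In this toy run the data-poor clients (n = 50) gain from merging with a distributionally close, data-rich partner, while merging the two distant coalitions never pays off, matching the abstract's claim that clients collaborate with similar peers and seek more collaborators when they hold less data.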