Authors
Yihan Yan, Xiaojun Tong, Shen Wang
Identifier
DOI:10.1109/tnnls.2023.3264740
Abstract
Federated learning (FL) is a distributed machine learning framework that allows resource-constrained clients to train a global model jointly without compromising data privacy. Although FL is widely adopted, high degrees of systems and statistical heterogeneity remain two main challenges, which can lead to divergence and nonconvergence. Clustered FL addresses statistical heterogeneity directly by discovering the geometric structure of clients with different data-generating distributions and learning multiple global models. The number of clusters encodes prior knowledge about the clustering structure and has a significant impact on the performance of clustered FL methods. Existing clustered FL methods cannot adaptively infer the optimal number of clusters in environments with high systems heterogeneity. To address this issue, we propose an iterative clustered FL (ICFL) framework in which the server dynamically discovers the clustering structure by successively performing incremental clustering and clustering within one iteration. We focus on the average connectivity within each cluster and, based on mathematical analysis, derive incremental clustering and clustering methods that are compatible with ICFL. We evaluate ICFL in experiments with high degrees of systems and statistical heterogeneity, multiple datasets, and both convex and nonconvex objectives. The experimental results verify our theoretical analysis and show that ICFL outperforms several clustered FL baseline methods.
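To make the idea of clustered aggregation concrete, the following is a minimal illustrative sketch, not the authors' ICFL algorithm: client updates are grouped greedily by cosine similarity (a stand-in for the incremental-clustering step, where a new cluster is opened when no existing one is similar enough), and one global model is produced per cluster. All function names and the `threshold` parameter are hypothetical choices for this illustration.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two update vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def cluster_updates(updates, threshold=0.5):
    """Greedy single-pass clustering (illustrative only): assign each
    client update to the first cluster whose centroid is similar enough,
    otherwise open a new cluster, so the number of clusters is inferred
    from the data rather than fixed in advance."""
    clusters = []  # each entry is a list of update vectors
    for u in updates:
        for c in clusters:
            centroid = np.mean(c, axis=0)
            if cosine(u, centroid) >= threshold:
                c.append(u)
                break
        else:
            clusters.append([u])
    return clusters

def aggregate(clusters):
    """One global model per cluster: the mean of its members' updates."""
    return [np.mean(c, axis=0) for c in clusters]

# Two well-separated synthetic client populations -> two clusters expected.
rng = np.random.default_rng(0)
a = [np.array([1.0, 0.0]) + 0.01 * rng.normal(size=2) for _ in range(3)]
b = [np.array([0.0, 1.0]) + 0.01 * rng.normal(size=2) for _ in range(3)]
clusters = cluster_updates(a + b)
models = aggregate(clusters)
print(len(models))  # -> 2
```

The similarity threshold plays the role that the paper's connectivity-based analysis plays in ICFL: it decides when clients are heterogeneous enough to deserve separate global models.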