Generator (circuit theory)
Computer science
Power (physics)
Quantum mechanics
Physics
Authors
X. Wang, Tianqing Zhu, Wanlei Zhou
Identifier
DOI:10.1016/j.ins.2024.120437
Abstract
Federated learning is a decentralized learning approach that shows promise for preserving users' privacy by avoiding local data sharing. However, heterogeneous data limits its application in broader settings: data heterogeneity across diverse clients leads to weight divergence between local models and degrades the global performance of federated learning. To mitigate data heterogeneity, supplementing the training data in federated learning has been explored and proven effective, but traditional methods of supplementing data raise privacy concerns and increase learning costs. In this paper, we propose to supplement the training data with a generative model that is transparent to local clients. We keep both the training of the generative model and the storage of its supplementary data on the server side. This avoids collecting auxiliary data directly from local clients, reducing their privacy risks and avoiding additional costs on their side. To prevent loosely coupled learning on the real and synthetic samples, we constrain the optimization of the global model with a distance between the global model being trained and the distribution of the aggregated global model. Extensive experiments verify that the synthetic data from the generative model improve the performance of federated learning, especially in heterogeneous environments.
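The training loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes models are plain weight vectors, uses standard FedAvg (size-weighted averaging) for aggregation, and approximates the paper's distance constraint with a hypothetical L2 proximal term `mu/2 * ||w - w_agg||^2` that keeps the server-side model trained on synthetic data close to the aggregated global model. The function names and the toy gradient are assumptions for illustration only.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate client models by data-size-weighted averaging (FedAvg)."""
    sizes = np.asarray(client_sizes, dtype=float)
    coeffs = sizes / sizes.sum()
    return sum(c * w for c, w in zip(coeffs, client_weights))

def server_finetune(w_agg, synthetic_grad_fn, mu=0.5, lr=0.01, steps=10):
    """Fine-tune the aggregated model on server-held synthetic data.

    synthetic_grad_fn(w) stands in for the gradient of the loss on the
    server's synthetic samples; the term mu * (w - w_agg) is a proximal
    penalty (an assumed stand-in for the paper's distance constraint)
    that keeps the trained model near the aggregated global model.
    """
    w = w_agg.copy()
    for _ in range(steps):
        grad = synthetic_grad_fn(w) + mu * (w - w_agg)
        w -= lr * grad
    return w

# Toy round: three clients with linear-model weights in R^4.
clients = [np.ones(4), 2 * np.ones(4), 4 * np.ones(4)]
sizes = [10, 20, 10]
agg = fedavg(clients, sizes)  # size-weighted mean of client weights

# Hypothetical synthetic-data gradient: gradient of ||w||^2 / 2.
w_new = server_finetune(agg, lambda w: w)
```

Because the proximal term anchors the update to `agg`, the fine-tuned weights move toward the synthetic-data optimum only as far as the distance penalty allows; larger `mu` keeps the result closer to the aggregated model.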