Computer science
Layer (electronics)
Fusion
Distributed computing
Artificial intelligence
Materials science
Nanotechnology
Linguistics
Philosophy
Authors
Jing Jin,Qing Wang,Xiaofeng Liu
Identifier
DOI: 10.1109/iccc57788.2023.10233560
Abstract
Federated Learning (FL) is a distributed machine learning framework that enables clients to train models on local datasets while preserving data privacy. However, FL faces challenges from the complexity and diversity of data sources, computing resources, and client device capabilities, which give rise to data and model heterogeneity. To address this, we propose an FL framework called heterogeneous federated learning with cross-layer model fusion (HFedCMF). In this framework, clients upload their heterogeneous local models to the server, which obtains a new global model using a cross-layer model fusion method that exploits properties of optimal transport. Additionally, we introduce a modified group lasso regularization into local training to decouple the training of personalized models from the global fusion process, enabling clients to achieve superior personalized performance. Experiments on various datasets demonstrate that the proposed HFedCMF outperforms both homogeneous and heterogeneous FL algorithms.
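The abstract does not specify the exact form of the modified group lasso regularizer used in local training. As a rough illustration only, the sketch below computes a standard group lasso penalty (a sum of group-wise L2 norms, each scaled by the square root of the group size) that could be added to a client's local loss; the weight `lam` and the grouping of parameters are hypothetical choices, not taken from the paper.

```python
import numpy as np

def group_lasso_penalty(weights, groups, lam=0.01):
    """Standard group lasso penalty: lam * sum_g sqrt(|g|) * ||w_g||_2.

    weights: 1-D array of model parameters.
    groups:  list of index lists, one per parameter group.
    lam:     regularization strength (hypothetical value).
    """
    total = 0.0
    for g in groups:
        w_g = weights[g]                      # parameters in this group
        total += np.sqrt(len(g)) * np.linalg.norm(w_g)
    return lam * total

# Toy usage: two groups of two parameters each.
w = np.array([1.0, -2.0, 0.5, 3.0])
groups = [[0, 1], [2, 3]]
penalty = group_lasso_penalty(w, groups)
```

Because the group-wise norms are not squared, this penalty drives entire groups of parameters toward zero together, which is one plausible mechanism for separating shared from personalized parameters in local training.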