Computer science
Machine learning
Artificial intelligence
Classifier (UML)
Federated learning
Class (philosophy)
Feature (linguistics)
Cross entropy
Feature learning
Consistency (knowledge bases)
External Data Representation
Entropy (arrow of time)
Data mining
Principle of maximum entropy
Quantum mechanics
Physics
Philosophy
Linguistics
Authors
Hongyan Peng,Tongtong Wu,Zhenkui Shi,Xianxian Li
Identifier
DOI:10.1109/iscc58397.2023.10218040
Abstract
Federated learning (FL) is a scheme that enables multiple participants to cooperatively train a high-performance machine learning model without exporting their data. FL effectively protects the data privacy of all participants and reduces communication costs. However, a key challenge for federated learning is data heterogeneity across clients. In addition, in real FL applications, the class distribution of the data is usually imbalanced. Although much research has been conducted on data heterogeneity, class imbalance usually arises alongside heterogeneous data, degrading the performance of the global model. In this paper, a novel FL method, which we call FedEF, is designed for heterogeneous data and local class imbalance by optimizing the feature extractors and classifiers. FedEF optimizes each client's local feature-extractor representation through contrastive learning, maximizing the consistency between the representations trained by the local client and the central server in order to handle heterogeneous data. Meanwhile, we modify the cross-entropy loss to assign different loss weights to different classes, paying more attention to classes with fewer samples during training, and correct the biased classifier to alleviate class imbalance, thereby improving the performance of the global model. Experiments show that FedEF is an effective solution for FL models trained under data heterogeneity and local class imbalance.
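The two ingredients the abstract describes — a class-weighted cross-entropy loss for local imbalance and a contrastive term that aligns local and global feature representations — can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function names, the inverse-frequency weighting, and the model-contrastive form of the consistency term are assumptions chosen for clarity.

```python
import math

def class_weights(counts):
    # Inverse-frequency weights, normalized so they average to 1.
    # One common choice; the paper's exact weighting scheme is not given here.
    inv = [1.0 / c for c in counts]
    total = sum(inv)
    k = len(counts)
    return [k * w / total for w in inv]

def weighted_cross_entropy(logits, label, weights):
    # Softmax cross-entropy scaled by the weight of the true class,
    # so minority classes contribute more to the local objective.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    log_prob = (logits[label] - m) - math.log(sum(exps))
    return -weights[label] * log_prob

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def contrastive_consistency(z_local, z_global, z_prev, tau=0.5):
    # A model-contrastive-style consistency term: pull the current local
    # representation toward the global model's representation of the same
    # input, push it away from the previous local model's representation.
    pos = math.exp(cosine(z_local, z_global) / tau)
    neg = math.exp(cosine(z_local, z_prev) / tau)
    return -math.log(pos / (pos + neg))
```

For example, with class counts `[100, 10]`, `class_weights` assigns the minority class a much larger weight, and `contrastive_consistency` is small when the local representation agrees with the global one and large when it still matches the stale local model.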