Computer Science
Noise (video)
Distillation
Artificial Intelligence
Chemistry
Chromatography
Image (mathematics)
Authors
Liang Gao,Li Li,Yingwen Chen,Shaojing Fu,D. Wang,Siwei Wang,Chengzhong Xu,Ming Xu
Identifier
DOI:10.1109/tnnls.2025.3546903
Abstract
Federated learning (FL) is a learning paradigm that enables multiple clients to collaboratively train a high-performance model while preserving user privacy. However, the effectiveness of FL relies heavily on accurately labeled data, which can be difficult to obtain in real-world scenarios. To address this issue and robustly train shared models on distributed, noisily labeled data, we propose FedDQ, a noise-robust FL framework built on co-distillation and quality-aware aggregation. FedDQ incorporates two key features: a noise-adaptive training strategy and an efficient label-correction mechanism. The noise-adaptive training strategy estimates the noise level of each client's labels and dynamically adjusts that client's training engagement, mitigating the impact of wrong labels while efficiently exploiting features from clean data. In addition, FedDQ designs a two-head network and employs it for co-distillation; this strategy facilitates knowledge transfer among clients so that they share representational capability. FedDQ further rectifies incorrect labels through a co-filtering and label-correction mechanism. The experimental results demonstrate the effectiveness of FedDQ in improving model performance and handling noisy data in FL settings: on the CIFAR-100 dataset with noisy labels, FedDQ achieves an improvement of up to 32.4% over the baseline method.
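To make the two-head co-distillation and quality-aware aggregation ideas in the abstract concrete, here is a minimal PyTorch-style sketch. It is a hypothetical illustration, not the authors' implementation: the names TwoHeadNet, co_distillation_loss, and quality_aware_average, the toy backbone, and the hyperparameters alpha and T are all assumptions chosen for this example.

```python
# Hypothetical sketch of a two-head network with co-distillation and a
# toy quality-aware aggregation rule. Illustrative only; the FedDQ paper's
# actual architecture, loss, and aggregation weights may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoHeadNet(nn.Module):
    """Shared backbone with two classifier heads whose softened
    predictions can distill knowledge into each other."""
    def __init__(self, num_classes: int = 100):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head_a = nn.Linear(32, num_classes)
        self.head_b = nn.Linear(32, num_classes)

    def forward(self, x):
        feat = self.backbone(x)
        return self.head_a(feat), self.head_b(feat)

def co_distillation_loss(logits_a, logits_b, targets, alpha=0.5, T=2.0):
    """Cross-entropy on the (possibly noisy) labels, plus a symmetric
    KL term so each head learns from the other's softened output."""
    ce = F.cross_entropy(logits_a, targets) + F.cross_entropy(logits_b, targets)
    kl_ab = F.kl_div(F.log_softmax(logits_a / T, dim=1),
                     F.softmax(logits_b.detach() / T, dim=1),
                     reduction="batchmean") * T * T
    kl_ba = F.kl_div(F.log_softmax(logits_b / T, dim=1),
                     F.softmax(logits_a.detach() / T, dim=1),
                     reduction="batchmean") * T * T
    return ce + alpha * (kl_ab + kl_ba)

def quality_aware_average(client_states, weights):
    """Toy quality-aware aggregation: average client state dicts,
    weighted by an estimated label-quality score per client."""
    total = sum(weights)
    return {k: sum(w * s[k] for w, s in zip(weights, client_states)) / total
            for k in client_states[0].keys()}
```

In this sketch, a client with mostly clean labels would receive a larger weight in quality_aware_average, echoing the abstract's idea of adjusting each client's influence by its estimated label noise; the specific weighting rule here is likewise an assumption for illustration.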