Computer Science
Federated Learning
Machine Learning
Distillation
Artificial Intelligence
Domain (mathematical analysis)
Data Mining
Information Privacy
Computer Security
Mathematics
Mathematical Analysis
Organic Chemistry
Chemistry
Authors
Xuan Gong, Abhishek Sharma, Srikrishna Karanam, Ziyan Wu, Terrence Chen, David Doermann, Arun Innanje
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence
[Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2022-06-28
Volume/Issue: 36 (11): 11891-11899
Citations: 2
Identifier
DOI:10.1609/aaai.v36i11.21446
Abstract
Federated Learning (FL) is a machine learning paradigm in which local nodes collaboratively train a central model while the training data remains decentralized. Existing FL methods typically share model parameters or employ co-distillation to address the issue of unbalanced data distribution. However, they suffer from communication bottlenecks and, more importantly, risk privacy leakage. In this work, we develop a privacy-preserving and communication-efficient method in an FL framework with one-shot offline knowledge distillation using unlabeled, cross-domain, non-sensitive public data. We propose a quantized and noisy ensemble of local predictions from completely trained local models for stronger privacy guarantees without sacrificing accuracy. Through extensive experiments on image classification and text classification tasks, we show that our method outperforms baseline FL algorithms in both accuracy and data privacy preservation.
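The aggregation idea in the abstract, an ensemble of local predictions on public data that is perturbed with noise and quantized before distillation, could be sketched roughly as below. This is a minimal illustration only: the function name, the choice of Laplace noise, the bin count, and all parameter values are assumptions for the sketch, not details taken from the paper.

```python
import numpy as np

def noisy_quantized_ensemble(local_logits, num_bins=16, noise_scale=0.1, seed=0):
    """Aggregate per-client predictions on shared public data.

    local_logits: list of (num_samples, num_classes) arrays, one per client.
    Returns soft labels usable as distillation targets for the central model.
    All names and defaults here are illustrative, not from the paper.
    """
    rng = np.random.default_rng(seed)
    # Turn each client's logits into probabilities (numerically stable softmax).
    probs = []
    for z in local_logits:
        z = z - z.max(axis=1, keepdims=True)
        p = np.exp(z)
        p /= p.sum(axis=1, keepdims=True)
        probs.append(p)
    # Ensemble: average the fully trained local models' predictions.
    ensemble = np.mean(probs, axis=0)
    # Add noise so individual client outputs are harder to recover.
    noisy = ensemble + rng.laplace(0.0, noise_scale, ensemble.shape)
    # Quantize to a fixed number of levels in [0, 1], discarding fine detail.
    quantized = np.clip(np.round(noisy * (num_bins - 1)) / (num_bins - 1), 0.0, 1.0)
    # Renormalize each row into a distribution to serve as a soft label.
    return quantized / quantized.sum(axis=1, keepdims=True)
```

Because only these perturbed soft labels on public data leave the clients, in one offline round, the scheme avoids both the repeated parameter exchange of standard FL and the exposure of raw model outputs.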