MNIST database
Computer science
Differential privacy
Laplace transform
Differential (mechanical device)
Convergence (economics)
Set (abstract data type)
Artificial intelligence
Sensitivity (control systems)
Federated learning
Machine learning
Mathematical optimization
Data mining
Deep learning
Mathematics
Electronic engineering
Mathematical analysis
Economic growth
Engineering
Economics
Programming language
Aerospace engineering
Authors
Yipeng Zhou,Runze Wang,Jiahao Liu,Di Wu,Shui Yu,Yonggang Wen
Source
Journal: IEEE Transactions on Dependable and Secure Computing
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-01
Volume/Issue: 1-15
Identifier
DOI:10.1109/tdsc.2023.3325889
Abstract
Although Federated Learning (FL) prevents the exposure of original data samples when collaboratively training machine learning models among decentralized clients, it has been revealed that vanilla FL is still susceptible to adversarial attacks if model parameters are leaked to malicious attackers. To enhance the protection level of FL, Differential Private Federated Learning (DPFL) has been proposed in recent years. DPFL injects zero-mean noises randomly generated by differentially private (DP) mechanisms into local model parameters before they are disclosed. Nevertheless, DP noises can significantly deteriorate model utility, jeopardizing the practicality of DPFL. In this paper, we are among the first to explore how to improve the model utility of DPFL by tuning the number of local iterations (LIs) on DPFL clients. Our work shows that such a local iteration tuning approach effectively mitigates the adverse influence of DP noises on the final model utility. Formally, we derive the sensitivity (a measure of the maximum change of the output given two adjacent inputs) with respect to the number of LIs conducted on DPFL clients for the Laplace mechanism, and the aggregated variances of Laplace noises at the server side. We further conduct convergence rate analysis to quantify the influence of the Laplace noises on the final model accuracy and determine how to optimally set the number of LIs. Finally, to verify our theoretical findings, we perform extensive experiments using three real-world datasets, namely, Lending Club, MNIST and Fashion-MNIST. The results not only corroborate our analysis, but also demonstrate that our approach significantly improves the practicality of DPFL.
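To make the abstract's core mechanism concrete, the following is a minimal illustrative sketch of how a DPFL client might clip a local update and add Laplace noise before disclosing it. The function name `laplace_perturb`, the L2-clipping step, and the assumption that sensitivity grows linearly with the clipping bound and the number of local iterations are all hypothetical simplifications for illustration, not the paper's actual derivation.

```python
import numpy as np

def laplace_perturb(params, clip_bound, num_local_iters, epsilon, rng=None):
    """Clip a local model update and inject zero-mean Laplace noise.

    Illustrative sketch only: the sensitivity is assumed to scale with the
    clipping bound and the number of local iterations (LIs); the paper
    derives its own sensitivity bound for the Laplace mechanism.
    """
    rng = rng or np.random.default_rng()
    # L2-clip the update so its norm is at most clip_bound.
    norm = np.linalg.norm(params)
    clipped = params * min(1.0, clip_bound / max(norm, 1e-12))
    # Assumed sensitivity form: proportional to clip bound and LI count.
    sensitivity = 2.0 * clip_bound * num_local_iters
    # Laplace mechanism: scale b = sensitivity / epsilon.
    scale = sensitivity / epsilon
    noise = rng.laplace(loc=0.0, scale=scale, size=params.shape)
    return clipped + noise
```

The trade-off the paper analyzes is visible here: more local iterations raise the sensitivity and hence the noise scale, while fewer local iterations slow convergence, which is why an optimal LI count exists.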