Computer science
Differential privacy
Stochastic gradient descent
Transmission (telecommunications)
Gaussian distribution
Information privacy
Variable (mathematics)
Data mining
Machine learning
Computer security
Telecommunications
Mathematics
Mathematical analysis
Physics
Quantum mechanics
Artificial neural network
Authors
Muah Kim,Onur Günlü,Rafael F. Schaefer
Identifier
DOI:10.1109/icassp39728.2021.9413764
Abstract
Federated learning (FL) enables private training on massive amounts of data owing to its decentralized structure. Stochastic gradient descent (SGD) is commonly used for FL because of its good empirical performance, but sensitive user information can still be inferred from the weight updates shared during FL iterations. We consider Gaussian mechanisms to preserve the local differential privacy (LDP) of user data in the FL model with SGD. The trade-offs between user privacy, global utility, and transmission rate are proved by defining appropriate metrics for FL with LDP. Compared to existing results, the query sensitivity used in LDP is defined as a variable, and a tighter privacy accounting method is applied. The proposed utility bound allows heterogeneous parameters over all users. Our bounds characterize how much utility decreases and how much the transmission rate increases if a stronger privacy regime is targeted. Furthermore, given a target privacy level, our results guarantee a significantly larger utility and a smaller transmission rate compared to existing privacy accounting methods.
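To illustrate the idea the abstract describes, a minimal sketch of a Gaussian mechanism applied to a local SGD update is given below. This is a hypothetical illustration, not the paper's method: the function name, the fixed clipping threshold, and the classic (epsilon, delta) noise calibration `sigma = C * sqrt(2 ln(1.25/delta)) / epsilon` are assumptions; the paper instead treats the query sensitivity as a variable and uses a tighter privacy accounting.

```python
import numpy as np

def ldp_gaussian_update(grad, clip_norm, epsilon, delta, rng):
    """Hypothetical sketch: clip a user's local gradient so the query
    sensitivity is bounded by clip_norm, then add Gaussian noise
    calibrated for (epsilon, delta)-LDP before it leaves the device."""
    # Clipping bounds the L2 sensitivity of the shared update by clip_norm.
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / max(norm, 1e-12))
    # Classic Gaussian-mechanism noise scale (looser than the paper's bound).
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(0.0, sigma, size=grad.shape)

rng = np.random.default_rng(0)
local_grad = rng.normal(size=10)          # one user's raw gradient
noisy_update = ldp_gaussian_update(local_grad, clip_norm=1.0,
                                   epsilon=1.0, delta=1e-5, rng=rng)
```

Only `noisy_update` would be transmitted to the server; the privacy–utility–rate trade-off the paper analyzes arises because larger noise (smaller epsilon) degrades the aggregated model and changes what must be communicated.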