Keywords
Computer Science, Big Data, Information Privacy, Analytics, Data Science, Federated Learning, Cloud Computing, Differential Privacy, Internet of Things, Predictive Analytics, Data Analysis
Authors
Jiale Zhang, Bing Chen, Shui Yu, Hai Deng
Source
Venue: Global Communications Conference (GLOBECOM)
Date: 2019-12-01
Pages: 1-6
Citations: 12
Identifier
DOI:10.1109/globecom38437.2019.9014272
Abstract
Federated learning has emerged as a promising solution for big data analytics, in which a global model is jointly trained across multiple mobile devices. However, participants' sensitive data may be leaked to an untrusted server through the uploaded gradient vectors. To address this problem, we propose a privacy-enhanced federated learning (PEFL) scheme that protects the gradients against an untrusted server. This is mainly enabled by encrypting participants' local gradients with the Paillier homomorphic cryptosystem. To reduce the computation costs of the cryptosystem, we employ the distributed selective stochastic gradient descent (DSSGD) method in the local training phase to achieve distributed encryption. Moreover, the encrypted gradients can be further used for secure sum aggregation on the server side. In this way, the untrusted server learns only the aggregated statistics over all participants' updates, while each individual's private information remains well protected. For the security analysis, we theoretically prove that our scheme is secure under several cryptographic hardness assumptions. Extensive experimental results demonstrate that PEFL has low computation costs while reaching high accuracy in federated learning settings.
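The core mechanism in the abstract is Paillier's additive homomorphism: the server multiplies ciphertexts of individual gradients and obtains an encryption of their sum, so it never sees any single participant's update. The sketch below is not the paper's implementation; it is a minimal textbook Paillier with toy key sizes and a hypothetical fixed-point encoding for float gradients, purely to illustrate the secure-sum idea.

```python
import math
import random

class Paillier:
    """Minimal textbook Paillier cryptosystem (toy primes; illustration only)."""

    def __init__(self, p, q):
        self.n = p * q
        self.n2 = self.n * self.n
        self.g = self.n + 1                        # standard simplification g = n + 1
        self.lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)  # lcm(p-1, q-1)
        self.mu = pow(self.lam, -1, self.n)        # lambda^{-1} mod n (Python 3.8+)

    def encrypt(self, m):
        # c = g^m * r^n mod n^2, with r random and coprime to n
        r = random.randrange(1, self.n)
        while math.gcd(r, self.n) != 1:
            r = random.randrange(1, self.n)
        return (pow(self.g, m, self.n2) * pow(r, self.n, self.n2)) % self.n2

    def decrypt(self, c):
        # m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n
        x = pow(c, self.lam, self.n2)
        return ((x - 1) // self.n) * self.mu % self.n

SCALE = 10**6  # fixed-point scaling so float gradients become integers

def encode(x, n):
    return round(x * SCALE) % n        # negatives wrap into the upper half of Z_n

def decode(m, n):
    if m > n // 2:                     # values above n/2 decode as negative
        m -= n
    return m / SCALE

# Toy primes for readability; real deployments need >= 1024-bit primes.
pk = Paillier(10007, 10009)

# Each participant encrypts one local gradient component.
grads = [0.25, -0.10, 0.05]
cts = [pk.encrypt(encode(g, pk.n)) for g in grads]

# The server multiplies ciphertexts: the product decrypts to the SUM of the
# plaintexts (additive homomorphism), never revealing individual gradients.
agg = 1
for c in cts:
    agg = (agg * c) % pk.n2

print(decode(pk.decrypt(agg), pk.n))  # -> 0.2, the sum 0.25 - 0.10 + 0.05
```

In the scheme described above, DSSGD would additionally let each client encrypt only a selected fraction of its gradient vector, shrinking the number of (expensive) Paillier operations per round.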