Computer science
Differential privacy
Stochastic gradient descent
Encryption
Overhead (engineering)
Stochastic optimization
Obfuscation
Optimization problem
Information privacy
Benchmark (surveying)
Mathematical optimization
Artificial intelligence
Data mining
Algorithm
Computer security
Mathematics
Artificial neural network
Operating system
Geography
Geodesy
Authors
Yongqiang Wang, H. Vincent Poor
Identifier
DOI: 10.1109/tac.2022.3174187
Abstract
Decentralized stochastic optimization is a basic building block of modern collaborative machine learning, distributed estimation and control, and large-scale sensing. Since the data involved usually contain sensitive information such as user locations, healthcare records, and financial transactions, privacy protection has become an increasingly pressing need in the implementation of decentralized stochastic optimization algorithms. In this article, we propose a decentralized stochastic gradient descent (SGD) algorithm with inherent privacy protection for every participating agent against both other participating agents and external eavesdroppers. The proposed algorithm builds in a dynamics-based gradient-obfuscation mechanism that enables privacy protection without compromising optimization accuracy, which differs significantly from differential-privacy-based solutions for decentralized optimization, which must trade optimization accuracy for privacy. The dynamics-based privacy approach is encryption-free and hence avoids the heavy communication and computation overhead common to encryption-based privacy solutions for decentralized stochastic optimization. Besides rigorously characterizing the convergence performance of the proposed decentralized SGD algorithm under both convex and nonconvex objective functions, we also provide a rigorous information-theoretic analysis of the strength of its privacy protection. Simulation results for a distributed estimation problem, as well as numerical experiments for decentralized learning on a benchmark machine learning dataset, confirm the effectiveness of the proposed approach.
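To make the general setting concrete, below is a minimal Python sketch of decentralized SGD in which each agent masks the iterate it shares with its neighbors using a geometrically decaying random perturbation, so early communications (where raw states would leak the most about local gradients) are obscured while the decay leaves the limit point unchanged. The ring mixing matrix W, the quadratic local losses, and the decay schedule are illustrative assumptions for this sketch, not the paper's actual dynamics-based obfuscation mechanism or its guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, n_iters = 5, 2, 2000

# Local quadratic objectives f_i(x) = 0.5 * ||x - b_i||^2 (stand-ins for
# arbitrary local losses); the global optimum is the mean of the b_i.
b = rng.normal(size=(n_agents, dim))
x = rng.normal(size=(n_agents, dim))  # each agent's local iterate

# Doubly stochastic mixing matrix for a ring graph.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

for k in range(1, n_iters + 1):
    step = 1.0 / k            # diminishing step size
    noise_scale = 0.95 ** k   # geometrically decaying mask (illustrative)

    # Each agent broadcasts a masked copy of its state instead of the raw one.
    masked = x + noise_scale * rng.normal(size=x.shape)

    # Stochastic local gradients (exact gradient plus sampling noise).
    grad = x - b + 0.1 * rng.normal(size=x.shape)

    # Consensus step on the masked states, followed by a local SGD step.
    x = W @ masked - step * grad

print("final iterates:", x.round(3))
print("true optimum:  ", b.mean(axis=0).round(3))
```

Because the injected mask decays geometrically, its cumulative effect is summable and the agents still reach consensus on the minimizer of the average loss; this illustrates, under the stated assumptions, how perturbing shared states need not cost accuracy, whereas adding persistent differential-privacy noise would.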