Keywords
Differential privacy, Computer science, Data mining, Gaussian distribution, Class (philosophy), Gaussian noise, Power consumption, Noise (video), Information privacy, Algorithm, Power (physics), Artificial intelligence, Computer security, Quantum mechanics, Image (mathematics), Physics
Authors
Raksha Ramakrishna, Anna Scaglione, Tong Wu, Nikhil Ravi, Sean Peisert
Identifier
DOI:10.1109/tifs.2023.3289128
Abstract
In this paper, we present a notion of differential privacy (DP) for data that comes from different classes. Here, class membership is private information that needs to be protected. The proposed method is an output perturbation mechanism that adds noise to the release of a query response such that the analyst is unable to infer the underlying class label. The proposed DP method not only protects the privacy of class-based data but also meets quality metrics of accuracy, and it is computationally efficient and practical. We illustrate the efficacy of the proposed method empirically, outperforming the baseline additive Gaussian noise mechanism. We also examine a real-world application and apply the proposed DP method to the autoregressive moving average (ARMA) forecasting method, protecting the privacy of the underlying data source. Case studies on real-world advanced metering infrastructure (AMI) measurements of household power consumption validate the excellent performance of the proposed DP method while also maintaining the accuracy of forecasted power consumption measurements.
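The abstract compares the proposed mechanism against a baseline additive Gaussian noise mechanism. As a point of reference only, the sketch below shows what such a baseline looks like: a query response (here, a hypothetical mean of bounded household power readings) released with Gaussian noise calibrated by the classic analytic bound. The function name, the reading values, and the sensitivity bound are illustrative assumptions, not the paper's method.

```python
import numpy as np

def gaussian_mechanism(query_value, sensitivity, epsilon, delta, rng=None):
    """Release a query response with additive Gaussian noise.

    Noise scale follows the classic bound
        sigma = sensitivity * sqrt(2 * ln(1.25 / delta)) / epsilon,
    which yields (epsilon, delta)-DP for epsilon in (0, 1).
    """
    if rng is None:
        rng = np.random.default_rng()
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return query_value + rng.normal(0.0, sigma, size=np.shape(query_value))

# Hypothetical example: privatize a mean power-consumption query.
readings = np.array([1.2, 0.8, 1.5, 0.9])     # assumed kWh values
true_mean = readings.mean()
# Sensitivity of the mean when each reading is bounded in [0, 2] kWh.
sensitivity = 2.0 / len(readings)
noisy_mean = gaussian_mechanism(true_mean, sensitivity,
                                epsilon=0.5, delta=1e-5)
```

The paper's contribution is a different, class-aware output perturbation mechanism that is reported to outperform this baseline in accuracy at comparable privacy levels.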