Computer science
Differential privacy
Bayesian optimization
Information sensitivity
Bayesian probability
Noise (signal processing)
Data mining
Optimization problem
Function (mathematics)
Mathematical optimization
Algorithm
Computer security
Artificial intelligence
Mathematics
Evolutionary computation
Biology
Image (mathematics)
Authors
Qiqi Liu, Yuping Yan, Yaochu Jin
Identifier
DOI:10.1016/j.ins.2023.119739
Abstract
Conventional Bayesian optimization approaches assume that all available data are located on a single device, which overlooks privacy concerns, since data storage and transmission may pose threats to data security. Existing differential privacy-based approaches can protect sensitive information by adding well-calibrated noise to the real objective value of the query input, but this may seriously degrade the performance of Bayesian optimization. To address this issue, we propose to learn the noise level of each solution, rather than only that of the newly infilled solutions, by optimizing a utility-privacy function that accounts for both obfuscating the information about the current best solution and striking a balance between exploration and exploitation. In this way, the real objective values and the current best solution are protected. We further extend the proposed approach to a federated setting with multiple clients. Our experimental results show that the proposed algorithm achieves very competitive optimization performance on ten test functions while preserving data privacy. In addition, at the lowest level of privacy protection, the current best solution is leaked in fewer than 5 out of 91 rounds of surrogate updates for the proposed algorithm, significantly fewer than for the algorithm under comparison.
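The abstract contrasts the proposed method with existing differential-privacy approaches that add calibrated noise to the real objective value of each query before the surrogate model is updated. The sketch below illustrates only that baseline idea, not the authors' utility-privacy noise-learning scheme: a minimal Bayesian optimization loop in which every observed objective value is perturbed by the Laplace mechanism before it reaches the Gaussian-process surrogate. The test function, sensitivity bound, privacy budget epsilon, RBF kernel, and lower-confidence-bound acquisition are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of differentially private Bayesian optimization (baseline idea
# described in the abstract): the surrogate only ever sees Laplace-perturbed
# objective values. All constants below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):
    # Hypothetical 1-D test function to be minimized.
    return np.sin(3.0 * x) + 0.5 * x ** 2

def laplace_mechanism(value, sensitivity, epsilon):
    # Standard Laplace mechanism: noise scale = sensitivity / epsilon.
    return value + rng.laplace(scale=sensitivity / epsilon)

def rbf_kernel(a, b, length_scale=0.3):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise_var=1e-4):
    # Exact GP regression posterior mean/std with an RBF kernel.
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_test)
    Kss = rbf_kernel(x_test, x_test)
    K_inv = np.linalg.inv(K)
    mu = Ks.T @ K_inv @ y_train
    var = np.diag(Kss - Ks.T @ K_inv @ Ks)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def lower_confidence_bound(mu, std, beta=2.0):
    # Acquisition for minimization: trade off exploitation (mu) and exploration (std).
    return mu - beta * std

epsilon = 1.0        # assumed per-query privacy budget
sensitivity = 1.0    # assumed bound on how much one record can change f(x)
x_obs = rng.uniform(-2.0, 2.0, size=5)
y_obs = np.array([laplace_mechanism(objective(x), sensitivity, epsilon) for x in x_obs])

candidates = np.linspace(-2.0, 2.0, 401)
for _ in range(20):
    mu, std = gp_posterior(x_obs, y_obs, candidates)
    x_next = candidates[np.argmin(lower_confidence_bound(mu, std))]
    y_next = laplace_mechanism(objective(x_next), sensitivity, epsilon)
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, y_next)

print("best (noisy) observation at x =", x_obs[np.argmin(y_obs)])
```

Because the surrogate never sees exact objective values, per-query privacy follows directly from the Laplace mechanism; however, as the abstract notes, this injected noise can seriously degrade optimization performance, which is the trade-off the proposed per-solution noise-level learning is designed to soften.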