Resilience (materials science)
Verifiable secret sharing
Dropout (neural networks)
Computer security
Computer science
Internet privacy
Materials science
Set (abstract data type)
Machine learning
Composite material
Programming language
Authors
Deepti Saraswat, Manik Lal Das, Sudeep Tanwar
Abstract
Federated learning (FL) is a decentralized machine learning approach in which multiple devices collaboratively train a global model without directly sharing their raw data. This method utilizes local computational resources while depending on a central server for coordination. Although FL enhances efficiency in edge computing, it is susceptible to adversarial attacks: a compromised aggregator can degrade model performance through data or model poisoning. To address these risks, FL must safeguard the confidentiality and integrity of local updates while ensuring their authenticity before aggregation. This work introduces VeriProd, a verifiable, privacy-preserving FL framework designed for practical applications that addresses the challenge of implementing secure aggregation at scale. The framework preserves user privacy while ensuring the integrity and verifiability of local gradients. The proposed aggregator encryption method securely masks users' local gradients, allowing the aggregator to aggregate them without revealing individual data, while users can verify the correctness of the aggregated results. Additionally, the framework incorporates a group management mechanism to handle user dropouts, ensuring that dropped users can seamlessly rejoin future learning rounds without disruption. Through comprehensive analysis and experimental evaluation, we demonstrate the framework's security, robustness, and efficiency. The results indicate that VeriProd achieves accuracy comparable to FedAvg while keeping communication and computation costs low, outperforming well-established privacy-preserving verifiable FL schemes.
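The abstract's core idea, masking local gradients so the aggregator learns only their sum, can be illustrated with a classic pairwise additive-masking scheme. This is a minimal sketch of that general technique, not VeriProd's actual encryption or verification protocol (which the abstract does not specify): each pair of users shares a random mask that one user adds and the other subtracts, so all masks cancel in the aggregate. All function names here are hypothetical.

```python
import random

def pairwise_masks(n_users, dim, seed=0):
    # One shared random mask vector per unordered user pair (i, j), i < j.
    # In a real protocol these would come from a key agreement, not a seed.
    rng = random.Random(seed)
    return {(i, j): [rng.randint(-100, 100) for _ in range(dim)]
            for i in range(n_users) for j in range(i + 1, n_users)}

def mask_gradient(user, grad, masks, n_users):
    # User i ADDS the mask shared with each j > i and SUBTRACTS the mask
    # shared with each j < i, so every mask appears once with each sign.
    out = list(grad)
    for j in range(n_users):
        if j == user:
            continue
        lo, hi = min(user, j), max(user, j)
        sign = 1 if user == lo else -1
        for d, m in enumerate(masks[(lo, hi)]):
            out[d] += sign * m
    return out

# Toy round with three users and 2-dimensional gradients.
grads = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
masks = pairwise_masks(n_users=3, dim=2)
masked = [mask_gradient(u, g, masks, 3) for u, g in enumerate(grads)]

# The aggregator sums only the masked vectors; the masks cancel exactly,
# yielding the true gradient sum [9.0, 12.0] without exposing any single grad.
agg = [sum(col) for col in zip(*masked)]
```

Dropout handling, which the abstract addresses via group management, is the hard part of such schemes in practice: if a user leaves after masking, its unmatched pairwise masks no longer cancel and must be recoverable (e.g., via secret sharing) or the group must be re-formed for the next round.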