Computer Science
Cloud Computing
Information Privacy
Internet Privacy
Computer Security
Computer Networks
Operating Systems
Authors
Jinhao Zhou, Zhou Su, Yuntao Wang, Yanghe Pan, Qianqian Pan, Lizheng Liu, Jun Wu
Identifier
DOI:10.1109/smartcloud62736.2024.00011
Abstract
Federated graph attention networks (FGATs), blending federated learning (FL) with graph attention networks (GATs), present a novel paradigm for collaborative, privacy-conscious graph model training in the smart cloud. FGATs leverage distributed attention mechanisms to enhance graph feature prioritization, improving representation learning while preserving data decentralization. Despite these advances, FGATs face privacy concerns such as attribute inference. Our study proposes an efficient privacy-preserving FGAT (PFGAT). We devise an improved multiplication triplet (IMT)-based attention mechanism combined with a hybrid differential privacy (DP) approach. We further introduce a novel triplet generation method and a hybrid neighbor aggregation algorithm, designed to respect the distinct traits of neighbor nodes, which together secure GAT node embeddings efficiently. Evaluations on benchmarks such as Cora, Citeseer, and Pubmed demonstrate PFGAT's ability to safeguard privacy without compromising efficiency or performance.
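The abstract does not detail the improved multiplication triplet (IMT) construction, but the classic building block it improves on is the Beaver multiplication triplet: parties hold additive secret shares of random values (a, b, c) with c = a·b, which lets them multiply secret-shared inputs (such as the feature products inside a GAT attention score) without revealing them. The following is a minimal two-party sketch of standard Beaver-triplet multiplication over a prime field; all names, the modulus choice, and the trusted triplet dealer are illustrative assumptions, not the paper's IMT method.

```python
import random

P = 2**61 - 1  # illustrative prime modulus for additive secret sharing


def share(v):
    """Split v into two additive shares modulo P."""
    r = random.randrange(P)
    return r, (v - r) % P


def beaver_triplet():
    """Trusted-dealer triplet: shares of random a, b and of c = a*b."""
    a, b = random.randrange(P), random.randrange(P)
    c = (a * b) % P
    return share(a), share(b), share(c)


def secure_mul(x_sh, y_sh):
    """Multiply secret-shared x and y; only the masked values d, e are opened."""
    (a0, a1), (b0, b1), (c0, c1) = beaver_triplet()
    # Each party locally masks its share; d = x - a and e = y - b are public.
    d = (x_sh[0] - a0 + x_sh[1] - a1) % P
    e = (y_sh[0] - b0 + y_sh[1] - b1) % P
    # Shares of x*y = c + d*b + e*a + d*e (the constant d*e goes to party 0).
    z0 = (c0 + d * b0 + e * a0 + d * e) % P
    z1 = (c1 + d * b1 + e * a1) % P
    return z0, z1


x_sh, y_sh = share(6), share(7)
z0, z1 = secure_mul(x_sh, y_sh)
product = (z0 + z1) % P  # reconstruct: 6 * 7 = 42
```

The opened values d and e are uniformly random masks, so they leak nothing about x or y; this is the property that lets attention coefficients be computed over protected node embeddings.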