Computer science
Federated learning
Recommender system
Inference
Collaborative filtering
Similarity (geometry)
Matrix factorization
Feature (linguistics)
Convergence (economics)
Data mining
Artificial intelligence
Machine learning
Information retrieval
Philosophy
Eigenvector
Economics
Physics
Image (mathematics)
Quantum mechanics
Economic growth
Linguistics
Authors
Xuanang Ding,Guohui Li,Ling Yuan,Lu Zhang,Qian Rong
Identifier
DOI:10.1016/j.ipm.2023.103470
Abstract
Previous federated recommender systems are based on traditional matrix factorization, which can improve personalized service but is vulnerable to gradient inference attacks. Most of them adopt model averaging to fit the data heterogeneity of federated recommender systems, incurring higher training costs. To address both privacy and efficiency, we propose an efficient federated item similarity model for heterogeneous recommendation, called FedIS, which can train a global item-based collaborative filtering model to eliminate user feature dependencies. Specifically, we extend the neural item similarity model to the federated setting, where each client only locally optimizes the shared item feature matrix. We then propose a fast-convergent federated aggregation method, inspired by meta-learning, to handle heterogeneous user updates and accelerate the convergence of global training. Furthermore, we propose a two-stage perturbation method to protect both local training and transmission while reducing communication costs. Finally, extensive experiments on four real-world datasets validate that FedIS provides competitive performance on federated recommendation. Our proposed method also shows significantly higher training efficiency with little performance degradation.
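To make the federated setup described in the abstract concrete, the following is a minimal sketch (not the paper's actual FedIS algorithm) of the general pattern it builds on: each client locally updates a shared item embedding matrix on its own interactions, and a server aggregates the client updates by weighted averaging, FedAvg-style. The local objective here, pulling embeddings of co-interacted items together, is a hypothetical stand-in for the paper's neural item similarity model.

```python
import numpy as np

def local_update(item_emb, pairs, lr=0.1):
    """One client's local pass over the shared item embedding matrix.

    `pairs` lists item pairs the user co-interacted with; each step is a
    gradient step on 0.5 * ||e_i - e_j||^2, pulling the two embeddings
    together (an illustrative item-similarity objective, not FedIS's).
    """
    emb = item_emb.copy()
    for i, j in pairs:
        diff = emb[i] - emb[j]
        emb[i] -= lr * diff
        emb[j] += lr * diff
    return emb

def aggregate(updates, weights):
    """Server-side weighted average of client updates (FedAvg-style)."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return sum(wi * ui for wi, ui in zip(w, updates))

rng = np.random.default_rng(0)
item_emb = rng.normal(size=(5, 4))       # 5 items, 4-dim embeddings
clients = [[(0, 1), (1, 2)], [(0, 2)]]   # each client's co-interaction pairs
updates = [local_update(item_emb, c) for c in clients]
new_emb = aggregate(updates, weights=[len(c) for c in clients])
```

Note that the raw per-user interactions never leave the client, only updated item embeddings do; the paper's two-stage perturbation addresses the remaining leakage risk in those transmitted updates, which this sketch does not model.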