Authors
Xuecheng Wu, Feihu Huang, Zhengmian Hu, Heng Huang
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence [Association for the Advancement of Artificial Intelligence (AAAI)]
Date: 2023-06-26
Volume/Issue: 37 (9): 10379-10387
Citations: 2
Identifier
DOI: 10.1609/aaai.v37i9.26235
Abstract
Federated learning has attracted increasing attention with the emergence of distributed data. While extensive federated learning algorithms have been proposed for non-convex distributed problems, federated learning in practice still faces numerous challenges, such as the many training iterations needed to converge as model and dataset sizes keep increasing, and the lack of adaptivity in SGD-based model updates. Meanwhile, the study of adaptive methods in federated learning is scarce, and existing works either lack a complete theoretical convergence guarantee or have suboptimal sample complexity. In this paper, we propose an efficient adaptive algorithm (i.e., FAFED) based on the momentum-based variance-reduction technique in cross-silo FL. We first explore how to design an adaptive algorithm in the FL setting. By providing a counter-example, we prove that a simple combination of FL and adaptive methods can lead to divergence. More importantly, we provide a convergence analysis for our method and prove that our algorithm is the first adaptive FL algorithm to reach the best-known O(ε⁻³) sample complexity and O(ε⁻²) communication rounds to find an ε-stationary point without large batches. Experimental results on a language modeling task and an image classification task with heterogeneous data demonstrate the efficiency of our algorithm.
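The abstract names two ingredients: a momentum-based variance-reduced (STORM-style) gradient estimator maintained per client, and an adaptive step rule replacing plain SGD. The sketch below is a hypothetical toy combination of those two ideas, not the paper's actual FAFED update: the clients, the `QuadClient` class, and the AdaGrad-style adaptive denominator are all my own illustrative assumptions.

```python
import numpy as np

def storm_adaptive_fl(clients, x0, rounds=500, eta=0.5, alpha=0.5, eps=1e-8):
    """Toy sketch: each client keeps a STORM-style estimator
        m_i <- g_i(x_t) + (1 - alpha) * (m_i - g_i(x_{t-1})),
    the server averages the estimators and takes an adaptive
    (AdaGrad-style, coordinate-wise) step. Illustrative only."""
    x, prev_x = x0.astype(float).copy(), x0.astype(float).copy()
    m = [c.grad(x) for c in clients]   # initialize estimators with full gradients
    v = np.zeros_like(x)               # accumulated squared aggregated gradients
    for _ in range(rounds):
        # local momentum-based variance-reduced estimator update
        for i, c in enumerate(clients):
            m[i] = c.grad(x) + (1.0 - alpha) * (m[i] - c.grad(prev_x))
        d = np.mean(m, axis=0)         # server aggregates client estimators
        v += d * d                     # second-moment accumulator
        prev_x = x.copy()
        x = x - eta * d / (np.sqrt(v) + eps)  # adaptive coordinate-wise step
    return x

class QuadClient:
    """Hypothetical client holding loss f_i(x) = 0.5 * ||x - b_i||^2."""
    def __init__(self, b):
        self.b = np.asarray(b, dtype=float)
    def grad(self, x):
        return x - self.b

# The minimizer of the averaged quadratic losses is the mean of the b_i.
clients = [QuadClient([1.0, 0.0]), QuadClient([3.0, 2.0])]
x_star = storm_adaptive_fl(clients, np.zeros(2))
```

With deterministic gradients the estimator `m[i]` stays exactly equal to the current local gradient, so the sketch only exercises the update structure; the variance-reduction term matters when `c.grad` is a noisy minibatch gradient.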