Computer science
Convergence (economics)
Federated learning
Rate of convergence
Resource (disambiguation)
Key (lock)
Distributed computing
Mathematical optimization
Artificial intelligence
Mathematics
Computer network
Computer security
Economic growth
Economics
Authors
Yangyang Wang, Xiao Zhang, Mingyi Li, Tian Lan, Huashan Chen, Hui Xiong, Xiuzhen Cheng, Dongxiao Yu
Identifier
DOI:10.1145/3580305.3599521
Abstract
In this paper, we propose an adaptive learning paradigm for resource-constrained cross-device federated learning, in which heterogeneous local submodels with varying resources can be jointly trained to produce a global model. Different from existing studies, the submodel structures of different clients are formed by arbitrarily assigned neurons according to their local resources. Along this line, we first design a general resource-adaptive federated learning algorithm, namely RA-Fed, and rigorously prove its convergence with an asymptotically optimal rate O(1/√(Γ*TQ)) under loose assumptions. Furthermore, to address both submodel heterogeneity and data heterogeneity under non-uniform training, we propose a new server aggregation mechanism, RAM-Fed, with the same theoretically proven convergence rate. Moreover, we shed light on several key factors impacting convergence, such as the minimum coverage rate, the data heterogeneity level, and submodel-induced noise. Finally, we conduct extensive experiments on two types of tasks with three widely used datasets under different experimental settings. Compared with state-of-the-art methods, our methods improve accuracy by up to 10% on average. In particular, when submodels are jointly trained with 50% of the parameters, RAM-Fed achieves accuracy comparable to FedAvg trained with the full model.
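The abstract describes clients that train only resource-dependent submodels of a shared global model, with the server aggregating each parameter over the clients that actually cover it. The sketch below is a minimal, hypothetical illustration of one such masked-coverage aggregation round; it is not the paper's exact RA-Fed/RAM-Fed update rule, and the budgets, learning rate, and the random "gradient" stand-in are assumptions for demonstration only.

```python
import numpy as np

# Hypothetical sketch (not the paper's exact RA-Fed/RAM-Fed rule):
# each client trains only the neurons covered by its resource-dependent mask,
# and the server averages each parameter over the clients that cover it.

rng = np.random.default_rng(0)
d = 10                             # size of the global model (number of neurons)
num_clients = 4
budgets = [0.5, 0.5, 0.75, 1.0]    # assumed per-client resource ratios

global_w = rng.normal(size=d)

# Assign each client an arbitrary subset of neurons matching its budget.
masks = []
for b in budgets:
    idx = rng.choice(d, size=int(b * d), replace=False)
    m = np.zeros(d)
    m[idx] = 1.0
    masks.append(m)

# One communication round: each client updates only its own submodel.
client_updates = []
for m in masks:
    local_w = global_w * m                 # extract the submodel
    grad = rng.normal(size=d) * m          # stand-in for local training on the submodel
    client_updates.append((local_w - 0.1 * grad) * m)

# Coverage-weighted aggregation: average each coordinate over covering clients only.
coverage = np.sum(masks, axis=0)
stacked = np.sum(client_updates, axis=0)
covered = coverage > 0
new_global = global_w.copy()
new_global[covered] = stacked[covered] / coverage[covered]

print("minimum coverage rate:", coverage.min() / num_clients)
print("updated global model:", np.round(new_global, 3))
```

The minimum coverage rate printed at the end corresponds to the convergence factor highlighted in the abstract: coordinates covered by fewer clients receive noisier aggregated updates.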