Computer science
Selection (genetic algorithm)
Federated learning
Machine learning
Artificial intelligence
Distributed computing
Authors
Qingming Li,Xiaohang Li,Li Zhou,Xiaoran Yan
Identifier
DOI:10.1109/icassp48485.2024.10447356
Abstract
Federated learning is a collaborative machine learning framework in which multiple clients jointly train a global model. To mitigate communication overhead, it is common to select a subset of clients to participate in each training round. However, existing client selection strategies often rely on a fixed number of clients throughout all rounds, which may not be optimal for balancing training efficiency and model performance. Moreover, these approaches typically evaluate clients solely on their performance in a single round, neglecting their historical records and potentially introducing randomness into the global model. In this work, we introduce AdaFL, a novel approach to client selection and contribution evaluation for efficient federated learning. AdaFL dynamically adjusts the number of clients to be selected using a piecewise function: it starts with a small selection size to reduce communication overhead and progressively increases it to enhance model generalization. Furthermore, AdaFL evaluates each client's contribution by combining its performance metrics from the current and historical rounds through a weighted average, with a weight parameter tuning the trade-off between current and historical data. Experimental results show that AdaFL outperforms prior works in improving test accuracy and reducing training runtime.
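The abstract names two mechanisms: a piecewise schedule for the number of selected clients and a weighted average of current and historical performance for scoring contributions. Below is a minimal Python sketch of what such mechanisms could look like. The function names and parameters (`n_min`, `n_max`, `switch_frac`, `alpha`), the exact piecewise shape, and the per-round scoring metric are all illustrative assumptions; the abstract does not give the paper's concrete formulas.

```python
import random

def selection_size(round_idx, total_rounds, n_min=5, n_max=50, switch_frac=0.5):
    """Piecewise selection-size schedule (assumed shape): hold a small fixed
    size early to save communication, then ramp linearly toward n_max."""
    switch_round = int(switch_frac * total_rounds)
    if round_idx < switch_round:
        return n_min
    progress = (round_idx - switch_round) / max(1, total_rounds - 1 - switch_round)
    return round(n_min + progress * (n_max - n_min))

def update_contribution(history, client_id, current_score, alpha=0.7):
    """Weighted average of the current-round score and the historical record;
    alpha tunes the current/history trade-off (assumed form)."""
    prev = history.get(client_id, current_score)
    history[client_id] = alpha * current_score + (1 - alpha) * prev
    return history[client_id]

def select_clients(history, all_clients, k):
    """Pick the k clients with the highest accumulated contribution scores.
    Unseen clients get an optimistic prior of 1.0 so each is tried at least
    once (a design choice of this sketch, not taken from the paper)."""
    ranked = sorted(all_clients, key=lambda c: history.get(c, 1.0), reverse=True)
    return ranked[:k]

# Toy training loop illustrating how the pieces fit together.
history = {}
clients = [f"client_{i}" for i in range(100)]
for t in range(100):
    k = selection_size(t, total_rounds=100)
    for c in select_clients(history, clients, k):
        score = random.random()  # stand-in for a real per-round metric
        update_contribution(history, c, score)
```

Note that applying the weighted average recursively each round makes the score an exponential moving average: an `alpha` near 1 emphasizes the current round, while a smaller `alpha` gives historical records more weight and damps single-round randomness.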