Computer science
Leverage (statistics)
Edge device
Artificial intelligence
Enhanced Data Rates for GSM Evolution (EDGE)
Machine learning
Independent and identically distributed random variables
Distributed computing
Data mining
Operating system
Random variable
Mathematics
Cloud computing
Statistics
Authors
Jianqi Liu, Zhiwei Zhao, Xiangyang Luo, Pan Li, Geyong Min, Huiyong Li
Identifier
DOI: 10.1109/tmc.2024.3397585
Abstract
Federated Learning (FL) has been widely used to facilitate distributed and privacy-preserving machine learning in recent years. Unlike centralized training, where all users' data are usually independent and identically distributed (IID), FL suffers from significant communication cost and model performance degradation due to the non-IID data on individual edge devices. Existing work calibrates the local models using a global anchor or by sharing global data. However, these studies either assume that the central server holds a global dataset or require participating devices to share raw data, which incurs additional communication cost and raises privacy concerns. In this paper, we propose SlaugFL, a novel selective GAN-based data augmentation scheme for communication-efficient edge FL, which selects representative devices to share specific local class prototypes with the central server for GAN model training and improves FL performance with the trained GAN. Specifically, on the server side, we generate diverse labeled candidate data with the help of powerful generative models (the stable diffusion model and ChatGPT). To ensure that the GAN-generated data lies in a domain similar to the devices' local data, we leverage the selected local class prototypes to pick desired GAN training samples from the labeled candidate data. On the device side, we propose a dual-calibration approach consisting of two calibration manners. Concretely, we augment devices' non-IID data with the trained GAN model, where devices utilize the trained GAN model to generate an IID dataset, so each device's local model can be directly calibrated with the augmented data. From the generated IID data, we further derive privacy-free (p-f) global class prototypes, which are employed to further calibrate devices' local models. Combining these two calibrations effectively improves devices' local models. Extensive experimental results show that SlaugFL can significantly reduce the communication cost (by up to 52.49%) while achieving the same accuracy, compared to the state-of-the-art work.
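The prototype-guided selection step described in the abstract can be illustrated with a small sketch. The snippet below is a hypothetical, minimal illustration rather than the authors' implementation: it computes per-class mean-feature prototypes and keeps only the server-generated candidate samples whose embeddings are cosine-similar to the matching class prototype, so that the GAN is trained on data close to the devices' local domain. All function and parameter names (class_prototypes, select_gan_training_samples, sim_threshold) are assumptions introduced here for illustration.

# Hypothetical sketch of prototype-guided candidate filtering (not the paper's code).
# Assumes candidate data have already been embedded into a shared feature space.
import numpy as np

def class_prototypes(features: np.ndarray, labels: np.ndarray) -> dict[int, np.ndarray]:
    """Return the mean feature vector per class (a common definition of a class prototype)."""
    return {int(c): features[labels == c].mean(axis=0) for c in np.unique(labels)}

def select_gan_training_samples(
    candidate_feats: np.ndarray,               # (N, d) embeddings of server-generated candidates
    candidate_labels: np.ndarray,              # (N,) labels assigned by the generative pipeline
    device_prototypes: dict[int, np.ndarray],  # class prototypes uploaded by selected devices
    sim_threshold: float = 0.8,                # illustrative cutoff, not a value from the paper
) -> np.ndarray:
    """Keep candidates whose cosine similarity to the matching class prototype exceeds the
    threshold, so the GAN is trained on samples close to the devices' local data domain."""
    keep = []
    for i, (feat, label) in enumerate(zip(candidate_feats, candidate_labels)):
        proto = device_prototypes.get(int(label))
        if proto is None:
            continue
        sim = float(feat @ proto / (np.linalg.norm(feat) * np.linalg.norm(proto) + 1e-12))
        if sim >= sim_threshold:
            keep.append(i)
    return np.asarray(keep, dtype=int)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(100, 16))
    labels = rng.integers(0, 5, size=100)
    protos = class_prototypes(feats, labels)
    idx = select_gan_training_samples(feats, labels, protos, sim_threshold=0.2)
    print(f"selected {len(idx)} of {len(feats)} candidates")

Per the abstract, the same prototype idea appears twice in SlaugFL: once on the server, to filter GAN training samples as sketched above, and once on the device side, where privacy-free global class prototypes derived from the generated IID data serve as an additional calibration target for the local models.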