Computer science
Anomaly detection
Discriminative model
Anomaly (physics)
Ensemble learning
Data mining
Artificial intelligence
Optics (focus)
Machine learning
Task (project management)
Coding (set theory)
Engineering
Physics
Optics
Condensed matter physics
Systems engineering
Set (abstract data type)
Programming language
Authors
Boyu Dong, Dong Chen, Yu Wu, Siliang Tang, Yueting Zhuang
Identifier
DOI: 10.1109/tnnls.2024.3350660
Abstract
With the increasing demand for data privacy, federated learning (FL) has gained popularity across a range of applications. Most existing FL work focuses on classification, overlooking scenarios where anomaly detection may also require privacy preservation. Traditional anomaly detection algorithms cannot be applied directly in the FL setting because of false-detection and missed-detection issues. Moreover, with the aggregation methods commonly used in FL (e.g., averaging model parameters), the global model cannot retain the local models' ability to discriminate anomalies that deviate from local distributions, which further degrades performance. To address these challenges, we propose Federated Anomaly Detection with Noisy Global Density Estimation and Self-supervised Ensemble Distillation (FADngs). Specifically, FADngs aligns knowledge of the data distribution across clients by sharing processed density functions. In addition, FADngs trains local models with an improved contrastive learning scheme that, guided by the shared density functions, learns representations that are more discriminative for anomaly detection. Furthermore, FADngs aggregates these capacities by ensemble distillation, which transfers the knowledge learned from different distributions into the global model. Our experiments demonstrate that the proposed method significantly outperforms state-of-the-art federated anomaly detection methods. We also show empirically that the shared density function is privacy-preserving. The code for the proposed method is available for research purposes at https://github.com/kanade00/Federated_Anomaly_detection.
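To make the two ideas named in the abstract concrete, the following is a minimal, hypothetical sketch in Python/NumPy of (1) a client sharing a noise-perturbed density estimate of its local data and (2) the server forming a global density and ensemble-distillation targets. It assumes a Gaussian kernel density estimate on a shared evaluation grid and additive Gaussian noise as the privacy-preserving "processing"; all function and variable names are illustrative and are not taken from the released FADngs code base or the paper's actual formulation.

```python
# Illustrative sketch only: KDE + additive noise stands in for whatever
# "processed density function" FADngs actually shares, and score averaging
# stands in for its ensemble-distillation objective.
import numpy as np


def client_density_share(local_data, grid, bandwidth=0.5, noise_std=0.05):
    """Estimate a local density on a fixed grid and perturb it with noise."""
    diffs = grid[:, None] - local_data[None, :]                   # (grid, samples)
    kernel = np.exp(-0.5 * (diffs / bandwidth) ** 2)
    density = kernel.mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))
    noisy = density + np.random.normal(0.0, noise_std, size=density.shape)
    return np.clip(noisy, 0.0, None)                              # keep densities non-negative


def server_global_density(client_densities):
    """Average the noisy per-client densities into a global estimate."""
    return np.mean(client_densities, axis=0)


def ensemble_distillation_targets(client_scores):
    """Targets a global model could be trained to match: the averaged
    anomaly scores produced by the ensemble of client models."""
    return np.mean(client_scores, axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = np.linspace(-4.0, 4.0, 101)                            # shared 1-D evaluation grid (toy case)
    clients = [rng.normal(-1.0, 0.7, 500), rng.normal(1.5, 0.5, 500)]
    shared = [client_density_share(x, grid) for x in clients]
    global_density = server_global_density(shared)
    # Low global density at a point suggests an anomaly under the aligned view.
    print("density near x=0.0:", global_density[np.argmin(np.abs(grid - 0.0))])
    print("density near x=3.9:", global_density[np.argmin(np.abs(grid - 3.9))])
```

In this toy setting the shared object is a noisy vector of density values rather than raw samples, which is the intuition behind the abstract's claim that the shared density function is privacy-preserving; the actual privacy argument and training objectives are given in the paper and repository.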