Computer Science
Federated Learning
Domain (mathematical analysis)
Transfer Learning
Guard (computer science)
Artificial Intelligence
Machine Learning
Distributed Computing
Mathematics
Mathematical Analysis
Programming Language
Authors
Yang Liu, Yan Kang, Chaoping Xing, Tianjian Chen, Qiang Yang
Source
Journal: IEEE Intelligent Systems
[Institute of Electrical and Electronics Engineers]
Date: 2020-04-22
Volume/Issue: 35 (4): 70-82
Citations: 464
Identifier
DOI:10.1109/mis.2020.2988525
Abstract
Machine learning relies on the availability of vast amounts of data for training. However, in reality, data are mostly scattered across different organizations and cannot be easily integrated due to many legal and practical constraints. To address this important challenge in the field of machine learning, we introduce a new technique and framework, known as federated transfer learning (FTL), to improve statistical modeling under a data federation. FTL allows knowledge to be shared without compromising user privacy and enables complementary knowledge to be transferred across domains in a data federation, thereby enabling a target-domain party to build flexible and effective models by leveraging rich labels from a source domain. This framework requires minimal modifications to the existing model structure and provides the same level of accuracy as non-privacy-preserving transfer learning. It is flexible and can be effectively adapted to various secure multiparty machine learning tasks.
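The abstract describes the FTL setting only at a high level. As a rough illustration of the core idea (two parties map their features into a shared latent space, align it on their overlapping samples, and let the classifier trained on the label-rich source party transfer to the target party), here is a minimal sketch in Python. All data, shapes, and variable names are hypothetical, and the sketch deliberately omits the security layer: in the actual framework the exchanged intermediate quantities would be protected (e.g., with additively homomorphic encryption or secret sharing), not computed in the clear as here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup -- all names, shapes, and data here are illustrative,
# not taken from the paper.
n_a, n_b, n_overlap = 200, 150, 50      # sample counts: source, target, shared
d_a, d_b, d_latent = 20, 30, 8          # feature dims and latent dim

X_a = rng.normal(size=(n_a, d_a))       # party A (source domain): features
y_a = np.sign(rng.normal(size=n_a))     # party A: binary labels in {-1, +1}
X_b = rng.normal(size=(n_b, d_b))       # party B (target domain): no labels

# Overlapping samples both parties hold (found e.g. via private set intersection).
idx_a = np.arange(n_overlap)
idx_b = np.arange(n_overlap)

# Each party keeps a private linear map into a common latent space.
W_a = rng.normal(scale=0.1, size=(d_a, d_latent))
W_b = rng.normal(scale=0.1, size=(d_b, d_latent))

lr = 0.01
for step in range(500):
    U_a = X_a @ W_a                     # source-domain latent features
    U_b = X_b @ W_b                     # target-domain latent features

    # Alignment: latents of the overlapping samples should coincide.
    diff = U_a[idx_a] - U_b[idx_b]

    # Prediction: a linear scorer on A's latents should recover A's labels
    # (refit each step; its dependence on W_a is ignored for simplicity).
    w = np.linalg.lstsq(U_a, y_a, rcond=None)[0]
    resid = U_a @ w - y_a

    # Plain-text gradient steps. In the paper's protocol the exchanged
    # quantities would be protected (e.g. homomorphic encryption or
    # secret sharing) rather than shared in the clear.
    grad_a = X_a[idx_a].T @ diff / n_overlap + X_a.T @ np.outer(resid, w) / n_a
    grad_b = -X_b[idx_b].T @ diff / n_overlap
    W_a -= lr * grad_a
    W_b -= lr * grad_b

# Transfer: B labels its own data with the classifier learned from A's labels.
y_b_pred = np.sign((X_b @ W_b) @ w)
print("first ten predicted target labels:", y_b_pred[:10])
```

In this toy version the alignment term on the overlapping samples plays the role of the federation: neither party ever sees the other's raw features, only latent representations of the shared samples, which is what the secure protocol in the paper would encrypt or secret-share.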