Computer science
Transformer
Adapter (computing)
IBM
Training set
Differential privacy
Machine learning
Artificial intelligence
Artificial neural network
Data mining
Voltage
Computer hardware
Engineering
Electrical engineering
Nanotechnology
Materials science
Authors
Xilong Wang, Chia-Mu Yu, Pin-Yu Chen
Source
Venue: arXiv (Cornell University)
Date: 2023-01-01
Identifier
DOI: 10.48550/arxiv.2309.06526
Abstract
For machine learning with tabular data, the Table Transformer (TabTransformer) is a state-of-the-art neural network model, while Differential Privacy (DP) is an essential component for ensuring data privacy. In this paper, we explore the benefits of combining these two aspects in a transfer-learning scenario: differentially private pre-training and fine-tuning of TabTransformers with a variety of parameter-efficient fine-tuning (PEFT) methods, including Adapter, LoRA, and Prompt Tuning. Our extensive experiments on the ACSIncome dataset show that these PEFT methods outperform traditional approaches in both downstream-task accuracy and the number of trainable parameters, achieving an improved trade-off among parameter efficiency, privacy, and accuracy. Our code is available at github.com/IBM/DP-TabTransformer.
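To make the described setup concrete, the following is a minimal sketch of differentially private parameter-efficient fine-tuning in the spirit of the abstract: a frozen pre-trained backbone is adapted with small trainable LoRA layers and trained under DP-SGD via the Opacus PrivacyEngine. This is not the authors' released code (that lives at github.com/IBM/DP-TabTransformer); the MLP backbone standing in for TabTransformer, the synthetic data standing in for ACSIncome, the LoRA rank, and all hyperparameters are illustrative assumptions.

```python
# Hedged sketch: DP-SGD fine-tuning of LoRA adapters on a frozen backbone.
# All model sizes, data, and hyperparameters below are illustrative, not
# taken from the paper.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine


class LoRALinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank (LoRA) update."""

    def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():       # freeze pre-trained weights
            p.requires_grad = False
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)     # zero update at initialization
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * self.lora_b(self.lora_a(x))


# Hypothetical stand-in for a pre-trained TabTransformer backbone:
# a small MLP classifier whose linear layers are wrapped with LoRA.
backbone = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 2))
backbone[0] = LoRALinear(backbone[0])
backbone[2] = LoRALinear(backbone[2])

# Only the LoRA parameters are trainable.
params = [p for p in backbone.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(params, lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Synthetic tabular data in place of ACSIncome, for illustration only.
x = torch.randn(512, 32)
y = torch.randint(0, 2, (512,))
loader = DataLoader(TensorDataset(x, y), batch_size=64)

# Attach DP-SGD: per-sample gradient clipping plus Gaussian noise.
privacy_engine = PrivacyEngine()
backbone, optimizer, loader = privacy_engine.make_private(
    module=backbone,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.0,   # illustrative noise level
    max_grad_norm=1.0,      # per-sample clipping bound
)

for xb, yb in loader:       # one illustrative epoch
    optimizer.zero_grad()
    loss = criterion(backbone(xb), yb)
    loss.backward()
    optimizer.step()

print(f"epsilon spent: {privacy_engine.get_epsilon(delta=1e-5):.2f}")
```

Because only the low-rank adapters receive gradients, both the per-sample gradient computation required by DP-SGD and the noise injection touch far fewer parameters than full fine-tuning, which is the parameter-efficiency/privacy/accuracy trade-off the abstract refers to.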