Dual adversarial network with meta-learning for domain-generalized few-shot text classification

Keywords: Discriminator, Computer science, Artificial intelligence, Adversarial training, Meta-learning (computer science), Machine learning, Domain, Leverage (statistics), Dual, Generalization, Generator, Pattern recognition, Task, Mathematics, Detector
Authors
Xuyang Wang, Yajun Du, Danroujing Chen, Xianyong Li, Xiaoliang Chen, Yongquan Fan, Chunzhi Xie, Yanli Li, Jia Liu, Hui Li
Source
Journal: Applied Soft Computing [Elsevier]
Volume/Issue: 146: 110697-110697 | Citations: 1
Identifier
DOI: 10.1016/j.asoc.2023.110697
Abstract

Meta-learning-based methods prevail in few-shot text classification. Current methods perform meta-training and meta-testing on two parts of a dataset drawn from the same or similar domains. This significantly degrades model performance when the test data come from a different domain and limits the generalization of few-shot models. To solve this problem, this study proposes a new setting, namely, domain-generalized few-shot text classification. First, meta-training is conducted on a multi-domain dataset to learn a generalizable model. Subsequently, the model is meta-tested on a target dataset. In addition, a domain-generalized model, namely, a dual adversarial network, is designed to improve meta-learning-based methods under domain drift across datasets and domains. Unlike previous meta-learning methods, the dual adversarial network receives two N-way K-shot tasks from different domains at each episode. It leverages the features from the two domains for adversarial training to improve the domain adaptability of the model. The proposed model utilizes a domain-knowledge generator during adversarial training to produce domain-specific knowledge, and a domain discriminator to recognize the domain label of the produced knowledge. Extensive experiments are conducted to verify the effectiveness of the proposed setting and model. The experimental results show that model performance in the proposed setting improves by an average of 3.84% compared to that in cross-domain few-shot text classification. Furthermore, the dual adversarial network significantly outperforms five competitive baseline models, with an average improvement of 7.20%, and achieves an average performance improvement of 2.69% over the best baseline method.
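The abstract outlines an episodic training scheme: each episode draws two N-way K-shot tasks from two different domains, a domain-knowledge generator produces domain-specific knowledge from the encoded text, and a domain discriminator tries to identify which domain that knowledge came from. The sketch below shows how such a dual adversarial episode could look in PyTorch. It is a minimal reading of the abstract, not the authors' implementation: the module definitions (DomainKnowledgeGenerator, DomainDiscriminator), the text encoder interface, the prototypical-network classification loss, and the negated-cross-entropy adversarial term are all illustrative assumptions.

```python
# Minimal sketch of one dual adversarial episode, assuming a PyTorch setup.
# All names and hyperparameters here are hypothetical, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DomainKnowledgeGenerator(nn.Module):
    """Hypothetical generator: maps encoded text features to domain-specific knowledge."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x):
        return self.net(x)


class DomainDiscriminator(nn.Module):
    """Hypothetical discriminator: predicts which source domain the knowledge came from."""
    def __init__(self, dim, n_domains=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, n_domains))

    def forward(self, x):
        return self.net(x)


def prototype_loss(support, support_labels, query, query_labels, n_way):
    """Prototypical-network-style few-shot loss (one common meta-learning choice)."""
    protos = torch.stack([support[support_labels == c].mean(0) for c in range(n_way)])
    logits = -torch.cdist(query, protos)  # negative distance to each prototype as class score
    return F.cross_entropy(logits, query_labels)


def train_episode(encoder, generator, discriminator, task_a, task_b,
                  opt_model, opt_disc, n_way):
    """One episode: two N-way K-shot tasks sampled from two different domains.
    `encoder` is any text encoder returning fixed-size embeddings of shape (batch, dim)."""
    model_losses, disc_losses = [], []
    for domain_id, task in enumerate((task_a, task_b)):
        sup_x, sup_y, qry_x, qry_y = task
        sup_feat, qry_feat = encoder(sup_x), encoder(qry_x)

        # Few-shot classification loss on the query set of this task.
        model_losses.append(prototype_loss(sup_feat, sup_y, qry_feat, qry_y, n_way))

        # Generator produces domain-specific knowledge from the query features.
        knowledge = generator(qry_feat)
        domain_labels = torch.full((knowledge.size(0),), domain_id, dtype=torch.long)

        # Discriminator is trained to recognize the true domain of the produced knowledge.
        disc_losses.append(F.cross_entropy(discriminator(knowledge.detach()), domain_labels))

        # Adversarial term: encoder/generator try to make the knowledge
        # domain-indistinguishable (here expressed as a negated cross-entropy).
        model_losses.append(-F.cross_entropy(discriminator(knowledge), domain_labels))

    # Update the encoder + generator first, then the discriminator.
    opt_model.zero_grad()
    sum(model_losses).backward()
    opt_model.step()

    opt_disc.zero_grad()
    sum(disc_losses).backward()
    opt_disc.step()
```

In this sketch, opt_model would hold the encoder and generator parameters and opt_disc only the discriminator parameters; the paper's actual loss formulation and optimization schedule may differ.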