Improved Transformer With Multi-Head Dense Collaboration

Keywords: Computer Science, Transformer, Inference, Artificial Intelligence, Machine Translation, Language Understanding, Natural Language Processing, Machine Learning
Authors
Huadong Wang, Xin Shen, Mei Tu, Yimeng Zhuang, Zhiyuan Liu
Source
Journal: IEEE/ACM Transactions on Audio, Speech, and Language Processing [Institute of Electrical and Electronics Engineers]
Volume 30, pp. 2754-2767. Cited by: 7
Identifier
DOI: 10.1109/taslp.2022.3199648
Abstract

Recently, attention mechanisms have boosted the performance of many neural network models in Natural Language Processing (NLP). Among the various attention mechanisms, Multi-Head Attention (MHA) is a powerful and popular variant. MHA helps the model attend to different feature subspaces independently, which is an essential component of Transformer. Despite its success, we conjecture that the different heads of the existing MHA may not collaborate properly. To validate this assumption and further improve the performance of Transformer, we study the collaboration problem for MHA in this paper. First, we propose the Single-Layer Collaboration (SLC) mechanism, which helps each attention head improve its attention distribution based on feedback from the other heads. We then extend SLC to the cross-layer Multi-Head Dense Collaboration (MHDC) mechanism. MHDC helps each MHA layer learn attention distributions that take into account knowledge from the other MHA layers. Both SLC and MHDC are implemented as lightweight modules with very few additional parameters. When equipped with these modules, our new framework, the Collaborative TransFormer (CollFormer), significantly outperforms the vanilla Transformer on a range of NLP tasks, including machine translation, sentence semantic relatedness, natural language inference, sentence classification, and reading comprehension. In addition, we carry out extensive quantitative experiments to analyze the properties of MHDC in different settings. The experimental results validate the effectiveness and universality of both MHDC and CollFormer.
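The abstract describes SLC only at a high level: each head refines its attention distribution using feedback from the other heads. A minimal sketch of that idea, assuming the collaboration takes the form of a learned head-mixing matrix applied to the per-head attention logits before the softmax (the paper's exact formulation is not given here, so `w_collab` and `collaborative_attention` are hypothetical names and an illustrative design, not the authors' implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def collaborative_attention(q, k, v, w_collab):
    """Multi-head attention with a hypothetical single-layer collaboration step.

    q, k, v: arrays of shape (heads, seq, d_head)
    w_collab: (heads, heads) mixing matrix; row i weights how much head i
              listens to every head's raw attention scores.
    """
    h, n, d = q.shape
    # Per-head scaled dot-product scores, shape (heads, seq, seq).
    logits = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    # Collaboration: each head's logits become a weighted combination of
    # all heads' logits before normalization.
    mixed = np.einsum("ij,jmn->imn", w_collab, logits)
    attn = softmax(mixed, axis=-1)
    return attn @ v

h, n, d = 4, 5, 8
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((h, n, d)) for _ in range(3))
# Identity mixing recovers vanilla MHA; small off-diagonal terms let
# heads exchange information, at a cost of only h*h extra parameters.
w = np.eye(h) + 0.1 * rng.standard_normal((h, h))
out = collaborative_attention(q, k, v, w)
print(out.shape)  # (4, 5, 8)
```

With `w = np.eye(h)` this reduces exactly to independent per-head attention, which matches the abstract's claim that the collaboration modules are lightweight additions on top of standard MHA rather than a replacement for it.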