Accumulated Trivial Attention Matters in Vision Transformers on Small Datasets

Authors
Xiangyu Chen, Qinghao Hu, Kaidong Li, Cuncong Zhong, Guanghui Wang
Identifier
DOI: 10.1109/wacv56688.2023.00397
Abstract

Vision Transformers have demonstrated competitive performance on computer vision tasks, benefiting from their ability to capture long-range dependencies with multi-head self-attention modules and multi-layer perceptrons. However, computing global attention brings a disadvantage compared with convolutional neural networks: it requires far more data and computation to converge, which makes it difficult to generalize well on the small datasets common in practical applications. Previous works focus either on transferring knowledge from large datasets or on adjusting the architecture for small datasets. After carefully examining the self-attention modules, we discover that the number of trivial attention weights far exceeds the number of important ones, and that, owing to their sheer quantity, the accumulated trivial weights dominate the attention in Vision Transformers in a way the attention mechanism itself does not handle. This can drown out useful non-trivial attention and harm performance when the trivial attention carries more noise, e.g., in the shallow layers of some backbones. To address this issue, we propose dividing attention weights into trivial and non-trivial ones by thresholding, then Suppressing the Accumulated Trivial Attention (SATA) weights with the proposed Trivial WeIghts Suppression Transformation (TWIST) to reduce attention noise. Extensive experiments on the CIFAR-100 and Tiny-ImageNet datasets show that our suppression method boosts the accuracy of Vision Transformers by up to 2.3%. Code is available at https://github.com/xiangyu8/SATA.
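The abstract describes the idea of splitting post-softmax attention weights into trivial and non-trivial sets by a threshold and then suppressing the accumulated trivial mass, but it does not give the exact form of TWIST. The sketch below is therefore only a minimal illustration under assumed details: trivial weights (below a hypothetical threshold) are scaled by an assumed factor `alpha` and each attention row is renormalized; the function name, `threshold`, and `alpha` are illustrative, not the paper's specification.

```python
import numpy as np

def suppress_trivial_attention(attn, threshold=0.02, alpha=0.0):
    """Illustrative suppression of trivial attention weights.

    attn: (num_queries, num_keys) post-softmax attention; each row sums to 1.
    threshold: weights below this value are treated as trivial (assumed).
    alpha: scale applied to trivial weights (assumed; the paper's TWIST
           transformation is not specified in the abstract).
    """
    trivial = attn < threshold
    # downscale the trivial weights, keep the non-trivial ones unchanged
    out = np.where(trivial, attn * alpha, attn)
    # renormalize each row so it remains a valid attention distribution
    return out / out.sum(axis=-1, keepdims=True)

# toy example: one query attending over 8 keys; the six 0.05 entries
# jointly hold 0.30 of the attention mass despite each being small
attn = np.array([[0.40, 0.30, 0.05, 0.05, 0.05, 0.05, 0.05, 0.05]])
out = suppress_trivial_attention(attn, threshold=0.1, alpha=0.0)
```

With `alpha=0.0` the six trivial weights are zeroed and the remaining mass is redistributed to the two dominant keys, illustrating how a large number of individually small weights can otherwise dilute the important attention.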
