SA-Net: Shuffle Attention for Deep Convolutional Neural Networks

Authors
Qinglong Zhang,Yu-Bin Yang
Identifier
DOI:10.1109/icassp39728.2021.9414568
Abstract

Attention mechanisms, which enable a neural network to focus accurately on the relevant elements of its input, have become an essential component for improving the performance of deep neural networks. Two attention mechanisms are widely used in computer vision: spatial attention and channel attention, which capture pixel-level pairwise relationships and channel dependencies, respectively. Although fusing the two can achieve better performance than either alone, it inevitably increases computational overhead. In this paper, we propose an efficient Shuffle Attention (SA) module to address this issue, which adopts Shuffle Units to combine the two types of attention effectively. Specifically, SA first groups the channel dimension into multiple sub-features and processes them in parallel. For each sub-feature, SA then uses a Shuffle Unit to capture feature dependencies in both the spatial and channel dimensions. Finally, all sub-features are aggregated, and a "channel shuffle" operator is applied to enable information exchange between different sub-features. The proposed SA module is efficient yet effective: against a ResNet50 backbone, SA adds 300 parameters (vs. 25.56M) and 2.76e-3 GFLOPs (vs. 4.12 GFLOPs), while boosting Top-1 accuracy by more than 1.34%. Extensive experiments on commonly used benchmarks, including ImageNet-1k for classification and MS COCO for object detection and instance segmentation, demonstrate that SA significantly outperforms current state-of-the-art methods, achieving higher accuracy with lower model complexity.
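The two operations central to the abstract — grouping channels into sub-features and the "channel shuffle" that lets sub-features exchange information afterwards — can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: `channel_gate` is a simplified stand-in for the channel-attention branch, with scalar `weight` and `bias` replacing the learned per-channel affine parameters described in the paper, and the spatial branch is omitted.

```python
import numpy as np

def channel_shuffle(x: np.ndarray, groups: int) -> np.ndarray:
    """Interleave channels across groups so sub-features can exchange information."""
    n, c, h, w = x.shape
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)  # swap the group axis and the per-group channel axis
    return x.reshape(n, c, h, w)

def channel_gate(x: np.ndarray, weight: float = 1.0, bias: float = 0.0) -> np.ndarray:
    """Illustrative channel-attention gate: global average pooling per channel,
    a scalar affine transform (stand-in for the learned parameters), then a
    sigmoid re-weighting of the input."""
    s = x.mean(axis=(2, 3), keepdims=True)              # (N, C, 1, 1) channel statistics
    gate = 1.0 / (1.0 + np.exp(-(weight * s + bias)))   # sigmoid in (0, 1)
    return x * gate
```

For a 4-channel input shuffled with 2 groups, channels [0, 1, 2, 3] come out in the order [0, 2, 1, 3]: each output position mixes channels that originated in different groups, which is what allows the independently processed sub-features to communicate after aggregation.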