Parameterized complexity
Computer science
Channel (broadcasting)
Convolutional neural network
Convolution (computer science)
Inference
Scale (ratio)
Margin (machine learning)
Kernel (algebra)
Algorithm
Theoretical computer science
Artificial intelligence
Computer engineering
Artificial neural network
Machine learning
Mathematics
Telecommunications
Combinatorics
Physics
Quantum mechanics
Authors
Yincong Wang, Shoubiao Tan, Chunyu Peng
Identifier
DOI: 10.1109/cipcv58883.2023.00009
Abstract
In this paper, a compact and effective attention module for convolutional neural networks (CNNs), called the re-parameterized global channel interaction attention module (GCIAM), is presented to achieve better communication among channels. Firstly, we propose the global channel receptive field (GCRF) to measure the number of adjacent channels participating in inter-channel interactions. Secondly, we adopt multi-scale 1-D convolutions to achieve better coverage of the GCRF. Specifically, our method aggregates information along the channel dimension and then encodes cross-channel relationships via multi-scale 1-D convolutions consisting of large and small kernels, which enables easier access to interactions at different scales from a large GCRF and significantly reduces model complexity. Finally, we develop a structural re-parameterization technique for multi-scale 1-D convolutions to boost inference speed. Extensive experiments on standard benchmarks verify the superiority of our proposed method over various state-of-the-art (SOTA) counterparts. Our method consistently outperforms other attention mechanisms by a large margin at almost no extra cost.
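The page does not include the authors' code. As a rough illustration of the mechanism the abstract describes (channel-wise global pooling, parallel large- and small-kernel 1-D convolutions over the channel axis, and merging the branches into a single kernel for inference), here is a minimal PyTorch sketch. The class name MultiScaleChannelAttention, the kernel sizes 9 and 3, and the reparameterize helper are illustrative assumptions, not the paper's GCIAM implementation.

import torch
import torch.nn as nn

class MultiScaleChannelAttention(nn.Module):
    """Hypothetical sketch: multi-scale 1-D channel attention with
    structural re-parameterization (not the authors' GCIAM code)."""

    def __init__(self, large_k=9, small_k=3):
        super().__init__()
        assert large_k % 2 == 1 and small_k % 2 == 1 and small_k < large_k
        self.large_k, self.small_k = large_k, small_k
        # Parallel 1-D convolutions applied across the channel axis.
        self.conv_large = nn.Conv1d(1, 1, large_k, padding=large_k // 2, bias=False)
        self.conv_small = nn.Conv1d(1, 1, small_k, padding=small_k // 2, bias=False)
        self.merged = None  # set by reparameterize() for inference

    def forward(self, x):
        # x: (B, C, H, W); pool to one descriptor per channel -> (B, 1, C).
        y = x.mean(dim=(2, 3)).unsqueeze(1)
        if self.merged is not None:
            y = self.merged(y)  # single merged conv at inference time
        else:
            y = self.conv_large(y) + self.conv_small(y)  # multi-scale branches
        w = torch.sigmoid(y).transpose(1, 2).unsqueeze(-1)  # (B, C, 1, 1)
        return x * w

    @torch.no_grad()
    def reparameterize(self):
        # Zero-pad the small kernel to the large size and add it to the
        # large kernel, so one convolution reproduces both branches exactly.
        pad = (self.large_k - self.small_k) // 2
        w = self.conv_large.weight.clone()  # shape (1, 1, large_k)
        w[..., pad:self.large_k - pad] += self.conv_small.weight
        merged = nn.Conv1d(1, 1, self.large_k, padding=self.large_k // 2, bias=False)
        merged.weight.copy_(w)
        self.merged = merged

A quick check that the merge is exact, under the same assumptions:

m = MultiScaleChannelAttention()
x = torch.randn(2, 64, 32, 32)
a = m(x)                # two-branch (training-time) path
m.reparameterize()
b = m(x)                # single merged conv
print(torch.allclose(a, b, atol=1e-6))  # True

Because convolution is linear in its kernel, summing the zero-padded small kernel into the large one preserves the output while halving the number of convolutions at inference, which is the speed benefit the abstract attributes to structural re-parameterization.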