Pooling
Computer science
Segmentation
Scale (ratio)
Artificial intelligence
Pattern recognition (psychology)
Convolution (computer science)
Range (aeronautics)
Fuse (electrical)
Generalization
Net (polyhedron)
Artificial neural network
Feature (linguistics)
Mathematics
Mathematical analysis
Physics
Quantum mechanics
Linguistics
Philosophy
Materials science
Geometry
Electrical engineering
Composite material
Engineering
Authors
Chi Zhang,Jingben Lu,Qianqian Hua,Chunguo Li,Pengwei Wang
Identifiers
DOI:10.1016/j.bspc.2021.103460
Abstract
In liver tumor segmentation, two problems significantly affect accuracy: handling targets at multiple scales and modeling global spatial context. For multi-scale feature extraction, we propose a dynamic scale attention mechanism that assigns adaptive weights to multi-scale convolutions; this Scale Attention fuses receptive fields from multiple scales, which benefits the segmentation of multi-scale targets. For global modeling of spatial information, we propose Axis Attention, which simultaneously improves the computational resource utilization of self-attention and the attentive effect of convolutional attention, modeling spatial long-range dependencies both effectively and efficiently. Scale Attention and Axis Attention are combined through adaptive global pooling into a composite mechanism called Scale-Axis-Attention (SAA). We incorporate SAA into a U-shaped network, termed SAA-Net, to improve liver tumor segmentation. Our method is not only far superior to self-attention in computational resource utilization, but also incorporates scale and spatial attention mechanisms simultaneously to improve performance. Extensive qualitative and quantitative experiments show that SAA-Net achieves improved model capability and generalization; they also demonstrate the effectiveness of our method in segmenting small tumors.
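The core idea behind Scale Attention, as described in the abstract, is to weight several parallel convolution branches (different receptive fields) by adaptive coefficients derived from a global pooling of the features. The sketch below illustrates that fusion step only; it is a minimal NumPy interpretation, not the authors' implementation, and the function name `scale_attention_fuse` and the projection matrix `w` standing in for the learned attention layer are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scale_attention_fuse(branches, w):
    """Fuse multi-scale feature maps with adaptive per-scale weights.

    branches: list of S arrays, each shaped (C, H, W) -- the outputs of
              S convolutions with different kernel sizes / receptive fields.
    w:        (S, S*C) projection mapping the pooled descriptor to one
              logit per scale (stand-in for the learned attention layer).
    Returns the fused (C, H, W) map and the S softmax weights.
    """
    # Global average pooling of every branch -> one descriptor of length S*C.
    desc = np.concatenate([b.mean(axis=(1, 2)) for b in branches])
    weights = softmax(w @ desc)  # one adaptive weight per scale, summing to 1
    fused = sum(wi * b for wi, b in zip(weights, branches))
    return fused, weights

# Toy usage: two scales, 2 channels, 4x4 maps. With w = 0 the logits are
# equal, so both scales get weight 0.5 and the fusion is a plain average.
branches = [np.ones((2, 4, 4)), 2 * np.ones((2, 4, 4))]
fused, weights = scale_attention_fuse(branches, np.zeros((2, 4)))
```

In a trained network `w` would be learned, so the weighting adapts per input: feature maps whose pooled statistics indicate large targets can up-weight the large-kernel branch, and vice versa for small tumors.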