Interlaced Sparse Self-Attention for Semantic Segmentation
Topics
Computer science, Segmentation, Feature (linguistics), Artificial intelligence, Pattern recognition (psychology), Interval (graph theory), Contrast (vision), Matrix (chemical analysis), Computation, Algorithm, Mathematics, Combinatorics, Philosophy, Linguistics, Materials science, Composite material
Authors
Lang Huang, Yuhui Yuan, Jianyuan Guo, Chao Zhang, Xilin Chen, Jingdong Wang
Source
Venue: arXiv (Cornell University)
Date: 2019-07
Citations: 9
Identifier
DOI: 10.48550/arxiv.1907.12273
Abstract
In this paper, we present an interlaced sparse self-attention approach to improve the efficiency of the self-attention mechanism for semantic segmentation. The main idea is to factorize the dense affinity matrix as the product of two sparse affinity matrices, estimated by two successive attention modules. The first attention module estimates the affinities within subsets of positions separated by long spatial intervals, and the second estimates the affinities within subsets of positions separated by short spatial intervals. The two modules are designed so that each position can receive information from all other positions. In contrast to the original self-attention module, our approach substantially decreases the computation and memory complexity, especially when processing high-resolution feature maps. We empirically verify the effectiveness of our approach on six challenging semantic segmentation benchmarks.
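The factorization described in the abstract maps naturally onto a pair of reshape-and-permute steps around two ordinary self-attention modules: the long-range step groups positions sampled with a large stride, and the short-range step groups positions within each local block. Below is a minimal, hypothetical PyTorch-style sketch of this idea, not the authors' reference implementation; the class names, the block sizes Qh and Qw, and the assumption that H and W divide evenly by them are all illustrative.

```python
# Hypothetical sketch of interlaced sparse self-attention (ISA).
# Assumes a PyTorch environment; names and block sizes are illustrative.
import torch
import torch.nn as nn

class SelfAttention2d(nn.Module):
    """Plain self-attention over all positions of an (N, C, H, W) map."""
    def __init__(self, channels, key_channels):
        super().__init__()
        self.query = nn.Conv2d(channels, key_channels, 1)
        self.key = nn.Conv2d(channels, key_channels, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.scale = key_channels ** -0.5

    def forward(self, x):
        n, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)      # (N, HW, Ck)
        k = self.key(x).flatten(2)                        # (N, Ck, HW)
        v = self.value(x).flatten(2).transpose(1, 2)      # (N, HW, C)
        attn = torch.softmax(q @ k * self.scale, dim=-1)  # dense affinity
        return (attn @ v).transpose(1, 2).reshape(n, c, h, w)

class InterlacedSparseSelfAttention(nn.Module):
    """Factorize dense attention into long-range + short-range sparse steps.

    With H = Ph*Qh and W = Pw*Qw, the long-range module attends among
    positions sampled with stride (Qh, Qw), and the short-range module
    attends within each local (Qh, Qw) block, so information can
    propagate from every position to every other position.
    """
    def __init__(self, channels, key_channels, qh=8, qw=8):
        super().__init__()
        self.qh, self.qw = qh, qw
        self.long_range = SelfAttention2d(channels, key_channels)
        self.short_range = SelfAttention2d(channels, key_channels)

    def forward(self, x):
        n, c, h, w = x.shape
        qh, qw = self.qh, self.qw
        ph, pw = h // qh, w // qw  # assumes H % Qh == 0 and W % Qw == 0

        # Long-range step: group far-apart positions (those sharing the
        # same within-block offset) and attend within each group.
        t = x.reshape(n, c, ph, qh, pw, qw)
        t = t.permute(0, 3, 5, 1, 2, 4).reshape(n * qh * qw, c, ph, pw)
        t = self.long_range(t)
        t = t.reshape(n, qh, qw, c, ph, pw).permute(0, 3, 4, 1, 5, 2)
        x = t.reshape(n, c, h, w)

        # Short-range step: group nearby positions (each Qh x Qw local
        # block) and attend within each block.
        t = x.reshape(n, c, ph, qh, pw, qw)
        t = t.permute(0, 2, 4, 1, 3, 5).reshape(n * ph * pw, c, qh, qw)
        t = self.short_range(t)
        t = t.reshape(n, ph, pw, c, qh, qw).permute(0, 3, 1, 4, 2, 5)
        return t.reshape(n, c, h, w)
```

Under these assumptions, each long-range attention spans only Ph*Pw positions and each short-range attention only Qh*Qw positions, so the affinity cost falls from O((HW)^2) for dense self-attention to roughly O(HW * (Ph*Pw + Qh*Qw)), which is the efficiency gain the abstract refers to.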