Computer science
Fuse (electrical)
Residue
Feature (linguistics)
Block (permutation group theory)
Convolution (computer science)
Encoder
Feature extraction
Artificial intelligence
Layer (electronics)
Pattern recognition (psychology)
Net (polyhedron)
Algorithm
Artificial neural network
Mathematics
Engineering
Linguistics
Philosophy
Chemistry
Geometry
Organic chemistry
Electrical engineering
Operating system
Authors
Won Young Chung,In Ho Lee,Chan Gook Park
Identifiers
DOI:10.1109/lgrs.2023.3276326
Abstract
To improve detection performance in U-net-based infrared small target detection (IRSTD) algorithms, it is crucial to fuse low-level and high-level features. Conventional algorithms perform feature fusion by adding convolution layers to the skip pathways of the U-net and by connecting the skip connections densely. However, the added convolution operations increase the number of network parameters, and the inference time increases accordingly. Therefore, in this letter, a UNet3+-based full-scale skip-connection U-net is used as the base network to lower the computational cost by fusing features with a small number of parameters. Moreover, we propose an effective encoder and decoder structure for improved IRSTD performance. A residual attention block is applied to each layer of the encoder for effective feature extraction. In the decoder, a residual attention block is applied to the feature fusion section to effectively fuse the hierarchical information obtained from each layer. In addition, learning is performed through full-scale deep supervision so that all the information obtained from each layer is reflected. The proposed algorithm, coined Attention Multiscale feature Fusion U-net (AMFU-net), hence guarantees effective target detection performance with a lightweight structure (mIoU: 0.7512, FPS: 86.1). A PyTorch implementation is available at: github.com/cwon789/AMFU-net.
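The abstract does not spell out the internal form of the residual attention block; one common pattern it could follow is squeeze-and-excitation-style channel attention with a residual shortcut. The minimal NumPy sketch below illustrates that pattern only; all function and variable names are illustrative assumptions, not taken from the authors' code (see the GitHub repository above for the actual implementation).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def residual_attention_block(x, w1, w2):
    """Illustrative residual channel-attention (SE-style) block.

    x  : feature map of shape (C, H, W)
    w1 : (C//r, C) squeeze weights (r = reduction ratio)
    w2 : (C, C//r) excitation weights
    Convolutions are omitted; only the attention + residual path is shown.
    """
    # Squeeze: global average pooling over the spatial dims -> (C,)
    s = x.mean(axis=(1, 2))
    # Excitation: two-layer bottleneck producing per-channel gates in (0, 1)
    gates = sigmoid(w2 @ np.maximum(w1 @ s, 0.0))
    # Recalibrate channels, then add the identity (residual) path
    return x + x * gates[:, None, None]

# Toy usage: 8 channels, reduction ratio 2, 16x16 feature map
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 16, 16))
w1 = rng.standard_normal((4, 8)) * 0.1
w2 = rng.standard_normal((8, 4)) * 0.1
y = residual_attention_block(x, w1, w2)
print(y.shape)  # (8, 16, 16)
```

Because the gates lie in (0, 1) and the identity path is added back, the block can suppress uninformative channels without ever zeroing out the input, which keeps gradients flowing through deep encoders.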