TTST: A Top-k Token Selective Transformer for Remote Sensing Image Super-Resolution

Authors
Yi Xiao, Qiangqiang Yuan, Kui Jiang, Jiang He, Chia-Wen Lin, Liangpei Zhang
Source
Journal: IEEE Transactions on Image Processing (Institute of Electrical and Electronics Engineers)
Volume/Issue: 1-1 · Citations: 3
Identifier
DOI: 10.1109/tip.2023.3349004
Abstract

Transformer-based methods have demonstrated promising performance in image super-resolution tasks, thanks to their long-range and global aggregation capability. However, existing Transformers face two critical challenges when applied to large-area earth observation scenes: (1) redundant token representation, since most tokens are irrelevant; and (2) single-scale representation, which ignores the scale correlation of similar ground observation targets. To this end, this paper proposes to adaptively eliminate the interference of irrelevant tokens for a more compact self-attention calculation. Specifically, we devise a Residual Token Selective Group (RTSG) that grasps the most crucial tokens by dynamically selecting the top-k keys, ranked by attention score, for each query. For better feature aggregation, a Multi-scale Feed-forward Layer (MFL) is developed to generate an enriched representation of multi-scale feature mixtures during the feed-forward process. Moreover, we also propose a Global Context Attention (GCA) to fully exploit the most informative components, introducing more inductive bias into the RTSG for accurate reconstruction. Multiple cascaded RTSGs form our final Top-k Token Selective Transformer (TTST), which achieves progressive representation. Extensive experiments on simulated and real-world remote sensing datasets demonstrate that TTST performs favorably against state-of-the-art CNN-based and Transformer-based methods, both qualitatively and quantitatively. In brief, TTST outperforms the state-of-the-art approach (HAT-L) by 0.14 dB PSNR on average, while requiring only 47.26% of its computational cost and 46.97% of its parameters. The code and pre-trained TTST will be available at https://github.com/XY-boy/TTST for validation.