Authors
Wenhui Li,Ying Zhou,Yue Li,Dan Song,Zhiqiang Wei,An-An Liu
Identifier
DOI: 10.1109/LGRS.2024.3359229
Abstract
The U-Net and Transformer have garnered significant attention in precipitation nowcasting due to their impressive capabilities in modeling sequential information. However, their performance is still constrained by the computational complexity of the attention mechanism and by redundant information transmission between the encoding and decoding stages. To address the above problems, we propose a novel hierarchical transformer with lightweight attention (HTLA) for precipitation nowcasting, which integrates the Transformer and U-Net architectures to comprehensively explore the intrinsic characteristics of rainfall data at lower complexity. Specifically, HTLA incorporates a lightweight cross-channel self-attention and a dual feed-forward module as the fundamental components for encoding and decoding, efficiently fusing the advantages of the Transformer and U-Net. A Gaussian pooling skip-connection strategy is proposed to adaptively weight information, effectively suppressing redundant interference from the encoder to the decoder. The experimental results demonstrate the effectiveness and robustness of HTLA, which achieves improvements of 5.6% and 5.1% in CSI and HSS, respectively, with only 3.6% of the parameters of the state-of-the-art method. The code is available at https://github.com/precipitation/HTLA.
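The abstract's "lightweight" claim rests on where attention is computed: attending across channels yields a C x C attention map instead of the (H*W) x (H*W) map of spatial self-attention, which is far cheaper for high-resolution radar frames. The sketch below is a minimal NumPy illustration of this general cross-channel attention idea only; the function name, the shared (identity) projections, and all shapes are illustrative assumptions, not the paper's actual HTLA formulation.

```python
import numpy as np

def cross_channel_attention(x):
    """Illustrative sketch: self-attention over channels, not spatial positions.

    x: feature map of shape (C, H, W). Each channel, flattened to H*W values,
    is treated as one token, so the attention matrix is only C x C.
    Projections are omitted (Q = K = V = tokens) to keep the sketch minimal;
    HTLA's exact lightweight attention may differ.
    """
    c, h, w = x.shape
    tokens = x.reshape(c, h * w)                 # one token per channel
    scores = tokens @ tokens.T / np.sqrt(h * w)  # (C, C) attention logits
    scores -= scores.max(axis=-1, keepdims=True) # numerically stable softmax
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)     # rows sum to 1 over channels
    out = attn @ tokens                          # re-mix channels, (C, H*W)
    return out.reshape(c, h, w)

x = np.random.default_rng(0).normal(size=(8, 16, 16))
y = cross_channel_attention(x)
print(y.shape)  # same shape as the input
```

For an 8-channel 16x16 map, the attention matrix here is 8x8, versus 256x256 for spatial attention over the same features, which is the kind of saving the abstract's complexity argument relies on.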