Fuse (electrical)
Computer science
Hyperspectral imaging
Synthetic aperture radar
Data mining
Block (permutation group theory)
Artificial intelligence
Feature extraction
Range (aeronautics)
Feature (linguistics)
Sensor fusion
Pattern recognition (psychology)
Philosophy
Materials science
Composite material
Geometry
Engineering
Electrical engineering
Linguistics
Mathematics
Authors
Junjie Wang, Wei Li, Yunhao Gao, Mengmeng Zhang, Ran Tao, Qian Du
Identifier
DOI: 10.1109/tnnls.2022.3171572
Abstract
Due to the limitations of single-source data, joint classification using multisource remote sensing data has received increasing attention. However, existing methods still have shortcomings in feature extraction from single-source data and feature fusion between multisource data. In this article, a method based on multiscale interactive information extraction (MIFNet) for hyperspectral and synthetic aperture radar (SAR) image classification is proposed. First, a multiscale interactive information extraction (MIIE) block is designed to extract meaningful multiscale information. Compared with traditional multiscale models, it not only obtains richer scale information but also reduces the model parameters and lowers the network complexity. Furthermore, a global dependence fusion module (GDFM) is developed to fuse features from multisource data; it implements cross attention between multisource data from a global perspective and captures long-range dependence. Extensive experiments on three datasets demonstrate the superiority of the proposed method and the necessity of each module for accuracy improvement.
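The GDFM described above fuses multisource features via cross attention, letting each modality's queries attend over the other modality's features to capture long-range dependence. A minimal NumPy sketch of generic scaled dot-product cross attention illustrates the idea; this is not the authors' exact module, and all function and variable names (`cross_attention`, `hsi`, `sar`, the token counts and feature dimension) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, key_value_feats):
    """Scaled dot-product cross attention: queries from one modality
    attend over all features of the other modality, so every output
    token aggregates information from the full other-source sequence
    (a global, long-range dependence)."""
    d = query_feats.shape[-1]
    scores = query_feats @ key_value_feats.T / np.sqrt(d)  # (Nq, Nk)
    weights = softmax(scores, axis=-1)                     # rows sum to 1
    return weights @ key_value_feats                       # (Nq, d)

# Toy example: 5 hyperspectral tokens and 7 SAR tokens, 16-dim features
rng = np.random.default_rng(0)
hsi = rng.standard_normal((5, 16))
sar = rng.standard_normal((7, 16))

fused_hsi = cross_attention(hsi, sar)  # HSI queries attend to SAR
fused_sar = cross_attention(sar, hsi)  # SAR queries attend to HSI
print(fused_hsi.shape, fused_sar.shape)
```

In a real model the queries, keys, and values would be produced by learned linear projections and the attention applied bidirectionally before a final fusion step, but the global token-to-token interaction shown here is the core mechanism.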