CDEST: Class Distinguishability-Enhanced Self-Training Method for Adopting Pre-Trained Models to Downstream Remote Sensing Image Semantic Segmentation

Topics: Computer Science, Class, Training, Segmentation, Remote Sensing, Artificial Intelligence, Image, Computer Vision, Geology, Meteorology, Geography
Authors
Ming Zhang, Xiaodong Gu, Jing Qi, Zhenshi Zhang, Huaidong Yang, Jian Xu, Chengli Peng, Haifeng Li
Source
Journal: Remote Sensing [MDPI AG]
Volume/Issue: 16(7): 1293
Identifier
DOI: 10.3390/rs16071293
Abstract

The self-supervised learning (SSL) technique, driven by massive unlabeled data, is expected to be a promising solution for semantic segmentation of remote sensing images (RSIs) with limited labeled data, revolutionizing transfer learning. Traditional ‘local-to-local’ transfer from small, local datasets to another target dataset plays an ever-shrinking role due to the diverse distribution shifts of RSIs. Instead, SSL promotes a ‘global-to-local’ transfer paradigm, in which generalized models pre-trained on arbitrarily large unlabeled datasets are fine-tuned on the target dataset to overcome data distribution shifts. However, because of the gap between the SSL tasks and the downstream task, SSL pre-trained models may contain both useful and useless features for downstream semantic segmentation. When adapting such pre-trained models to semantic segmentation, traditional supervised fine-tuning with only a small number of labeled samples may discard useful features due to overfitting. The main reason is that supervised fine-tuning maps a few training samples from the high-dimensional, sparse image space to the low-dimensional, compact semantic space defined by the downstream labels, which degrades feature distinguishability. To address these issues, we propose a class distinguishability-enhanced self-training (CDEST) method to support global-to-local transfer. First, the self-training module in CDEST introduces a semi-supervised learning mechanism that fully exploits the large amount of unlabeled data in the downstream task to increase the size and diversity of the training data, alleviating biased overfitting of the model. Second, the supervised and semi-supervised contrastive learning modules of CDEST explicitly enhance the class distinguishability of features, helping to preserve the useful features learned during pre-training while adapting to downstream tasks. We evaluate CDEST on four RSI semantic segmentation datasets, and it outperforms supervised fine-tuning as well as three semi-supervised fine-tuning methods on all four of them.
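Since only the abstract is available here, the PyTorch sketch below is a rough, hedged illustration of the two mechanisms the abstract names: a self-training module that mines confident pseudo-labels from unlabeled downstream images, and a supervised contrastive term that sharpens the class distinguishability of pixel features. All names and values (CONF_THRESH, TEMPERATURE, pseudo_label_loss, supervised_contrastive_loss, the single-model double forward pass) are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch, assuming a standard pseudo-label self-training loop and a
# pixel-level supervised contrastive loss; this is NOT the authors' CDEST code.
import torch
import torch.nn.functional as F

CONF_THRESH = 0.9   # assumed confidence threshold for accepting pseudo-labels
TEMPERATURE = 0.1   # assumed temperature for the contrastive term


def pseudo_label_loss(model, unlabeled_images):
    """Self-training step: predict on unlabeled images and train on confident pixels."""
    with torch.no_grad():
        probs = F.softmax(model(unlabeled_images), dim=1)   # (B, C, H, W)
        conf, pseudo = probs.max(dim=1)                      # both (B, H, W)
        mask = conf > CONF_THRESH                            # keep confident pixels only
    logits = model(unlabeled_images)                         # second pass, gradients flow
    pixel_loss = F.cross_entropy(logits, pseudo, reduction="none")   # (B, H, W)
    return (pixel_loss * mask).sum() / mask.sum().clamp(min=1)


def supervised_contrastive_loss(features, labels):
    """Pull same-class pixel embeddings together and push different classes apart.

    features: (N, D) L2-normalized pixel embeddings sampled from the feature map.
    labels:   (N,)   class index of each sampled pixel (ground truth or pseudo-label).
    """
    sim = features @ features.t() / TEMPERATURE                       # (N, N) similarities
    self_mask = torch.eye(len(labels), dtype=torch.bool, device=labels.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    log_prob = sim - torch.logsumexp(sim.masked_fill(self_mask, float("-inf")),
                                     dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    return -(log_prob * pos_mask).sum(dim=1).div(pos_counts).mean()
```

Applying the same contrastive term to pixels whose pseudo-labels pass the confidence threshold would correspond, in spirit, to the semi-supervised contrastive module described in the abstract; the total fine-tuning objective would then combine the standard supervised cross-entropy with these auxiliary terms under tunable weights.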