Semi-Supervised Feature Distillation and Unsupervised Domain Adversarial Distillation for Underwater Image Enhancement

Keywords: Distillation, Artificial intelligence, Computer science, Pattern recognition (psychology), Adversarial system, Feature (linguistics), Feature extraction, Image (mathematics), Domain (mathematical analysis), Machine learning, Computer vision, Mathematics, Chemistry, Chromatography, Mathematical analysis, Linguistics, Philosophy
Authors
Nianzu Qiao, Changyin Sun, Lu Dong, Quanbo Ge
Source
Journal: IEEE Transactions on Circuits and Systems for Video Technology [Institute of Electrical and Electronics Engineers]
Volume/Issue: 34(8): 7671-7682   Cited by: 3
Identifier
DOI: 10.1109/tcsvt.2024.3378252
Abstract

Deep learning has demonstrated outstanding performance in underwater image enhancement, but such approaches often demand substantial computational resources and long training times. Knowledge distillation is a widely used model-compression technique that has delivered strong results across many fields, yet it has not previously been applied to underwater image enhancement. To address these issues, this paper introduces a knowledge distillation technique for underwater image enhancement for the first time: a semi-supervised self- and inter-feature distillation approach combined with unsupervised self-domain adversarial distillation. It specifically comprises an adaptive local self-feature distillation technique, an information-lossless multi-scale inter-feature distillation technique, and a self-domain adversarial distillation technique in LAB-RGB space. Self-feature distillation improves the student network by correcting other, lossy feature maps with the most effective feature map. Inter-feature distillation lets the student network extract as much of the teacher network's information as possible, and an information-lossless pooling approach is proposed to enable multi-scale information extraction without loss. Self-domain adversarial distillation boosts student performance through unsupervised adaptive enhancement in LAB space and unsupervised domain adversarial distillation in RGB space. Finally, an alternating self/inter knowledge distillation training strategy is proposed to maximize the respective benefits of the two distillation modes. Extensive comparative experiments show that student networks with dissimilar architectures trained with the proposed distillation technique achieve outstanding underwater image enhancement results.
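The abstract describes the method only at a high level, so the following PyTorch sketch is one illustrative reading of three of its ingredients, not the authors' code. The names (lossless_pool, inter_feature_loss, PatchCritic, critic_loss, distill_losses) are hypothetical, and pixel-unshuffle is assumed here as a stand-in for the paper's information-lossless pooling.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch only; the paper's actual losses are not reproduced here.
# "Information-lossless pooling" is approximated with pixel-unshuffle
# (space-to-depth), which shrinks spatial size while keeping every value,
# unlike average/max pooling, which discard information.

def lossless_pool(feat: torch.Tensor, scale: int) -> torch.Tensor:
    """(N, C, H, W) -> (N, C*scale^2, H/scale, W/scale) with no value discarded."""
    return F.pixel_unshuffle(feat, scale) if scale > 1 else feat

def inter_feature_loss(student_feat, teacher_feat, scales=(1, 2, 4)):
    """Multi-scale L1 matching of student features to teacher features.
    Assumes channel counts already match (a 1x1 conv adapter would align them)."""
    losses = [F.l1_loss(lossless_pool(student_feat, s),
                        lossless_pool(teacher_feat, s)) for s in scales]
    return sum(losses) / len(losses)

class PatchCritic(nn.Module):
    """Small PatchGAN-style discriminator for adversarial distillation in RGB
    space: it tries to tell student outputs from teacher outputs."""
    def __init__(self, in_ch: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def critic_loss(critic, student_out, teacher_out):
    """Discriminator-side loss: teacher outputs are 'real', student 'fake'."""
    bce = nn.BCEWithLogitsLoss()
    real = critic(teacher_out.detach())
    fake = critic(student_out.detach())
    return bce(real, torch.ones_like(real)) + bce(fake, torch.zeros_like(fake))

def distill_losses(student_out, student_feat, teacher_feat, critic, epoch: int):
    """Alternate between feature distillation and adversarial distillation
    epoch by epoch, a stand-in for the paper's alternating training strategy."""
    if epoch % 2 == 0:
        return inter_feature_loss(student_feat, teacher_feat)
    pred = critic(student_out)
    # Generator-side loss: the student is rewarded when the critic
    # labels its output as "teacher-like".
    return nn.BCEWithLogitsLoss()(pred, torch.ones_like(pred))
```

Pixel-unshuffle is used here because, unlike max or average pooling, it is exactly invertible (pixel-shuffle recovers the input), which matches the abstract's claim of multi-scale information extraction without loss; the paper may well use a different operator.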