Perception
Underwater
Environmentally friendly
Image (mathematics)
Computer science
Generative grammar
Artificial intelligence
Psychology
Neuroscience
Biology
Geology
Ecology
Oceanography
Authors
Weiming Li,Xuelong Wu,Shuaishuai Fan,Songjie Wei,Glyn Gowing
Identifier
DOI: 10.1109/tnnls.2025.3539841
Abstract
The key requirement for underwater image enhancement (UIE) is to overcome the unpredictable color degradation caused by the underwater environment and light attenuation while addressing issues such as color distortion, reduced contrast, and blurring. However, most existing unsupervised methods fail to solve these problems effectively, leaving a visible gap between their metric-optimal qualitative results and undegraded images. In this work, we propose an implicit neural-guided cyclic generative model for UIE, whose bidirectional mapping structure tackles this ill-posed problem by bridging the gap between the metric-favorable and the perception-friendly versions of an image. A multiband-aware implicit neural normalization effectively alleviates the degradation distribution. The U-shaped generator simulates the human visual attention mechanism, aggregating global coarse-grained and local fine-grained features and enhancing texture and edge features under the guidance of shallow semantics. The discriminator ensures perception-friendly visual results through a dual-branch structure that judges appearance and color separately. Extensive experiments and ablation analyses on full-reference and no-reference underwater benchmarks demonstrate the superiority of the proposed method: it restores degraded images in most underwater scenes with good generalization and robustness. The code is available at https://github.com/SUIEDDM/INGC-GAN.
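The abstract describes a cyclic (bidirectional) pair of generators, a U-shaped generator, and a dual-branch appearance/color discriminator trained with adversarial and cycle-consistency objectives. Below is a minimal PyTorch sketch of that overall wiring only; the module names, channel widths, luminance-based appearance proxy, and loss weighting are illustrative assumptions, not details from the paper, and components such as the multiband-aware implicit neural normalization are not reproduced. The authors' actual implementation is at https://github.com/SUIEDDM/INGC-GAN.

```python
# Minimal sketch of the cyclic-GAN wiring described in the abstract.
# Not the authors' code; channel sizes, the luminance proxy, and the 10.0
# cycle-loss weight are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    """3x3 conv + instance norm + LeakyReLU, a common image-to-image building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1),
        nn.InstanceNorm2d(out_ch),
        nn.LeakyReLU(0.2, inplace=True),
    )


class UShapedGenerator(nn.Module):
    """Toy U-shaped encoder-decoder: coarse global context from a downsampled path,
    local fine-grained detail reinjected through a skip connection."""

    def __init__(self, ch=32):
        super().__init__()
        self.enc1 = conv_block(3, ch)
        self.enc2 = conv_block(ch, ch * 2)
        self.dec1 = conv_block(ch * 2 + ch, ch)  # concatenated skip from enc1
        self.out = nn.Conv2d(ch, 3, 3, padding=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(F.avg_pool2d(e1, 2))            # coarse-grained path
        d1 = F.interpolate(e2, scale_factor=2, mode="bilinear", align_corners=False)
        d1 = self.dec1(torch.cat([d1, e1], dim=1))     # fuse fine-grained skip features
        return torch.tanh(self.out(d1))


class DualBranchDiscriminator(nn.Module):
    """Toy dual-branch critic: one branch scores appearance (here a crude luminance
    proxy), the other scores color, mirroring the appearance/color split."""

    def __init__(self, ch=32):
        super().__init__()
        self.appearance = nn.Sequential(
            conv_block(1, ch), conv_block(ch, ch), nn.Conv2d(ch, 1, 3, padding=1)
        )
        self.color = nn.Sequential(
            conv_block(3, ch), conv_block(ch, ch), nn.Conv2d(ch, 1, 3, padding=1)
        )

    def forward(self, x):
        luminance = x.mean(dim=1, keepdim=True)            # appearance proxy (assumption)
        return self.appearance(luminance), self.color(x)   # two patch-level score maps


if __name__ == "__main__":
    # Cycle consistency: degraded -> enhanced -> re-degraded should reconstruct the input.
    g_enhance, g_degrade = UShapedGenerator(), UShapedGenerator()
    disc = DualBranchDiscriminator()
    degraded = torch.rand(1, 3, 64, 64)

    enhanced = g_enhance(degraded)
    reconstructed = g_degrade(enhanced)
    cycle_loss = F.l1_loss(reconstructed, degraded)

    app_score, col_score = disc(enhanced)
    adv_loss = F.mse_loss(app_score, torch.ones_like(app_score)) + \
               F.mse_loss(col_score, torch.ones_like(col_score))
    total = adv_loss + 10.0 * cycle_loss               # weighting for illustration only
    print(f"cycle={cycle_loss.item():.4f}  adv={adv_loss.item():.4f}  total={total.item():.4f}")
```

The cycle structure corresponds to the bidirectional mapping mentioned in the abstract (degraded-to-enhanced and back), while the two discriminator score maps supervise appearance and color separately.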