Artificial intelligence
Computer vision
Image texture
Computer science
Texture (cosmology)
Pattern recognition (psychology)
Image segmentation
Image resolution
Image (mathematics)
Image processing
Authors
Fan Fan, Yang Zhao, Yuan Chen, Nannan Li, Wei Jia, Ronggang Wang
Identifier
DOI:10.1109/tpami.2025.3545571
Abstract
In the image super-resolution (SR) field, recovering missing high-frequency textures has always been an important goal. However, deep SR networks based on pixel-level constraints tend to focus on stable edge details and cannot effectively restore random high-frequency textures. It was not until the emergence of the generative adversarial network (GAN) that SR models achieved realistic texture restoration, and GAN-based approaches quickly became the mainstream method for texture SR. However, GAN-based SR models still have drawbacks, such as relying on a large number of parameters and generating fake textures that are inconsistent with the ground truth. Inspired by traditional texture analysis research, this paper proposes a novel SR network based on local texture pattern estimation (LTPE), which can restore fine high-frequency texture details without a GAN. A differentiable local texture operator is first designed to extract local texture structures, and a texture enhancement branch is used to predict the high-resolution local texture distribution based on the LTPE. Then, the predicted high-resolution texture structure map can be used as a reference for the texture fusion SR branch to obtain high-quality texture reconstruction. Finally, loss and Gram loss are used jointly to optimize the network. Experimental results demonstrate that the proposed method can effectively recover high-frequency textures without using GAN structures. In addition, the restored high-frequency details are constrained by the local texture distribution, thereby reducing significant errors in texture generation. The proposed algorithm will be open-sourced.
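The abstract does not specify how the differentiable local texture operator or the Gram loss are implemented. As a rough illustration only, the PyTorch sketch below shows one plausible form of each: a sigmoid-relaxed (soft) local-binary-pattern-style descriptor over a 3x3 neighbourhood, and a standard Gram-matrix loss on feature maps. The function names, the 3x3 window, and the temperature `tau` are assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F


def soft_local_texture(x, tau=0.1):
    """Differentiable local-texture descriptor (soft LBP-style sketch, an assumption).

    For each pixel, the 8 neighbours in a 3x3 window are compared against the
    centre value, and the differences are squashed with a sigmoid. This yields
    an 8-channel soft texture code per pixel instead of a hard binary pattern,
    so gradients can flow through the operator.
    x: (N, 1, H, W) luminance tensor.
    """
    n, _, h, w = x.shape
    patches = F.unfold(x, kernel_size=3, padding=1)            # (N, 9, H*W)
    center = patches[:, 4:5, :]                                # centre of the 3x3 window
    neighbours = torch.cat([patches[:, :4, :], patches[:, 5:, :]], dim=1)
    codes = torch.sigmoid((neighbours - center) / tau)         # (N, 8, H*W)
    return codes.view(n, 8, h, w)


def gram_loss(feat_sr, feat_hr):
    """Gram-matrix loss: match second-order feature statistics of SR and HR."""
    def gram(f):
        n, c, h, w = f.shape
        f = f.view(n, c, h * w)
        return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)   # (N, C, C)
    return F.mse_loss(gram(feat_sr), gram(feat_hr))
```

As a quick check, `soft_local_texture(torch.rand(1, 1, 64, 64))` returns a (1, 8, 64, 64) soft texture map; such a map could serve as the kind of local texture structure reference described above, while `gram_loss` constrains texture statistics rather than individual pixel values.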