Computer science
Style (visual arts)
Stylized fact
Artificial neural network
Swap (finance)
Artificial intelligence
History
Archaeology
Finance
Economics
Macroeconomics
Authors
Quan Wang, Sheng Li, Zichi Wang, Xinpeng Zhang, Guorui Feng
Identifier
DOI: 10.1109/tmm.2023.3281087
Abstract
Despite the great success of deep neural networks on style transfer tasks, the entanglement of content and style in images prevents much of the style information from being captured. To tackle this problem, a novel style disentanglement network is proposed to transfer multi-source style elements. Specifically, we design a learnable content-style separation module that efficiently extracts content and style components from images in the latent space; this differs from previous approaches, which predefine content and style layers in the network. With content and style separated, we further propose a multi-style swap module, which allows the content image to match more style elements. Additionally, by introducing an alternate training strategy for the main and auxiliary decoders together with a style disentanglement loss, the stylized results closely resemble the original artworks. Experimental results demonstrate the superiority of the proposed method over existing schemes.
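The abstract describes two feature-space operations: a learnable content-style separation module and a multi-style swap that lets one content image draw patches from several style sources. Below is a minimal, hypothetical PyTorch sketch of those two ideas. The names (`ContentStyleSeparation`, `multi_style_swap`), the 1x1-convolution split, and the patch-based nearest-neighbour swap are illustrative assumptions, not the paper's released implementation.

```python
# Hypothetical sketch: learnable content/style split plus a multi-source style swap.
# Not the authors' code; shapes and design choices are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContentStyleSeparation(nn.Module):
    """Learnable split of an encoder feature map into content and style parts."""

    def __init__(self, channels: int):
        super().__init__()
        self.to_content = nn.Conv2d(channels, channels, kernel_size=1)
        self.to_style = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, feat: torch.Tensor):
        return self.to_content(feat), self.to_style(feat)


def multi_style_swap(content: torch.Tensor, styles: list, patch: int = 3):
    """Replace each content patch with its best-matching patch drawn from all style feature maps."""
    # Collect patches from every style feature map and stack them as convolution filters.
    kernels = torch.cat(
        [F.unfold(s, patch, padding=patch // 2) for s in styles], dim=2
    )  # (1, C*patch*patch, N_total_patches)
    c = content.size(1)
    kernels = kernels.permute(0, 2, 1).reshape(-1, c, patch, patch)
    normed = kernels / (kernels.flatten(1).norm(dim=1).view(-1, 1, 1, 1) + 1e-8)
    # Cross-correlate content with normalised style patches; pick the best match per location.
    scores = F.conv2d(content, normed, padding=patch // 2)
    best = scores.argmax(dim=1, keepdim=True)
    one_hot = torch.zeros_like(scores).scatter_(1, best, 1.0)
    # Reconstruct the feature map from the selected (unnormalised) style patches,
    # averaging where neighbouring patches overlap.
    swapped = F.conv_transpose2d(one_hot, kernels, padding=patch // 2)
    overlap = F.conv_transpose2d(one_hot, torch.ones_like(kernels), padding=patch // 2)
    return swapped / overlap.clamp(min=1e-8)


if __name__ == "__main__":
    sep = ContentStyleSeparation(64)
    content_feat, _ = sep(torch.randn(1, 64, 32, 32))
    style_feats = [torch.randn(1, 64, 32, 32), torch.randn(1, 64, 32, 32)]
    out = multi_style_swap(content_feat, style_feats)
    print(out.shape)  # torch.Size([1, 64, 32, 32])
```

In this sketch the swapped feature map would then be passed to a decoder; the paper's alternate training of main and auxiliary decoders and its style disentanglement loss are not reproduced here.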