Clothing
Computer science
Style (visual arts)
Artificial intelligence
Image synthesis
Preprocessor
Parsing
Process (computing)
Artificial neural network
Image (mathematics)
Computer vision
Archaeology
History
Operating system
Authors
Bo-Kyeong Kim, Geonmin Kim, Soo-Young Lee
Identifier
DOI: 10.1109/TMM.2019.2929000
Abstract
We propose an approach for digitally altering people's outfits in images. Given images of a person and a desired clothing style, our method generates a new clothing item image. The new item displays the color and pattern of the desired style while geometrically mimicking the person's original item. Through superimposition, the altered image is made to look as if the person is wearing the new item. Unlike recent work based on full-image synthesis, ours relies on segment synthesis, which benefits virtual try-on. For the synthesis process, we assume two underlying factors characterizing clothing segments: geometry and style. These two factors are disentangled via preprocessing and combined using a neural network. We explore several networks and introduce important aspects of the architecture and learning process. Our experimental results are threefold: 1) on images from fashion-parsing datasets, we demonstrate the generation of high-quality clothing segments with fine-level style control; 2) on a virtual try-on benchmark, our method shows superiority over prior synthesis methods; and 3) in transferring clothing styles, we visualize the differences between our method and neural style transfer.
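As a rough illustration of the geometry/style factorization described in the abstract, the following PyTorch sketch combines a binary segment mask (the geometry factor) with a style vector (the color/pattern factor) to synthesize an RGB clothing segment. This is not the paper's architecture: the class name SegmentGenerator, the layer sizes, the 64-dimensional style code, and the concatenation-based fusion are all hypothetical choices made for illustration.

import torch
import torch.nn as nn

class SegmentGenerator(nn.Module):
    """Toy generator: fuse a geometry mask with a style code (illustrative only)."""

    def __init__(self, style_dim=64):
        super().__init__()
        # Encode the geometry factor: a 1-channel binary segment mask.
        self.geo_enc = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Project the style factor (a color/pattern code) into feature space.
        self.style_proj = nn.Linear(style_dim, 64)
        # Decode the fused representation back to an RGB clothing segment.
        self.dec = nn.Sequential(
            nn.ConvTranspose2d(128, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, geometry_mask, style_code):
        g = self.geo_enc(geometry_mask)          # B x 64 x H/4 x W/4
        s = self.style_proj(style_code)          # B x 64
        # Broadcast the style features over the spatial grid, then concatenate.
        s = s[:, :, None, None].expand(-1, -1, g.size(2), g.size(3))
        return self.dec(torch.cat([g, s], dim=1))  # B x 3 x H x W

# Usage: synthesize a 128x128 segment from a mask and a random style code.
gen = SegmentGenerator()
mask = torch.zeros(1, 1, 128, 128)
style = torch.randn(1, 64)
segment = gen(mask, style)
print(segment.shape)  # torch.Size([1, 3, 128, 128])

In the superimposition step the abstract mentions, the synthesized segment would be pasted back onto the person image using the same mask, e.g. altered = mask * segment + (1 - mask) * person_image.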