WP-VTON: A wrinkle-preserving virtual try-on network via clothing texture book.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
Published Date:
Sep 1, 2025
Abstract
Virtual try-on technology seeks to seamlessly integrate an image of a specified garment onto a target person, generating a synthesized image that realistically depicts the person wearing the clothing. Existing generative adversarial network (GAN)-based methods for clothing warping during generation usually rely on human pose- and body parsing-based features to guide the deformation of a flattened clothing item. However, these approaches struggle to accurately capture the spatial characteristics of the deformed clothing (e.g., wrinkles on the garment). In this work, we propose a Wrinkle-Preserving Virtual Try-On Network, named WP-VTON, to address these issues in the virtual try-on task. Specifically, in the clothing warping stage, we incorporate normal features extracted from the spatial attributes of both the clothing and the human body to learn the clothing deformation caused by warping; in the try-on generation stage, we leverage a pre-trained StyleGAN, called the clothing texture book, to optimize the try-on image, with the aim of further improving the texture-detail generation capability of WP-VTON. Experimental results on public datasets demonstrate the effectiveness of our method, which outperforms state-of-the-art GAN-based virtual try-on models.
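To make the two-stage pipeline described in the abstract more concrete, the following PyTorch sketch shows one plausible arrangement: a warping stage that takes the flat garment, a person representation, and normal maps as input and predicts a dense flow field, followed by a generation stage that fuses the warped garment with the person image. Every module design, channel count, and the simple flow-field warp are illustrative assumptions; the paper's actual networks (including the StyleGAN-based clothing texture book used for refinement) are not reproduced here.

```python
# Illustrative two-stage try-on sketch (assumed architecture, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class WarpStage(nn.Module):
    """Predicts a dense flow field from garment, person, and normal-map cues."""
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 2, 3, padding=1),  # two channels: (x, y) offsets
        )

    def forward(self, clothing, person_repr, normal_maps):
        x = torch.cat([clothing, person_repr, normal_maps], dim=1)
        flow = self.net(x).permute(0, 2, 3, 1)                # (B, H, W, 2)
        b, h, w, _ = flow.shape
        # Identity sampling grid in [-1, 1], displaced by the predicted flow.
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij")
        grid = torch.stack([xs, ys], dim=-1).unsqueeze(0).expand(b, -1, -1, -1)
        grid = grid.to(flow)
        return F.grid_sample(clothing, grid + flow, align_corners=True)

class TryOnStage(nn.Module):
    """Fuses the warped garment with the person image into a try-on result."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, warped_clothing, person_img):
        return self.net(torch.cat([warped_clothing, person_img], dim=1))

if __name__ == "__main__":
    B, H, W = 1, 256, 192
    clothing = torch.randn(B, 3, H, W)   # flat garment image
    person = torch.randn(B, 3, H, W)     # target person image
    normals = torch.randn(B, 6, H, W)    # clothing + body normal maps (assumed 3+3 channels)
    warp, tryon = WarpStage(in_ch=3 + 3 + 6), TryOnStage()
    result = tryon(warp(clothing, person, normals), person)
    print(result.shape)  # torch.Size([1, 3, 256, 192])
```

In this sketch the normal maps simply enter the warping network as extra input channels; the refinement of texture details with a pre-trained StyleGAN ("clothing texture book") would follow as an additional optimization step on the generated image.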