Virtual trial room

Hey all
I am trying to create a virtual trial room where someone can try on clothing.
Could anyone point me to some resources?

I have been working on virtual try-on of clothing for the last three years. You can start with 2D-based virtual try-on.
0. Download a state-of-the-art method such as ACGPN, PF-AFN, VITON-HD, or CP-VTON+.

  1. Build a server that runs one of those methods and exposes it as a web service.
  2. Build a web or mobile app to call that service.
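The service wrapper can be very thin. Here is a minimal sketch using only the Python standard library; `run_tryon` is a hypothetical stand-in for the actual model inference (e.g. a CP-VTON+ forward pass), and the JSON/base64 request format is an assumption, not anything these papers prescribe.

```python
# Minimal sketch of a try-on web service (stdlib only).
# run_tryon() is a hypothetical placeholder for the real model call.
import base64
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_tryon(person_bytes, cloth_bytes):
    """Placeholder for the actual inference (e.g. CP-VTON+).
    Echoes the person image; real code returns the try-on result."""
    return person_bytes

class TryOnHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        # Images arrive base64-encoded in a JSON body: {"person": ..., "cloth": ...}
        person = base64.b64decode(payload["person"])
        cloth = base64.b64decode(payload["cloth"])
        result = run_tryon(person, cloth)
        body = json.dumps({"result": base64.b64encode(result).decode()}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To run: HTTPServer(("0.0.0.0", 8000), TryOnHandler).serve_forever()
```

In practice you would keep the model loaded in memory between requests, since loading weights per request would dominate latency.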


Two caveats:

  1. Those methods are trained on a specific dataset, so you must preprocess your inputs into the same format as the training data.
  2. They require preprocessing of the input images, such as running OpenPose to get the body keypoints, extracting the clothing mask, and running human semantic segmentation to get the customer's parsing map.
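As a concrete picture of what that preprocessing produces, here is a sketch of the input bundle a VITON-style model expects. The field layout is an assumption based on the usual dataset format; the keypoint and parsing files come from external tools (OpenPose, a human-parsing network), so only the clothing mask is simple enough to sketch here, since in-shop photos usually have a white background and thresholding often suffices.

```python
# Sketch of the per-sample preprocessing bundle for a VITON-style model.
# Field names and sizes are assumptions based on the common dataset layout.
from dataclasses import dataclass

@dataclass
class TryOnInputs:
    person_image: str    # customer photo, resized to the training size (e.g. 256x192)
    cloth_image: str     # in-shop clothing photo
    cloth_mask: str      # binary mask of the clothing item
    pose_keypoints: str  # OpenPose body-keypoint JSON
    parsing_map: str     # human semantic segmentation (parsing) labels

def make_cloth_mask(pixels, threshold=240):
    """Binary mask from a grayscale in-shop photo: 1 = clothing,
    0 = near-white background. `pixels` is a 2D list of 0..255 values."""
    return [[0 if v >= threshold else 1 for v in row] for row in pixels]
```

Real product photos need a more robust segmentation than a fixed threshold, but this is often a workable first pass.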

Then, you have a demo virtual try-on system.

To make it commercial, you have to read those papers and figure out the constraints on customer inputs, such as pose, camera viewpoint, and so on.

Or, to move up to 3D-based virtual try-on and apply it to the metaverse, you should think about optimizing the 3D reconstruction step (precalculating everything is one approach: push as much processing offline as possible). Then apply the method from 3D-MPVTON to make it real-time.
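The offline/online split above can be sketched in a few lines. `reconstruct_3d_body` is a hypothetical stand-in for the slow reconstruction step; the point is only that it runs ahead of time for known customers, so the real-time path is a dictionary lookup.

```python
# Sketch of "precompute offline, look up online" for 3D try-on.
# reconstruct_3d_body() is a hypothetical stand-in for slow 3D reconstruction.
_precomputed = {}

def reconstruct_3d_body(customer_id):
    """Stand-in for the expensive reconstruction (e.g. fitting a body mesh)."""
    return "mesh-for-" + customer_id

def precompute_offline(customer_ids):
    """Batch job run ahead of time for all known customers."""
    for cid in customer_ids:
        _precomputed[cid] = reconstruct_3d_body(cid)

def get_body_realtime(customer_id):
    """Real-time path: a cache hit is O(1); a miss falls back to the slow path."""
    if customer_id not in _precomputed:
        _precomputed[customer_id] = reconstruct_3d_body(customer_id)
    return _precomputed[customer_id]
```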

Thai Tuan.

Virtual try-on

  1. First generation:
    In the past there were physical fitting rooms: a customer came to the shop and stepped into a room equipped with cameras and sensors that measured the customer and reconstructed a 3D model of their body.
    Comment: Expensive (building a room full of cameras and sensors costs money); not user-friendly (the customer must visit the shop at least once); high accuracy in reconstructing the 3D human body.

  2. 2D-based virtual try-on: The second generation
    VITON, CP-VTON, and CP-VTON+ are 2D-based methods with two steps. First, the in-shop clothing is warped to fit the customer's body. Second, the warped clothing and the customer photo are blended to produce the final try-on image.
    Comment: Fast; limited to low-resolution input; works well only for simple customer poses and simple clothing.

  3. 2D-based virtual try-on: The third generation
    ACGPN, VTNFP, VITON-HD, and others are also 2D-based. First, a segmentation map of the final try-on result is generated; this map then guides both the warping of the in-shop clothing and the blending that produces the final try-on.
    Comment: Fast; VITON-HD can generate high-resolution output with clear texture and can handle complex body poses, though not complex clothing.

  4. 3D-based virtual try-on:
    CloTH-VTON, CloTH-VTON+, and 3D-MPVTON are 3D-based. They improve on the previous methods by deforming the warped clothing in 3D, which lets them handle complex poses.
    Comment: High complexity; different clothing types need different techniques for 3D reconstruction; offline processing is needed to reach real-time performance.
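The second-generation warp-then-blend pipeline described above can be illustrated with a toy sketch. Real systems learn the warp (a thin-plate-spline transform in CP-VTON) and learn the blending; here both are trivial stand-ins operating on 2D grayscale "images" (lists of lists), just to show how the two steps compose.

```python
# Toy sketch of the two-step 2D try-on pipeline:
# step 1 warps the in-shop clothing, step 2 blends it into the customer photo.
def warp_cloth(cloth, dx):
    """Stand-in warp: circularly shift the cloth right by dx pixels.
    (Real methods use a learned thin-plate-spline transform.)"""
    w = len(cloth[0])
    return [[row[(x - dx) % w] for x in range(w)] for row in cloth]

def blend(person, warped_cloth, mask):
    """Paste warped cloth onto the person wherever mask == 1.
    (Real methods learn the composition instead of hard-masking.)"""
    return [
        [warped_cloth[y][x] if mask[y][x] else person[y][x]
         for x in range(len(person[0]))]
        for y in range(len(person))
    ]
```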

Thai Tuan.

Hello @Shubham_Biswas , welcome to the DeepAR forum! :wave:

We are currently working on full-body tracking; when this is ready, you will be able to create try-on experiences for clothing and accessories (coats, dresses, bags).

Right now, you can create try-on experiences relating to the head & face, like glasses, hats, earrings, makeup and even hair transplants!

We recently released foot-tracking support, so you can also build shoe try-on experiences with DeepAR.

We’re also working on wrist tracking, for watch & jewelry try-on.

We’ll keep you updated in this post as more things become ready to use. :smiley:


Just waiting for it.