Dress Anyone

A Method and Dataset for 3D Garment Retargeting


Shanthika Naik1, Astitva Srivastava2, Kunwar Singh2, Varun Jampani3, Amit Raj4, Avinash Sharma1
1IITJ, India, 2IIITH, India, 3Stability AI, 4Google Research


Paper | Supplementary | Code







Abstract


3D garment retargeting for digital characters and avatars involves non-rigidly deforming a 3D garment mesh to plausibly fit a target body mesh in a different pose. Existing neural methods for garment simulation and draping assume that the 3D garment is initially fitted over the 3D body, and they generally require a canonicalized garment representation, limiting them to parametric settings. In this paper, we present a novel approach to 3D garment retargeting in non-parametric settings. We propose a novel isomap-based representation to first estimate robust correspondences between the garment and body meshes, yielding an initial coarse retargeting, which is then refined by a fast and efficient neural optimization governed by physics-based constraints. The proposed framework enables a fast inference pipeline and quick optimization for any 3D garment. We perform extensive experiments on publicly available datasets and on our new dataset of 3D clothing, and report superior quantitative and qualitative results compared to state-of-the-art methods while demonstrating new capabilities.
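
To make the refinement stage concrete, below is a minimal PyTorch sketch of a physics-constrained optimization in the spirit of the abstract; it is an illustration, not the paper's method. The strain (edge-length preservation) and body-penetration penalties, the loss weighting, and all names (refine, steps, lr) are assumptions.

import torch

def refine(garment_v, edges, rest_len, body_v, body_n, steps=200, lr=1e-3):
    # garment_v: (V, 3) garment vertices from the coarse retargeting
    # edges:     (E, 2) vertex indices of garment mesh edges
    # rest_len:  (E,)   edge lengths of the source garment (rest state)
    # body_v:    (B, 3) body vertices; body_n: (B, 3) outward body normals
    offset = torch.zeros_like(garment_v, requires_grad=True)
    opt = torch.optim.Adam([offset], lr=lr)
    for _ in range(steps):
        v = garment_v + offset
        # Strain: keep deformed edge lengths close to their rest lengths.
        d = v[edges[:, 0]] - v[edges[:, 1]]
        strain = ((d.norm(dim=1) - rest_len) ** 2).mean()
        # Collision: penalize vertices lying behind their nearest body point.
        nn = torch.cdist(v, body_v).argmin(dim=1)
        signed = ((v - body_v[nn]) * body_n[nn]).sum(dim=1)
        collision = torch.relu(-signed).mean()
        loss = strain + 10.0 * collision  # illustrative weighting
        opt.zero_grad()
        loss.backward()
        opt.step()
    return (garment_v + offset).detach()

Optimizing a per-vertex offset (rather than the vertices directly) keeps the coarse retargeting as a fixed starting point that the physics terms only perturb.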




Methodology



Architecture






Embedding Calculations
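
As a rough illustration of how per-vertex isomap embeddings can be computed on a mesh (a sketch standing in for the paper's exact formulation; mesh I/O via trimesh and the 3-D embedding dimension are assumptions), one can run classical MDS on geodesic distances along the edge graph:

import numpy as np
import trimesh
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

def isomap_embed(mesh: trimesh.Trimesh, dim: int = 3) -> np.ndarray:
    # Classical MDS on geodesic (graph) distances along the mesh edges.
    v, e = mesh.vertices, mesh.edges_unique
    w = np.linalg.norm(v[e[:, 0]] - v[e[:, 1]], axis=1)  # edge lengths
    n = len(v)
    graph = csr_matrix((np.r_[w, w],
                        (np.r_[e[:, 0], e[:, 1]], np.r_[e[:, 1], e[:, 0]])),
                       shape=(n, n))
    D = shortest_path(graph, method="D", directed=False)  # geodesic distances
    # Double-center the squared distances and take the top eigenvectors.
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    vals, vecs = np.linalg.eigh(B)
    top = np.argsort(vals)[::-1][:dim]
    return vecs[:, top] * np.sqrt(np.maximum(vals[top], 0.0))

In practice, the garment and body embeddings would still need to be brought into a shared space (e.g., aligned over a few anchor vertices) before nearest-neighbour matching in embedding space yields the coarse garment-to-body correspondences.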






Results



Loose Garments






Human Meshes






3DBiCar Characters






Layered Garments






Internet Images






Comparison



DIG





DrapeNet





Our Dataset






BibTeX


@misc{naik2024dressmeupdatasetmethod,
      title={Dress-Me-Up: A Dataset \& Method for Self-Supervised 3D Garment Retargeting},
      author={Shanthika Naik and Astitva Srivastava and Kunwar Singh and Amit Raj and Varun Jampani and Avinash Sharma},
      year={2024},
      eprint={2401.03108},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}