AgentDress: Realtime Clothing Synthesis for Virtual Agents using Plausible Deformations

Bibliographic details
Published in: IEEE transactions on visualization and computer graphics. - 1996. - Vol. 27 (2021), No. 11, 15 Nov., pages 4107-4118
Main author: Wu, Nannan (Author)
Other authors: Chao, Qianwen, Chen, Yanzhen, Xu, Weiwei, Liu, Chen, Manocha, Dinesh, Sun, Wenxin, Han, Yi, Yao, Xinran, Jin, Xiaogang
Format: Online article
Language: English
Published: 2021
In collection: IEEE transactions on visualization and computer graphics
Subjects: Journal Article; Research Support, Non-U.S. Gov't
Description
Abstract: We present a CPU-based real-time cloth animation method for dressing virtual humans of various shapes and poses. Our approach formulates the clothing deformation as a high-dimensional function of body shape and pose parameters. To accelerate the computation, our formulation factorizes the clothing deformation into two independent components: the deformation introduced by body pose variation (Clothing Pose Model) and the deformation from body shape variation (Clothing Shape Model). Furthermore, we sample and cluster poses spanning the entire pose space and use those clusters to efficiently calculate the anchoring points. We also introduce a sensitivity-based distance measurement both to find nearby anchoring points and to evaluate their contributions to the final animation. Given a query shape and pose of the virtual agent, we synthesize the resulting clothing deformation by blending the Taylor expansion results of nearby anchoring points. Compared to previous methods, our approach is general and can add the shape dimension to any clothing pose model. Furthermore, we can animate clothing represented with tens of thousands of vertices at 50+ FPS on a CPU. We also conduct a user evaluation and show that, compared to a real-time linear blend skinning method, our method improves a user's perception of dressed virtual agents in an immersive virtual environment (IVE).
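To make the blending step concrete, below is a minimal, hypothetical Python sketch of the idea the abstract describes: synthesizing a pose-driven clothing deformation by blending first-order Taylor expansions stored at nearby anchoring points. The inverse-distance weighting, the per-parameter sensitivity vector, and all names and array shapes are illustrative assumptions, not the paper's exact formulation; the shape-driven component (Clothing Shape Model) is omitted.

import numpy as np

# Hypothetical sketch (not the authors' code): blend first-order Taylor
# expansions stored at nearby pose anchoring points to synthesize the
# pose-driven clothing deformation for a query pose.

class AnchorPoint:
    def __init__(self, pose, cloth, jacobian):
        self.pose = pose          # (P,) pose parameters sampled at this anchor
        self.cloth = cloth        # (3V,) flattened cloth vertex positions at the anchor
        self.jacobian = jacobian  # (3V, P) derivative of cloth positions w.r.t. pose

def sensitivity_distance(query_pose, anchor, sensitivity):
    # Stand-in for the paper's sensitivity-based distance: a per-parameter
    # weighted Euclidean distance, where `sensitivity` down-weights pose
    # parameters the garment barely responds to.
    diff = query_pose - anchor.pose
    return np.sqrt(np.sum(sensitivity * diff ** 2))

def blend_anchors(query_pose, anchors, sensitivity, k=4, eps=1e-8):
    # Select the k nearest anchoring points under the sensitivity-based distance.
    dists = np.array([sensitivity_distance(query_pose, a, sensitivity) for a in anchors])
    nearest = np.argsort(dists)[:k]
    # Inverse-distance blending weights (one plausible choice), normalized to sum to 1.
    weights = 1.0 / (dists[nearest] + eps)
    weights /= weights.sum()
    # Accumulate the blended first-order Taylor expansions around each anchor.
    cloth = np.zeros_like(anchors[0].cloth)
    for w, i in zip(weights, nearest):
        a = anchors[i]
        cloth += w * (a.cloth + a.jacobian @ (query_pose - a.pose))
    return cloth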
Description: Date Completed 13.01.2022
Date Revised 13.01.2022
Published: Print-Electronic
Citation Status: MEDLINE
ISSN: 1941-0506
DOI: 10.1109/TVCG.2021.3106429