Sensitivity-optimized Rigging for Example-based Real-time Clothing Synthesis
Abstract
We present a real-time solution for generating detailed clothing animations from pre-computed example clothing drapes. Given an input pose, our method synthesizes a clothing drape by blending clothing deformations predicted from nearby examples. We show that sensitivity analysis provides an optimal way to perform this prediction and blending. Sensitivity-optimized rigging represents each example's plausible clothing deformation as a rigged mesh. The rigging's weights are optimized so that its linear responses agree with an equilibrium simulation under small perturbations of the example pose. This compact rigging scheme captures the global influence of the underlying body motion on clothing deformation. We also develop a sensitivity-optimized blending scheme that measures the distance between poses according to their contribution to cloth deformation. For offline sampling, we propose a greedy scheme for sampling the pose space and computing example clothing drapes. Our solution is fast and compact, and it generates physically plausible clothing animations for various kinds of clothes in real time. We demonstrate the efficiency of our solution with results generated from different cloth types and body motions.
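To make the runtime pipeline concrete, here is a minimal Python sketch of the synthesis step as the abstract describes it: measure a sensitivity-weighted distance from the input pose to each example pose, then blend the nearest examples' rigged deformations. This is an illustrative assumption, not the authors' implementation; the sensitivity matrix `S`, the inverse-distance weighting, the k-nearest selection, and all names (`example_rigs`, `synthesize_drape`) are hypothetical stand-ins for the paper's optimized quantities.

```python
import numpy as np

def sensitivity_distance(pose_a, pose_b, S):
    """Pose distance weighted by each pose parameter's contribution
    to cloth deformation (S is an assumed sensitivity matrix)."""
    d = pose_a - pose_b
    return float(d @ S @ d)

def synthesize_drape(pose, example_poses, example_rigs, S, k=4, eps=1e-8):
    """Blend the k nearest examples' rigged clothing deformations.

    pose          : (P,) input pose parameters
    example_poses : (K, P) sampled example poses
    example_rigs  : list of K callables; each maps a pose to the
                    clothing deformation predicted by that example's rig
    """
    dists = np.array([sensitivity_distance(pose, p, S) for p in example_poses])
    nearest = np.argsort(dists)[:k]
    # Inverse-distance weights, normalized to a convex combination
    # (one plausible blending rule; the paper optimizes its own).
    w = 1.0 / (dists[nearest] + eps)
    w /= w.sum()
    return sum(wi * example_rigs[i](pose) for wi, i in zip(w, nearest))
```

In this sketch each rig plays the role of the paper's sensitivity-optimized rigged mesh: a cheap linearized predictor of the equilibrium drape near its example pose, so the runtime cost is a few rig evaluations and a weighted sum rather than a full cloth simulation.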