Publication
Sensitivity-optimized Rigging for Example-based Real-time Clothing Synthesis
Abstract
We present a real-time solution for generating detailed clothing animations from pre-computed example clothing drapes. Given an input pose, our method synthesizes a clothing drape by blending clothing deformations predicted from nearby examples. We show that sensitivity analysis provides an optimal way to perform this prediction and blending. Sensitivity-optimized rigging computes each example's plausible clothing deformation as a rigged mesh; the rigging weights are optimized so that its linear responses agree with an equilibrium simulation under small perturbations of the example pose. This compact rigging scheme models the global influence of the underlying body motion on clothing deformation. We also develop a sensitivity-optimized blending scheme that measures the distance between poses according to their contribution to cloth deformation. For offline sampling, we propose a greedy scheme for sampling the pose space and computing example clothing drapes. Our solution is fast and compact, and it generates physically plausible clothing animations for various kinds of clothes in real time. We demonstrate the efficiency of our solution with results generated from different cloth types and body motions.
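To make the blending idea concrete, below is a minimal sketch (not the paper's implementation) of synthesizing a drape by blending nearby precomputed examples. It assumes poses are flat parameter vectors, uses hypothetical per-parameter sensitivity weights as a stand-in for the paper's sensitivity-optimized pose distance, and blends stored vertex positions with simple inverse-distance weights rather than the rigged per-example deformations described in the abstract.

```python
import numpy as np

def sensitivity_weighted_distance(pose_a, pose_b, sensitivity):
    """Distance between two pose vectors, weighting each pose parameter by a
    (hypothetical) sensitivity value reflecting its influence on cloth shape."""
    diff = pose_a - pose_b
    return np.sqrt(np.sum(sensitivity * diff * diff))

def synthesize_drape(input_pose, example_poses, example_drapes, sensitivity, k=4):
    """Blend the k nearest example drapes with inverse-distance weights.

    example_poses:  (num_examples, num_params) array of sampled poses
    example_drapes: (num_examples, num_vertices, 3) array of cloth vertex positions
    """
    dists = np.array([sensitivity_weighted_distance(input_pose, p, sensitivity)
                      for p in example_poses])
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-8)
    weights /= weights.sum()
    # Weighted sum over the selected example drapes -> (num_vertices, 3)
    return np.tensordot(weights, example_drapes[nearest], axes=1)
```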