Publication
Sensitivity-optimized Rigging for Example-based Real-time Clothing Synthesis
Abstract
We present a real-time solution for generating detailed clothing animations from pre-computed example clothing drapes. Given an input pose, our method synthesizes a clothing drape by blending clothing deformations predicted from nearby examples. We show that sensitivity analysis provides an optimal way to perform this prediction and blending. Sensitivity-optimized rigging computes each example's plausible clothing deformation as a rigged mesh. The rigging weights are optimized so that the rig's linear responses agree with an equilibrium simulation under small perturbations of the example pose. This compact rigging scheme models well the global influence of the underlying body motion on clothing deformation. We also develop a sensitivity-optimized blending scheme that measures the distance between poses according to their contribution to cloth deformation. For offline sampling, we propose a greedy scheme for sampling the pose space and computing example clothing drapes. Our solution is fast and compact, and it generates physically plausible clothing animations for various kinds of clothes in real time. We demonstrate the efficiency of our solution with results generated from different cloth types and body motions.
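To make the runtime synthesis step concrete, the sketch below illustrates the general idea described in the abstract: rank the pre-computed examples by a sensitivity-weighted pose distance, extrapolate each nearby example's drape through its fitted linear response, and blend the predictions. This is an illustrative Python sketch only; all function names, the data layout, and the inverse-distance blending weights are assumptions for exposition, not the authors' implementation, which should be taken from the full paper.

```python
import numpy as np

def pose_distance(pose_a, pose_b, sensitivity_weights):
    """Sensitivity-weighted distance between two pose vectors.

    `sensitivity_weights` emphasizes pose components whose motion
    strongly affects the cloth and down-weights those that barely do
    (a stand-in for the paper's sensitivity-optimized blending metric).
    """
    diff = pose_a - pose_b
    return np.sqrt(np.sum(sensitivity_weights * diff * diff))

def synthesize_drape(query_pose, examples, sensitivity_weights, k=4):
    """Blend the k nearest example drapes to predict the cloth shape.

    Each example stores its pose, its simulated drape, and a linear
    response matrix mapping a small pose perturbation to a change in
    cloth vertex positions (fitted offline, as the rigging step in the
    abstract describes).
    """
    # Rank examples by the sensitivity-weighted pose distance.
    dists = [pose_distance(query_pose, ex["pose"], sensitivity_weights)
             for ex in examples]
    order = np.argsort(dists)[:k]

    # Inverse-distance blending weights (one plausible choice).
    w = np.array([1.0 / (dists[i] + 1e-8) for i in order])
    w /= w.sum()

    # Each rigged example extrapolates its drape toward the query pose;
    # the per-example predictions are then blended.
    drape = np.zeros_like(examples[order[0]]["drape_vertices"])
    for weight, i in zip(w, order):
        ex = examples[i]
        delta_pose = query_pose - ex["pose"]
        prediction = ex["drape_vertices"] + ex["linear_response"] @ delta_pose
        drape += weight * prediction
    return drape
```

Because each example contributes only a matrix-vector product and the blend touches a handful of nearby examples, a scheme of this shape stays compact and fast enough for real-time use, which matches the performance claims in the abstract.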