Animating Manipulation Tasks from Human Motion Data
Captured human motion data has been used to create convincing human motion for a wide variety of tasks including locomotion, sports, and dance. However, generalizing motion capture data to new situations remains difficult, and tasks that involve manipulating objects in the virtual world are rarely addressed. This talk considers the problem of adapting motion capture data for a quasistatic manipulation task to new object geometries and friction conditions. I will argue that correct physics is critical for this type of task and will present an algorithm that generates manipulation plans for new objects by using the captured motion data to constrain the solution space to a set of plans physically similar to the original. Our planner provides guarantees on maximum task forces, allows flexibility in contact placement, and works for a wide range of object geometries and coefficients of friction. I will show results for the task of tumbling large, heavy objects. We have demonstrated these results with a humanoid robot as well as with animated characters.
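To make the planning idea in the abstract concrete, below is a minimal Python sketch of the kind of filtering it describes: candidate plans are kept only if every contact force is bounded and lies inside its Coulomb friction cone (the quasistatic feasibility test), and among feasible candidates the one closest to the captured demonstration is preferred. This is an illustrative sketch under assumed conventions, not the planner presented in the talk; the friction coefficient MU, the force bound F_MAX, the plan representation, and the contact-distance similarity metric are all hypothetical stand-ins.

```python
import numpy as np

MU = 0.5       # assumed coefficient of friction (hypothetical value)
F_MAX = 200.0  # assumed bound on allowable contact force, in newtons

def in_friction_cone(force, normal, mu=MU):
    """Check that a contact force lies inside the Coulomb friction cone."""
    normal = normal / np.linalg.norm(normal)
    f_n = float(force @ normal)                  # normal component
    f_t = np.linalg.norm(force - f_n * normal)   # tangential component
    return f_n > 0.0 and f_t <= mu * f_n

def plan_is_feasible(contact_forces, contact_normals, f_max=F_MAX):
    """Quasistatic feasibility: every contact force must be bounded in
    magnitude and must respect the friction cone at its contact."""
    return all(
        np.linalg.norm(f) <= f_max and in_friction_cone(f, n)
        for f, n in zip(contact_forces, contact_normals)
    )

def similarity_cost(plan_contacts, reference_contacts):
    """Crude distance between a candidate plan's contact placements and
    those of the captured human demonstration (hypothetical metric)."""
    return sum(np.linalg.norm(a - b)
               for a, b in zip(plan_contacts, reference_contacts))

def select_plan(candidates, reference):
    """Among physically feasible candidates, prefer the plan whose contact
    placement is closest to the captured motion -- the 'constrain the
    solution space to physically similar plans' idea from the abstract."""
    feasible = [c for c in candidates
                if plan_is_feasible(c["forces"], c["normals"])]
    if not feasible:
        return None
    return min(feasible,
               key=lambda c: similarity_cost(c["contacts"],
                                             reference["contacts"]))
```

In this sketch the feasibility test supplies the physics guarantees (bounded task forces, friction-consistent contacts), while the similarity term plays the role of the captured data, steering the search toward plans that resemble the original human motion.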