Component-based Trajectory Models for Human Character Animation
Research Area: Biomedical and Biologically Motivated Technical Applications
Researchers: Martin A. Giese; Albert Mukovskiy
Collaborators: Lars Omlor; A. Park
The efficient parameterization of complex human movements is a core problem in modern computer animation. For the synthesis of animations with a high degree of realism, learning-based approaches have become increasingly popular. Most of these approaches approximate style-parameterized trajectories based on motion capture data. The size of the required databases depends critically on the efficiency of the chosen trajectory representation. One characteristic of an efficient representation is that it allows the synthesis of large trajectory classes from a very limited amount of motion capture data.
To develop efficient representations of motion data, we take an approach inspired by concepts from motor control in biological systems. The control of complex multi-joint movements in biological systems is likely based on the superposition of lower-dimensional components (synergies). Previous work in neuroscience shows that such components can be extracted from biological data and that they can be parameterized within the framework of dynamical systems.
Inspired by results from motor control, we solve the degrees-of-freedom problem in full-body animation with an algorithm that comprises the following steps:
We learn movement primitives from a set of motion capture data by applying a new algorithm for blind source separation that models the data as mixtures of time-delayed sources (anechoic mixtures). The extracted source signals (synergies) form the basis of a generative trajectory model.
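The generative side of this model can be sketched in a few lines: each joint trajectory is a weighted sum of delayed source signals, x_i(t) = Σ_j a_ij s_j(t − τ_ij). The sources, weights, and delays below are placeholder values standing in for the quantities that the blind source separation would learn; circular shifts approximate the delays for simplicity.

```python
import numpy as np

# Placeholder "learned" quantities: two source signals (synergies),
# per-joint mixing weights a_ij and integer time delays tau_ij (samples).
t = np.linspace(0.0, 2.0 * np.pi, 200)
sources = np.stack([np.sin(t), np.sin(2.0 * t)])   # s_j(t)
weights = np.array([[1.0, 0.3],                    # a_ij, one row per joint
                    [0.5, -0.8]])
delays = np.array([[0, 10],                        # tau_ij in samples
                   [5, 0]])

def synthesize(weights, delays, sources):
    """Anechoic mixture: x_i(t) = sum_j a_ij * s_j(t - tau_ij)."""
    n_joints, n_sources = weights.shape
    x = np.zeros((n_joints, sources.shape[1]))
    for i in range(n_joints):
        for j in range(n_sources):
            # np.roll implements the delay as a circular shift
            x[i] += weights[i, j] * np.roll(sources[j], delays[i, j])
    return x

trajectories = synthesize(weights, delays, sources)
print(trajectories.shape)  # (2, 200): one trajectory per joint
```

Because the sources are shared across joints (and, in practice, across styles), only the low-dimensional weights and delays have to be stored per movement.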
To obtain a real-time-capable animation system, we define nonlinear dynamical systems that generate the associated trajectories online. This is done by constructing a nonlinear mapping between the attractor solutions of the dynamical systems (e.g. Van der Pol oscillators) and the source signals using Support Vector Regression (SVR).
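A minimal sketch of this step, using scikit-learn's SVR: a Van der Pol oscillator is integrated with Euler steps, and a regression is trained to map the oscillator state (x, ẋ) on the limit cycle to a source signal. The target signal here is an invented stand-in (a nonlinear function of the oscillator state), and all parameter values are assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVR

def vdp_trajectory(mu=1.0, dt=0.01, n_steps=4000):
    """Integrate a Van der Pol oscillator with explicit Euler steps."""
    x, v = 2.0, 0.0
    states = []
    for _ in range(n_steps):
        x += dt * v
        v += dt * (mu * (1.0 - x * x) * v - x)
        states.append((x, v))
    return np.array(states)

states = vdp_trajectory()[2000:]          # discard the initial transient
# Stand-in for a learned source signal, here a function of the state
target = np.tanh(2.0 * states[:, 0])

# Learn the nonlinear mapping from oscillator state to source signal;
# once trained, stepping the oscillator generates the source online.
svr = SVR(kernel="rbf", C=10.0).fit(states, target)
pred = svr.predict(states)
```

At runtime, only the cheap oscillator update and the SVR prediction are needed per frame, which is what makes the scheme real-time capable.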
The dynamical systems that correspond to different sources are coupled in order to stabilize coordinated behavior between the different synergies.
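The effect of such couplings can be illustrated with two phase oscillators and a diffusive (Kuramoto-style) coupling term; the gain and initial conditions are assumed values. Starting from different phases, the coupling pulls the two systems into a fixed phase relation.

```python
import numpy as np

def simulate(k, dt=0.001, n_steps=20000):
    """Two phase oscillators with mutual sinusoidal coupling of gain k."""
    phi = np.array([0.0, 2.0])                     # different initial phases
    omega = np.array([2.0 * np.pi, 2.0 * np.pi])   # equal natural frequencies
    for _ in range(n_steps):
        coupling = k * np.sin(phi[::-1] - phi)     # each pulled toward the other
        phi += dt * (omega + coupling)
    return phi

phi = simulate(k=5.0)
# wrapped phase difference; near zero once the oscillators lock
phase_diff = (phi[0] - phi[1] + np.pi) % (2.0 * np.pi) - np.pi
print(abs(phase_diff))
```

The same mechanism, applied between the oscillators driving different synergies, keeps the components of one movement coordinated.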
The introduction of appropriate dynamic couplings between the dynamical systems controlling individual avatars allows the realization of interactive behavior, such as following behavior or the synchronization of rhythmic movements. In addition, different styles can be animated by morphing between different emotions and movement styles using linear interpolation.
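Style morphing by linear interpolation is straightforward once the trajectories (or their source weights) are time-aligned; the two example trajectories below are placeholders for, e.g., 'neutral' and 'happy' walking.

```python
import numpy as np

t = np.linspace(0.0, 2.0 * np.pi, 100)
neutral = np.sin(t)               # placeholder joint-angle trajectory
happy = 1.4 * np.sin(t + 0.3)     # exaggerated, phase-shifted variant

def morph(a, b, alpha):
    """Blend two aligned trajectories; alpha=0 gives a, alpha=1 gives b."""
    return (1.0 - alpha) * a + alpha * b

half_happy = morph(neutral, happy, 0.5)  # intermediate style
```

Continuously varying alpha during playback lets an avatar shift gradually from one emotional style to another.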
A navigation dynamics is implemented by specifying an appropriate dynamical system for the heading direction. Curved walking is animated by blending online between examples of curved and straight walking.