Probabilistic models for the online-synthesis of emotional and interactive full-body motion
Research Area:
Biomedical and Biologically Motivated Technical Applications
Researchers:
Martin A. Giese; Nick Taubert; Andrea Christensen; Albert Mukovskiy
Description:
Generative probabilistic models of interactive and stylized human motion are applicable in a variety of fields. On the technical side, such models are useful in computer animation, and for motion recognition and emotional feature analysis (when the generative model is inverted), thereby facilitating more human-friendly human-computer interaction. They are also applied for the generation of well-controlled stimuli for experiments in psychology and neuroscience. The interactivity of such stimuli is an important aspect: for example, it has been shown that personal involvement established by direct eye contact with an emotional second agent can alter neurophysiological responses. Furthermore, interactive and emotionally stylized stimuli might be interesting for studying changes of emotion perception and emotional interaction in specific patient groups, such as patients with autism, schizophrenia, or various affective disorders. The developed methodology makes it possible to go beyond simple passive recognition studies.
We developed a real-time capable system for the simulation of highly realistic interactive emotional body movements, i.e. the system generates movements that react to the body movements of the observer. The system is based on a hierarchical architecture ('deep learning architecture') that integrates Gaussian processes (GPs) and Gaussian Process Dynamical Models (GPDMs). Such models allow for the approximation of complex trajectories with high accuracy, while at the same time guaranteeing successful generalization from few training examples.
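To illustrate the underlying modeling idea (not the project's actual implementation), the following minimal Python sketch shows a mean-prediction GPDM: one Gaussian process models the latent dynamics, a second one maps latent states to full-body poses. The names GPDMSketch and rbf_kernel, the random stand-in data, and the fixed kernel parameters are illustrative assumptions; a real system would learn the latent trajectory and hyperparameters from motion capture data and embed the model in a hierarchical architecture.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

class GPDMSketch:
    """Minimal mean-prediction GPDM: GP latent dynamics + GP latent-to-pose map."""

    def __init__(self, X, Y, noise=1e-4):
        # X: (T, q) latent trajectory (stand-in for optimized latents), Y: (T, d) training poses
        self.X, self.Y = X, Y
        Kx = rbf_kernel(X[:-1], X[:-1]) + noise * np.eye(len(X) - 1)
        self.Ax = np.linalg.solve(Kx, X[1:])      # dynamics regression weights
        Ky = rbf_kernel(X, X) + noise * np.eye(len(X))
        self.Ay = np.linalg.solve(Ky, Y)          # observation regression weights

    def step(self, x):
        """Predict the next latent state (posterior mean of the dynamics GP)."""
        return rbf_kernel(x[None, :], self.X[:-1]) @ self.Ax

    def pose(self, x):
        """Map a latent state to a full-body pose (posterior mean of the observation GP)."""
        return rbf_kernel(x[None, :], self.X) @ self.Ay

# Usage: roll out a short motion starting from the first training latent.
T, q, d = 50, 3, 60                                   # frames, latent dim, pose dim (illustrative)
X = np.cumsum(np.random.randn(T, q) * 0.1, axis=0)    # stand-in latent trajectory
Y = np.random.randn(T, d)                             # stand-in mocap poses
model = GPDMSketch(X, Y)
x = X[0]
for _ in range(20):
    x = model.step(x)[0]
    frame = model.pose(x)[0]                          # one synthesized pose per time step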
Subprojects
Embodiment theories hypothesize that the perception of emotions from body movements involves an activation of brain structures that are involved in motor execution during social interaction. We test this hypothesis using a VR setup, exploiting the real-time synthesis of interactive emotional body movements.
Modeling the conditional dependencies induced by the coordinated movements of multiple actors/agents in an interactive setting.
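As a small illustration of what "conditional dependencies" means in a Gaussian generative model (a sketch under assumed toy data, not the project's actual multi-agent model), the standard Gaussian conditioning identity predicts one agent's pose features given the observed features of an interaction partner. The helper condition_gaussian, the feature partitioning, and the random covariance are illustrative assumptions.

import numpy as np

def condition_gaussian(mu, Sigma, obs_idx, y_obs):
    """Posterior over the unobserved components of a joint Gaussian,
    given observed components (standard Gaussian conditioning identity)."""
    lat_idx = np.setdiff1d(np.arange(len(mu)), obs_idx)
    S_oo = Sigma[np.ix_(obs_idx, obs_idx)]
    S_lo = Sigma[np.ix_(lat_idx, obs_idx)]
    S_ll = Sigma[np.ix_(lat_idx, lat_idx)]
    gain = S_lo @ np.linalg.inv(S_oo)
    mu_post = mu[lat_idx] + gain @ (y_obs - mu[obs_idx])
    Sigma_post = S_ll - gain @ S_lo.T
    return mu_post, Sigma_post

# Toy joint model over pose features of agent A (observed) and agent B (predicted).
d = 4                                   # two features per agent, illustrative only
mu = np.zeros(d)
C = np.random.randn(d, d)
Sigma = C @ C.T + np.eye(d)             # random positive-definite joint covariance
obs_idx = np.array([0, 1])              # indices of agent A's features
y_A = np.array([0.3, -0.1])             # agent A's currently observed features
mu_B, Sigma_B = condition_gaussian(mu, Sigma, obs_idx, y_A)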