Shape-invariant encoding of dynamic primate facial expressions in human perception

Research Area

Neural and Computational Principles of Action and Social Processing

Description

In this project we aim to develop highly controllable face stimuli to study the neural basis of face processing and to analyze the dynamics and structure of facial movements. For this purpose, we developed a computer graphics (CG) model of a monkey head based on MRI scans. The mesh is controlled by a ribbon-like muscle structure, linked to motion capture (MoCap)-driven control points.
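
As an illustration, the following sketch shows one simple way such control-point-driven deformation can be realized: MoCap marker displacements are propagated to the full mesh through normalized radial basis function (RBF) weights. This is a minimal sketch under our own assumptions (Gaussian kernel, hypothetical function names); the actual model uses a ribbon-like muscle structure rather than plain RBF skinning.

```python
import numpy as np

def rbf_weights(control_points, vertices, sigma=0.05):
    """Gaussian RBF weights linking each mesh vertex to the control points."""
    # Pairwise squared distances between vertices (V, 3) and control points (C, 3).
    d2 = ((vertices[:, None, :] - control_points[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w / w.sum(axis=1, keepdims=True)  # normalize weights per vertex

def deform_mesh(vertices, weights, control_displacements):
    """Propagate MoCap control-point displacements (C, 3) to the full mesh (V, 3)."""
    return vertices + weights @ control_displacements
```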

We have shown that Gaussian Process Dynamical Models (GPDMs) are suitable for the generation of emotional motions, because they are able to convey very subtle style changes for the intended emotion. We use GPDMs with different emotional styles, which allows us to generate interpolated stimuli with exact control of the emotional style of the expression.
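
As a rough illustration of the style interpolation this enables, the sketch below blends two latent trajectories and decodes the blend to facial poses with a Gaussian process regressor. This is a minimal sketch, not a full GPDM: the trajectories and training poses are random placeholders, and a real GPDM learns the latent space, its dynamics, and the observation mapping jointly from the MoCap data.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

T, q, d = 100, 3, 60  # time steps, latent dimensions, pose dimensions

# Placeholder latent trajectories for two emotional styles of one expression.
rng = np.random.default_rng(0)
X_style_a = rng.standard_normal((T, q)).cumsum(axis=0) * 0.1
X_style_b = rng.standard_normal((T, q)).cumsum(axis=0) * 0.1
Y_train = rng.standard_normal((2 * T, d))  # placeholder observed poses

# GP mapping from latent space to facial poses (the observation model).
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
gp.fit(np.vstack([X_style_a, X_style_b]), Y_train)

def morph(alpha):
    """Interpolate the latent trajectories and decode a morphed motion."""
    X_interp = (1.0 - alpha) * X_style_a + alpha * X_style_b
    return gp.predict(X_interp)  # (T, d) pose sequence with blended style
```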

Psychophysics

Our studies investigated the perceptual representations of dynamic human and monkey facial expressions in human observers, exploiting photo-realistic human and monkey face avatars. The motion of the avatars was generated from motion capture data of both primate species, which was used to compute the corresponding deformation of the surface mesh of the face.

In order to realize full parametric control of motion style, we exploited a Bayesian motion morphing technique to create a continuous expression space that smoothly interpolates between human and monkey expressions. We used two human expressions and two monkey expressions as basic patterns, which represented corresponding emotional states (‘fear’ and ‘anger/threat’). Interpolating between these four prototypical motions in five equidistant steps, we generated a set of 25 facial movements that vary in five steps along two dimensions: the expression type and the species. Each generated motion pattern can be parameterized by a two-dimensional style vector (e, s), where the first component e specifies the expression type (e = 0: expression 1 (‘fear’); e = 1: expression 2 (‘anger/threat’)), and the second component s defines the species-specificity of the motion (s = 0: monkey; s = 1: human).
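
Assuming the morph can be expressed as a bilinear weighting of the four prototype motions, the style space can be parameterized as in the sketch below; the actual stimuli were generated by Bayesian motion morphing of the learned motion models, so the weighting here is only illustrative.

```python
import numpy as np

# Style grid: e (expression type) and s (species) sampled in five
# equidistant steps between 0 and 1, giving the 5 x 5 = 25 stimuli.
levels = np.linspace(0.0, 1.0, 5)

def morph_weights(e, s):
    """Bilinear weights of the four prototypes at style vector (e, s).

    Prototypes: (e=0, s=0) monkey fear,  (e=1, s=0) monkey anger/threat,
                (e=0, s=1) human fear,   (e=1, s=1) human anger/threat.
    The weights are non-negative and sum to one.
    """
    return np.array([(1 - e) * (1 - s),  # monkey 'fear'
                     e * (1 - s),        # monkey 'anger/threat'
                     (1 - e) * s,        # human 'fear'
                     e * s])             # human 'anger/threat'

stimulus_grid = [(e, s, morph_weights(e, s)) for s in levels for e in levels]
assert len(stimulus_grid) == 25
```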


Our results imply that primate facial expressions are perceptually encoded largely independently of the head shape (human vs. monkey) and of the stimulus view. In particular, this implies that the encoding is substantially independent of two-dimensional image features, which vary considerably between the view conditions, and even more between the human and the monkey avatar models.

Norm-referenced Encoding

We propose a mechanism for the recognition of dynamic expressions that is inspired by results on the norm-referenced encoding of face identity by cortical neurons in area IT. We have previously proposed a neural model that accounts for these electrophysiological results on norm-referenced encoding. Here we demonstrate that the same principles can be extended to account for the recognition of dynamic facial expressions. The idea of norm-referenced encoding is to represent the shape of faces in terms of differences relative to a reference face, which we assume to be the shape of a neutral expression.
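
A minimal sketch of such a norm-referenced neuron is given below, assuming faces are represented as feature vectors: the response grows with the distance of the face from the norm (neutral) face and is tuned to the direction of the difference vector in face space. The function name and the rectified cosine tuning are our own simplifying assumptions.

```python
import numpy as np

def norm_referenced_response(face, norm_face, preferred_direction):
    """Response of a face neuron under norm-referenced encoding.

    The face is represented by its difference from the neutral (norm)
    face; the neuron's response scales with the length of this
    difference vector, modulated by how well its direction matches the
    neuron's preferred direction in face space.
    """
    diff = face - norm_face
    r = np.linalg.norm(diff)          # distance from the norm face
    if r == 0:
        return 0.0
    u = diff / r                      # direction in face space
    cos_tuning = preferred_direction @ u
    return r * max(cos_tuning, 0.0)   # rectified; grows with distance r
```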

Interesting predictions emerge when the model is tested with stimuli of variable expression strength, generated by morphing between the prototypes and a neutral facial expression. Here, both the face neurons and the expression neurons in the norm-based model show a gradual, almost linear variation of their activation with the expression level.
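
Continuing the sketch above, the following lines illustrate why this linearity falls out of norm-referenced encoding: morphing between the neutral face and a prototype scales the difference vector linearly with the morph level, so the response of a neuron tuned to that prototype's direction scales linearly as well (the feature vectors here are random placeholders).

```python
import numpy as np  # reuses norm_referenced_response from the sketch above

rng = np.random.default_rng(0)
norm_face = rng.standard_normal(50)              # hypothetical neutral face
prototype = norm_face + rng.standard_normal(50)  # hypothetical expression prototype
direction = prototype - norm_face
direction /= np.linalg.norm(direction)           # neuron's preferred direction

for alpha in np.linspace(0.0, 1.0, 5):           # expression strength (morph level)
    morph = (1 - alpha) * norm_face + alpha * prototype
    print(alpha, norm_referenced_response(morph, norm_face, direction))
```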

Publications

Siebert, R., Stettler, M., Taubert, N., Dicke, P., Giese, M. A. & Thier, P. (2022). Encoding of dynamic facial expressions in the macaque superior temporal sulcus. Society for Neuroscience.
Taubert, N., Stettler, M., Siebert, R., Spadacenta, S., Sting, L., Dicke, P. et al. (2021). Shape-invariant encoding of dynamic primate facial expressions in human perception. eLife.
Abstract:

Dynamic facial expressions are crucial for communication in primates. Due to the difficulty to control shape and dynamics of facial expressions across species, it is unknown how species-specific facial expressions are perceptually encoded and interact with the representation of facial shape. While popular neural network models predict a joint encoding of facial shape and dynamics, the neuromuscular control of faces evolved more slowly than facial shape, suggesting a separate encoding. To investigate these alternative hypotheses, we developed photo-realistic human and monkey heads that were animated with motion capture data from monkeys and humans. Exact control of expression dynamics was accomplished by a Bayesian machine-learning technique. Consistent with our hypothesis, we found that human observers learned cross-species expressions very quickly, where face dynamics was represented largely independently of facial shape. This result supports the co-evolution of the visual processing and motor control of facial expressions, while it challenges appearance-based neural network theories of dynamic expression recognition.

Stettler, M., Taubert, N., Siebert, R., Spadacenta, S., Dicke, P., Thier, P. et al. (2021). Neural models for the (cross-species) recognition of dynamic facial expressions. Göttingen Meeting of the German Neuroscience Society 2021, Germany.
Stettler, M., Taubert, N., Azizpour, T., Siebert, R., Spadacenta, S., Dicke, P. et al. (2020). Physiologically-inspired Neural Circuits for the Recognition of Dynamic Faces. In: Artificial Neural Networks and Machine Learning – ICANN 2020, 29th International Conference on Artificial Neural Networks, Bratislava, Slovakia, September 15–18, 2020, Proceedings, Part I. Springer, Berlin, 168–179.
