Modelling and Investigation of Facial Expression Perception
Research Area: Neural and Computational Principles of Action and Social Processing
Researchers: Martin A. Giese; Nick Taubert; Michael Stettler
Collaborators: Aleix Martinez (Ohio State University); Peter Thier; Peter Dicke; Silvia Spadacenta; Ramona Siebert; Marius Görner
Dynamic faces are essential for communication in humans and non-human primates. However, the exact neural circuits underlying their processing remain unclear. Based on previous models of the cortical neural processes involved in social recognition (of static faces and dynamic bodies), we propose a norm-based mechanism that relies on neurons representing differences between the actual facial shape and the neutral facial pose. While popular neural network models predict a joint encoding of facial shape and dynamics, the neuromuscular control of faces evolved more slowly than facial shape, suggesting a separate encoding. To investigate these alternative hypotheses, we developed photo-realistic human and monkey heads that were animated with motion capture data from monkeys and humans. Exact control of expression dynamics was accomplished with a Bayesian machine-learning technique.
In this project we aim to develop highly controllable face stimuli to study the neural basis of face processing, and to analyse the dynamics and structure of facial movements.
In addition, we developed a biologically inspired mechanism for such transfer learning, based on norm-referenced encoding, in which patterns are encoded as difference vectors relative to a domain-specific reference vector.
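The core idea of norm-referenced encoding can be sketched as follows. This is a minimal illustrative model, not the project's actual implementation: it assumes a neuron whose response grows with the distance of a stimulus from the reference (norm) vector along the neuron's preferred direction, with all names and parameters chosen here for illustration.

```python
import numpy as np

def norm_referenced_response(x, norm, preferred, v_max=1.0):
    """Hypothetical norm-referenced neuron model.

    The stimulus x (e.g. a facial-shape feature vector) is encoded
    by its difference vector d = x - norm relative to a
    domain-specific reference vector (the neutral face). The
    response increases with the distance from the norm along the
    neuron's preferred direction, and is rectified to zero for
    stimuli deviating in the opposite direction.
    """
    d = x - norm
    r = np.linalg.norm(d)                # distance from the norm
    if r == 0.0:
        return 0.0                       # neutral face: no deviation
    cos_sim = np.dot(d, preferred) / (r * np.linalg.norm(preferred))
    return v_max * r * max(cos_sim, 0.0) # rectified direction tuning
```

Under this sketch, the neutral pose produces zero response, responses grow monotonically as an expression is exaggerated along the preferred direction, and deviations orthogonal to the preferred direction are ignored, which is the qualitative signature of norm-based tuning.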