A data-driven characterisation of natural facial expressions when giving good and bad news
David M Watson, Ben B Brown and Alan Johnston
PLOS Computational Biology, 2020, vol. 16, issue 10, 1-22
Abstract:
Facial expressions carry key information about an individual’s emotional state. Research into the perception of facial emotions typically employs static images of a small number of artificially posed expressions taken under tightly controlled experimental conditions. However, such approaches risk missing potentially important facial signals and within-person variability in expressions. The extent to which patterns of emotional variance in such images resemble more natural ambient facial expressions remains unclear. Here we advance a novel protocol for eliciting natural expressions from dynamic faces, using a dimension of emotional valence as a test case. Subjects were video recorded while delivering either positive or negative news to camera, but were not instructed to deliberately or artificially pose any specific expressions or actions. A PCA-based active appearance model was used to capture the key dimensions of facial variance across frames. Linear discriminant analysis distinguished facial change determined by the emotional valence of the message, and this also generalised across subjects. By sampling along the discriminant dimension, and back-projecting into the image space, we extracted a behaviourally interpretable dimension of emotional valence. This dimension highlighted changes commonly represented in traditional face stimuli, such as variation in the internal features of the face, but also key postural changes that would typically be controlled away, such as a dipping versus raising of the head posture from negative to positive valences. These results highlight the importance of natural patterns of facial behaviour in emotional expressions, and demonstrate the efficacy of using data-driven approaches to study the representation of these cues by the perceptual system. The protocol and model described here could be readily extended to other emotional and non-emotional dimensions of facial variance.
Author summary: Faces convey critical perceptual information about a person, including cues to their identity, social traits, and emotional state. To date, most research on facial emotions has used images of a small number of standardised facial expressions taken under tightly controlled conditions. However, such approaches risk missing potentially important facial signals and within-person variability in expressions. Here, we propose a novel protocol that allows emotional expressions to be elicited under natural conditions, without requiring people to deliberately or artificially pose any specific facial expressions, by video recording people while they deliver statements of good or bad news. We use a model that captures the key dimensions of facial variability, and apply a machine learning algorithm to distinguish between the emotional expressions generated while giving good and bad news. By identifying samples along the discriminating dimension and projecting them back through the model into the image space, we can derive a behaviourally relevant dimension along which the faces appear to vary in emotional state. These results highlight the promise of data-driven techniques and the importance of employing natural images in the study of emotional facial expressions.
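The analysis pipeline described in the abstract (PCA-based appearance modelling, linear discriminant analysis across message valence, and back-projection of samples along the discriminant axis) can be illustrated with a minimal sketch. The sketch below uses scikit-learn's PCA on raw pixel vectors as a stand-in for the authors' active appearance model; the file names, component count, and sampling range are illustrative assumptions, not details taken from the paper.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hypothetical inputs: flattened, registered face frames and the valence of
# the message being delivered in each frame (0 = negative, 1 = positive).
frames = np.load("face_frames.npy")        # shape (n_frames, n_pixels)
labels = np.load("valence_labels.npy")     # shape (n_frames,)

# 1. Capture the main dimensions of facial variance across frames
#    (a stand-in for the paper's PCA-based active appearance model).
pca = PCA(n_components=50)                 # component count is an assumption
scores = pca.fit_transform(frames)

# 2. Find the direction in component space that best separates frames
#    recorded while delivering positive versus negative news.
lda = LinearDiscriminantAnalysis()
lda.fit(scores, labels)
valence_axis = lda.scalings_[:, 0]         # discriminant direction in PCA space

# 3. Sample along the discriminant direction and back-project each sample
#    into image space to visualise the emotional-valence dimension.
mean_score = scores.mean(axis=0)
for step in np.linspace(-3.0, 3.0, 7):     # sampling range is arbitrary
    sample = mean_score + step * valence_axis
    image = pca.inverse_transform(sample[np.newaxis, :])[0]
    # ...reshape `image` to the frame dimensions and display or save it

Note that the appearance model in the paper also encodes facial shape rather than raw pixel values alone, so this pixel-only sketch only approximates the back-projection step.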
Date: 2020
Downloads: (external link)
https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1008335 (text/html)
https://journals.plos.org/ploscompbiol/article/fil ... 08335&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pcbi00:1008335
DOI: 10.1371/journal.pcbi.1008335