NeuroImage

Volume 54, Issue 2, 15 January 2011, Pages 1755–1762

Similarities and differences in perceiving threat from dynamic faces and bodies. An fMRI study

https://doi.org/10.1016/j.neuroimage.2010.08.012

Abstract

Neuroscientific research on the perception of emotional signals has mainly focused on how the brain processes threat signals from photographs of facial expressions. Much less is known about body postures or about the processing of dynamic images. We undertook a systematic comparison of the neurofunctional network dedicated to processing facial and bodily expressions. Two functional magnetic resonance imaging (fMRI) experiments investigated whether areas involved in processing social signals are activated differently by threatening signals (fear and anger) from facial or bodily expressions. The amygdala (AMG) was more active for facial than for bodily expressions. Body stimuli triggered higher activation than face stimuli in a number of areas. These were the cuneus, fusiform gyrus (FG), extrastriate body area (EBA), temporoparietal junction (TPJ), superior parietal lobule (SPL), primary somatosensory cortex (SI), as well as the thalamus. Emotion-specific effects were found in TPJ and FG for bodies and faces alike. EBA and superior temporal sulcus (STS) were more activated by threatening bodies.

Research Highlights

  • The amygdala was more active for facial than for bodily expressions.
  • Body stimuli triggered higher activation in the cuneus, fusiform gyrus, extrastriate body area, temporoparietal junction, superior parietal lobule, primary somatosensory cortex, and thalamus.
  • Emotion-specific effects were found in TPJ and FG for bodies and faces alike.
  • EBA and STS were more activated by threatening bodies.

Introduction

Perception of bodies and bodily expressions is a relatively novel topic in affective neuroscience, a field dominated so far by investigations of facial expressions. But faces and bodies are equally salient and familiar in daily life and often convey the same information about identity, emotion and gender. Therefore, it seems natural to expect that many of the same research questions arise about both (de Gelder, 2006, de Gelder et al., 2010). On the other hand, differences in the neural basis of body and face processing may be as interesting as the similarities. The goal of our study was to further our understanding of both by systematically comparing facial and bodily expressions of the same emotions.

The neural network underlying face perception is well known and includes the fusiform face area (FFA) (Kanwisher et al., 1997), the occipital face area (OFA) (Gauthier et al., 2000, Puce et al., 1996), the STS and the AMG (Haxby et al., 2000). Recent studies indicate that the neural network underlying whole body perception partly overlaps with the face network (de Gelder, 2006, de Gelder et al., 2010, Peelen et al., 2007). But so far, the few direct comparisons have used static images (Meeren et al., 2008, Van de Riet et al., 2009). These studies mainly confirm the involvement of AMG, FG, and STS in face and body perception. Furthermore, it remains unclear how activity in these regions is influenced by dynamic information. Static body pictures may imply motion, but the explicit movement information in dynamic stimuli may activate a richer and partly different network.

Recent studies with dynamic stimuli have proven useful for better understanding the respective contributions of action- and emotion-related components. A study by Grosbras and Paus (2006) showed that video clips of angry hands trigger activations that largely overlap with those reported for facial expressions in the FG. Increased responses in STS and TPJ have been reported for dynamic threatening body expressions (Grèzes et al., 2007, Pichon et al., 2008, Pichon et al., 2009). Whereas TPJ is implicated in higher level social cognitive processing (Decety and Lamm, 2007), STS has been frequently highlighted in biological motion studies (Allison et al., 2000) and shows specific activity for goal-directed actions and for configural and kinematic information from body movements (Bonda et al., 1996, Grossman and Blake, 2002, Perrett et al., 1989, Thompson et al., 2005).

There are also some currently unanswered questions about the functional role of body and face selective areas. A body-sensitive area in the extrastriate cortex (EBA) was first reported by Downing et al. (2001). Its role in processing dynamic stimuli and affective valence is not yet clear. Urgesi et al. (2007) attribute featural but not configural processing to EBA (see also Taylor et al., 2007, Hodzic et al., 2009). Previous studies using static stimuli failed to find evidence for emotion modulation (de Gelder et al., 2004, Lamm and Decety, 2008, Van de Riet et al., 2009), but studies of dynamic bodily expressions show that EBA is sensitive to affective information conveyed by the body stimulus (Grèzes et al., 2007, Peelen et al., 2007, Pichon et al., 2008). This modulation by emotion may be compatible with EBA as a feature processor, in which case one would need to investigate which specific body part conveys the affective information. Alternatively, EBA may in fact process the configuration of the stimulus. This alternative is consistent with our findings that EBA is differentially sensitive to affective information in the body when videos are used. Originally, Hadjikhani and de Gelder (2003) compared neutral bodies and fearful bodily expressions and reported sensitivity for fearful bodies in FG. Consistent with this body sensitivity of FG, a later study using neutral bodies defined a body-sensitive area in the FG labeled the fusiform body area (FBA) (Peelen and Downing, 2005). The role of the EBA and FG in emotional processing is not yet fully understood, and it is too early to claim that EBA is specifically sensitive to bodily features and less or not sensitive to the configural representation of a body. The use of dynamic emotional stimuli and a direct comparison with facial expressions is likely to provide new insights into this matter.

We used fMRI to measure participants' haemodynamic brain activity while they were watching videos showing fearful, angry or neutral facial or bodily expressions. A major goal was to clarify the sensitivity of AMG, FG, EBA, STS and TPJ to the affective valence of whole bodies and of faces. We used an ROI procedure to localize each of these regions. We predicted an increased BOLD response in these areas for facial and bodily expressions of emotion compared to neutral faces and bodies. A second goal was to clarify the emotion sensitivity of EBA. Since studies that use dynamic stimuli find emotional modulation in this area, we expected it to be especially active for threatening body expressions.
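
For illustration only, a region-of-interest analysis of this kind can be sketched with the nilearn Python library; the coordinates, sphere radius and file name below are placeholders rather than values from the study, and the sketch does not reproduce the authors' actual pipeline.

    # Illustrative sketch: mean BOLD time courses from spherical ROIs.
    # Coordinates, radius and file name are placeholders, not study values.
    from nilearn.maskers import NiftiSpheresMasker

    roi_seeds = [
        (-20, -4, -18),   # hypothetical left amygdala seed (MNI)
        (42, -50, -20),   # hypothetical right fusiform gyrus seed (MNI)
        (48, -72, 2),     # hypothetical right EBA seed (MNI)
    ]

    masker = NiftiSpheresMasker(seeds=roi_seeds, radius=6, standardize=True)

    # Result has shape (n_timepoints, n_rois); condition-wise averages of
    # these time courses could then be compared between emotional and
    # neutral videos.
    roi_timecourses = masker.fit_transform("preprocessed_bold.nii.gz")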

Section snippets

Participants

Twenty-eight participants (14 females, mean age 19.8 years, range 18–27 years; 14 males, mean age 21.6 years, range 18–32 years) took part in the experiment. Half of the participants viewed neutral and angry expressions and the other half viewed neutral and fearful expressions. Participants had no neurological or psychiatric history, were right-handed and had normal or corrected-to-normal vision. All gave informed consent. The study was performed in accordance with the Declaration of Helsinki and

Bodies vs. faces

The conjunction between bodies vs. faces, [anger + neutral (BO vs. FA)] and [fear + neutral (BO vs. FA)], yielded a large increase in activity in both hemispheres, including the cuneus, middle occipital/temporal gyrus, inferior temporal gyrus, and TPJ, extending to the paracentral lobule and the posterior cingulate gyrus. This cluster included the FBA, EBA, and STS regions that were found in the localizer experiment. Other areas included the supramarginal gyrus, superior parietal lobule, left thalamus,
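
As an illustration of such a conjunction (not the authors' actual pipeline), a minimum-statistic conjunction of two group-level contrast maps can be sketched with nilearn; the file names and the z cut-off are hypothetical.

    # Illustrative sketch: minimum-statistic conjunction of two contrasts,
    # e.g. [anger + neutral (bodies > faces)] and [fear + neutral (bodies > faces)].
    from nilearn.image import math_img, threshold_img

    z_anger = "zmap_bodies_gt_faces_anger.nii.gz"  # hypothetical file
    z_fear = "zmap_bodies_gt_faces_fear.nii.gz"    # hypothetical file

    # A voxel survives only if it exceeds the threshold in both contrast maps.
    conjunction_z = math_img("np.minimum(img1, img2)", img1=z_anger, img2=z_fear)
    conjunction_thr = threshold_img(conjunction_z, threshold=3.1)  # illustrative z cut-off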

Discussion

Our comparative study of video clips of facial and bodily expressions of threat (fear and anger) reveals both similarities and differences in the neural basis of facial and bodily expression perception. The first major finding is that the AMG is more active for facial than for bodily expressions, independently of the emotion expressed. Secondly, a number of areas show higher activation for bodies than for faces. These are the cuneus, FG, EBA, TPJ,

Acknowledgments

Research was supported by Nederlandse Organisatie voor Wetenschappelijk Onderzoek (NWO, 400.04081), Human Frontiers Science Program RGP54/2004, and European Commission (COBOL FP6-NEST-043403) grants.

References (62)

  • A. Hodzic et al. Distinct cortical networks for the detection and identification of human body. Neuroimage (2009)
  • K.L. Hoffman et al. Facial-expression and gaze-selective responses in the monkey amygdala. Curr. Biol. (2007)
  • R. Hurlemann et al. Segregating intra-amygdalar responses to dynamic facial emotion with cytoarchitectonic maximum probability maps. J. Neurosci. Methods (2008)
  • P.L. Jackson et al. Empathy examined through the neural mechanisms involved in imagining how I feel versus how you feel pain: an event-related fMRI study. Neuropsychologia (2006)
  • C.D. Kilts et al. Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage (2003)
  • A. Kling et al. Radiotelemetered activity from the amygdala during social interactions in the monkey. Exp. Neurol. (1979)
  • E.J. Lawrence et al. The role of ‘shared representations’ in social perception and empathy: an fMRI study. Neuroimage (2006)
  • S. Pichon et al. Two different faces of threat. Comparing the neural systems for recognizing fear and anger in dynamic body expressions. Neuroimage (2009)
  • W. Sato et al. Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study. Cogn. Brain Res. (2004)
  • R. Saxe et al. Making sense of another mind: the role of the right temporo-parietal junction. Neuropsychologia (2005)
  • D. Simon et al. Brain responses to dynamic facial expressions of pain. Pain (2006)
  • J.C. Thompson et al. Common and distinct brain activation to viewing dynamic sequences of face and hand movements. Neuroimage (2007)
  • S.A. Trautmann et al. Emotions in motion: dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations. Brain Res. (2009)
  • K.J. Wheaton et al. Viewing the motion of human body parts activates different regions of premotor, temporal, and parietal cortex. Neuroimage (2004)
  • E. Bonda et al. Specific involvement of human parietal systems and the amygdala in the perception of biological motion. J. Neurosci. (1996)
  • B. de Gelder. Towards the neurobiology of emotional body language. Nat. Rev. Neurosci. (2006)
  • B. de Gelder et al. Fear fosters flight: a mechanism for fear contagion when perceiving emotion expressed by a whole body. Proc. Natl Acad. Sci. USA (2004)
  • J. Decety et al. The role of the right temporoparietal junction in social interaction: how low-level computational processes contribute to meta-cognition. Neuroscientist (2007)
  • P.E. Downing et al. A cortical area selective for visual processing of the human body. Science (2001)
  • H.M. Duvernoy. The Human Brain: Surface, Three-dimensional Sectional Anatomy with MRI, and Blood Supply (1999)
  • K. Friston et al. Statistical parametric maps in functional imaging: a general linear approach. Hum. Brain Mapp. (1995)