Brain activation evoked by perception of gaze shifts: the influence of context

https://doi.org/10.1016/S0028-3932(02)00146-X

Abstract

Prior studies from our laboratory [Journal of Neuroscience 18 (1998) 2188; Cognitive Neuropsychology 17 (2000) 221] have demonstrated that discrete regions of the superior temporal sulcus (STS) are activated when a subject views a face in which the eyes shift their gaze. Here we investigated the degree to which activity in the STS and other brain regions is modulated by the context of the perceived gaze shift; that is, whether the shift correctly or incorrectly acquires a visual target. Fifteen subjects participated in an event-related functional magnetic resonance imaging experiment in which they viewed an animated face that remained present throughout each run. On each of 21 trials within each run, a small checkerboard appeared and flickered at one of six locations within the character’s visual field. On “correct” trials, the character shifted its gaze towards the checkerboard after a delay of 1 or 3 s. On “incorrect” trials, the character shifted its gaze towards empty space after the same delays. On “no shift” trials, the character’s eyes did not move. Significantly larger hemodynamic responses (HDRs) were evoked by gaze shifts than by no gaze shifts, primarily in the right-hemisphere STS. The gaze-evoked HDR peaked significantly later for 3-s than for 1-s shifts. For 1-s shifts, a strong effect of context was observed, in which errors evoked an HDR of extended duration. Although this study focused upon the STS, similar effects were also observed in the intraparietal sulcus and fusiform gyrus.

Introduction

From the earliest stages of postnatal development, faces are salient to typically developing individuals [20], [37]. Faces derive their significance, in part, from the wealth of social information they provide. This information includes the bearer’s identity [9], emotional state [6], [16], intentions [4], [5], and focus of attention [41], [42]. The capacity to extract socially relevant information from faces is fundamental to normal reciprocal social interactions and interpersonal communication. Of the core internal facial features (i.e. eyes, nose, and mouth), the eyes are thought to provide the most critical information and preferentially draw a viewer’s attention [17], [42]. Adult viewers devote 70% or more of their fixations to the eyes [43], [49], [65]. This pattern of face scanning emerges as early as the second month of postnatal life [25], [45], and is disturbed in schizophrenia [53] and autism [50]. Information regarding direction of gaze is thought to be particularly important in guiding social interactions [4], [40]. Gaze can provide information concerning the mental states of others, facilitate social control, regulate turn taking, direct attention, and communicate intimacy [3], [17], [40]. Sensitivity to gaze direction emerges early in ontogeny. For example, infants detect direction of perceived gaze, and modulate their own attention accordingly [18], [32], [60], [63].

Recent neurofunctional models of the human face processing system distinguish cortical regions involved in processing invariant (i.e. those that carry information about identity) characteristics of faces from those regions involved in processing dynamic (i.e. those that facilitate communication) aspects of faces [27], [28], [46], [55]. For example, McCarthy [46] identified four nodes of the human face processing system. Two of these nodes, the lateral posterior fusiform gyrus (FFG) and anterior ventral temporal cortex, are involved, respectively, in structural encoding and face memory. A third node, centered in the superior temporal sulcus (STS), is involved in the analysis of face motion, such as eye and mouth movements. The remaining node, located in the amygdala, is involved in the analysis of facial expression.

Allison et al. [1] used the term “STS region” to refer to cortex within the STS, to adjacent cortex on the surface of the superior temporal gyrus and middle temporal gyrus (near the straight segment of the STS), and to adjoining cortex on the surface of the angular gyrus (near the ascending limb of the STS). Several sources of evidence have converged to indicate that the STS region is involved in the perception of gaze direction. This role was suggested initially by experimental studies of nonhuman primates [11], [26], [30], [51], [52], [67] and neuropsychological studies of human lesion patients [11]. More recently, functional neuroimaging and electrophysiology studies have started to enhance our knowledge of the STS region’s involvement in processing gaze direction.

Using functional magnetic resonance imaging (fMRI), Puce et al. [55] first identified a bilateral region of activation centered in the posterior STS in response to observed eye or mouth movements, but not in response to an inwardly moving radial pattern presented to control for effects related to movement per se. With event-related potential (ERP) recordings, Bentin et al. [7] demonstrated that an N170 ERP recorded from scalp electrodes overlying the STS was larger when evoked by isolated eyes than by whole faces or other face components, and Puce et al. [56] demonstrated that the N170 ERP was larger in response to the movement of eyes averting their gaze away from the viewer than to eyes returning to gaze at the observer. In a positron emission tomography (PET) study, Wicker et al. [66] identified several regions of activation in response to mutual and averted gaze including portions of the STS. Finally, using fMRI, Hoffman and Haxby [31] demonstrated that attention to gaze elicited a stronger response in STS than did attention to identity. Note that some of these studies used static stimuli that varied in direction of gaze [10], [31], [39], [66] while others used dynamic stimuli in which the eyes moved [55], [56].

Research concerning the role of the STS region in processing eye movements is fundamental to our understanding of the neurofunctional organization of the human face processing system. However, this line of inquiry is equally significant for its potential to provide information about the neuroanatomical systems underlying social perception and social cognition [8], [19]. Allison et al. [1] defined social perception as the initial stages of evaluating the intentions of others by analysis of gaze direction, body movement, and other types of biological motion, and stressed the role of the STS region in a larger social perception system involved in processing the emotional value and social significance of biological stimuli.

Baron-Cohen [4] has defined a four-component neuropsychological model of a “mindreading” system, whereby we acquire information from the face of another person during shared attention, use this information to attribute a mental state to the person, and then predict that individual’s behavior from his or her inferred mental state. In Baron-Cohen’s model, the “intentionality detector” (ID) detects self-propelled moving stimuli and allows us to interpret their movement in terms of simple volitional mental states (e.g. goals and desires). An “eye-direction detector” (EDD) perceives the presence of eyes and the direction of their gaze and attributes the mental state of seeing to the owner of those eyes. These two components are linked together by a “shared attention module” (SAM), which supports the identification of occasions when the self and another agent are attending to the same stimulus. Lastly, by integrating data from the three previous components, the “theory-of-mind mechanism” (ToMM) provides the ability to examine information gathered from another individual during shared attention, allows us to ascribe a mental state to that individual, and then permits us to explain or predict that individual’s behavior in terms of the inferred mental state. Information concerning the role of the STS region in processing eye movements is particularly significant for Baron-Cohen’s model, because two components of the mindreading system, the EDD and the SAM, rely on normal gaze shift perception, and the ToMM, in turn, relies upon data from these two components.

In prior functional neuroimaging studies concerned with gaze perception, the stimulus face gazed towards empty space [31], [55], [66]. Thus, it is not clear whether the identified brain regions participated merely in simple gaze detection or in a more complex analysis related to the context in which the gaze shift occurred. Here, by providing a target for the gaze shift, we investigated whether regions activated by the perception of gaze are modulated by the context of the observed gaze shift. Participants observed an animated female character as a visual target appeared within the character’s visual field at regular intervals. The character either made no gaze shift to the target, shifted gaze to the target after a 1 or 3 s delay, or shifted gaze to an empty location in space after the same delay. This allowed us to determine whether activity within the face processing system is influenced by the perceived intention or goal of the action, and whether a gaze shift toward an object produces a different pattern of activity than an identical gaze toward empty space. That is, we were interested in determining whether elements of the face processing system are sensitive to the social relevance of a biological motion: whether the action is intentional and goal-directed within the established context.
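The trial structure described above can be sketched as a simple schedule generator. This is an illustrative reconstruction only, not the authors’ actual stimulus code; the equal condition proportions and the randomization scheme are assumptions not stated in the text.

```python
import random

# Conditions from the design: gaze to the target ("correct") or to empty
# space ("incorrect") after a 1 s or 3 s delay, or no gaze shift at all.
CONDITIONS = [
    ("correct", 1), ("correct", 3),
    ("incorrect", 1), ("incorrect", 3),
    ("no_shift", None),
]

def make_run(n_trials=21, n_locations=6, seed=None):
    """Build one run: each trial pairs a condition with one of six
    checkerboard locations in the character's visual field.
    (Equal condition proportions are an assumption.)"""
    rng = random.Random(seed)
    trials = []
    for i in range(n_trials):
        condition, delay = CONDITIONS[i % len(CONDITIONS)]
        trials.append({"condition": condition,
                       "delay_s": delay,
                       "target_location": rng.randrange(n_locations)})
    rng.shuffle(trials)  # randomize trial order within the run
    return trials
```

In an event-related design of this kind, shuffling conditions within each run keeps the gaze-shift type unpredictable from trial to trial.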

In addition to investigating context-dependent activity within the STS, we also wished to examine the effects of perceived gaze in other brain regions. In a similarly designed pilot study with eight subjects, we observed, in addition to STS activity, activations related to gaze perception in the intraparietal sulcus (IPS) and FFG. However, we employed a constant 1 s delay between the appearance of the target and the gaze shift, and so it was uncertain whether the activity was related to processing the visual target or the gaze shift. We predicted that varying the delay interval between target and gaze would result in a systematic change in the latency to peak amplitude of the gaze-related hemodynamic response (HDR).
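The latency prediction can be illustrated with a toy hemodynamic model. Below is a minimal sketch, assuming a single-gamma impulse response (a common simplification; the study does not specify this form): moving the gaze event from 1 s to 3 s after target onset shifts the predicted peak of the gaze-evoked response by the same 2 s.

```python
import numpy as np
from scipy.stats import gamma

t = np.arange(0, 20, 0.01)           # time from target onset, seconds
hrf = gamma.pdf(t, a=6, scale=1.0)   # toy HRF; peaks ~5 s after its event

def predicted_peak(delay_s):
    """Peak latency (s, measured from target onset) of the gaze-evoked
    response when the gaze shift occurs `delay_s` after the target appears."""
    # Shift the HRF so it is time-locked to the gaze shift, not the target.
    response = np.interp(t - delay_s, t, hrf, left=0.0)
    return t[np.argmax(response)]

peak_1s = predicted_peak(1.0)  # ~6 s after target onset
peak_3s = predicted_peak(3.0)  # ~8 s after target onset
```

Under this model, a response locked to the visual target would peak at a fixed latency regardless of delay, whereas a gaze-locked response tracks the delay, which is the logic behind the prediction.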

Finally, we were concerned that any observed differences in activity might not result from true differences in gaze processing, but rather from differences in the way that participants viewed the stimuli. For example, participants might move their own eyes more in one condition than another, and this differential eye movement might be related to a participant’s experience with the task. We therefore conducted a parallel study outside of the scanner in which we recorded the visual scanpaths of naïve and experienced volunteers in response to the stimuli used in the current fMRI study. The point-of-regard (POR) recordings allowed us to address these two potential confounds.

Section snippets

Participants

Fifteen healthy right-handed volunteers participated in the fMRI experiment (8 male, 7 female; age range 19–30 years; mean age 24.2 years). Eight healthy right-handed volunteers (4 females; age range 22–33; mean age 27.1 years) participated in the eye movement monitoring study. Of the eight subjects in this latter study, four also participated in the fMRI experiment (2 females). All participants had normal or corrected-to-normal visual acuity. The studies were approved by the Institutional

Eye movement results

The results of a 2 (experience) × 5 (stimulus condition) mixed ANOVA confirmed that the average amount of eye movement did not differ by stimulus condition or by experience with the stimuli. The interaction between these two factors was not significant. These results indicate that differences in functional activation across stimulus conditions cannot be explained by disparities in the total amount of eye movements made by the subjects during stimulus viewing. The absence of a difference in the
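The 2 (experience) × 5 (stimulus condition) mixed ANOVA can be sketched from first principles for a balanced design. The following is an illustrative reimplementation, not the study’s analysis code; group sizes and data in the usage example are hypothetical.

```python
import numpy as np

def mixed_anova(Y):
    """Balanced mixed ANOVA. Y has shape (a groups, n subjects, b conditions):
    one between-subjects factor (e.g. experience, a=2) and one
    within-subjects factor (e.g. stimulus condition, b=5).
    Returns F ratios and degrees of freedom for the group effect,
    the condition effect, and their interaction."""
    a, n, b = Y.shape
    gm = Y.mean()
    subj = Y.mean(axis=2)        # subject means, shape (a, n)
    grp = Y.mean(axis=(1, 2))    # group means, shape (a,)
    cond = Y.mean(axis=(0, 1))   # condition means, shape (b,)
    cell = Y.mean(axis=1)        # group x condition means, shape (a, b)

    ss_grp = n * b * ((grp - gm) ** 2).sum()
    ss_subj = b * ((subj - grp[:, None]) ** 2).sum()  # error term for group
    ss_cond = a * n * ((cond - gm) ** 2).sum()
    ss_int = n * ((cell - grp[:, None] - cond[None, :] + gm) ** 2).sum()
    ss_err = ((Y - gm) ** 2).sum() - ss_grp - ss_subj - ss_cond - ss_int

    df = {"group": a - 1, "subj": a * (n - 1),
          "cond": b - 1, "int": (a - 1) * (b - 1),
          "err": a * (n - 1) * (b - 1)}
    F_group = (ss_grp / df["group"]) / (ss_subj / df["subj"])
    F_cond = (ss_cond / df["cond"]) / (ss_err / df["err"])
    F_int = (ss_int / df["int"]) / (ss_err / df["err"])
    return F_group, F_cond, F_int, df
```

Note the two error terms: the between-subjects effect is tested against subjects-within-groups variance, while the within-subjects effect and the interaction are tested against the condition-by-subject residual, which is what makes the design “mixed.”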

Discussion

The current findings confirm the results of previous studies that reported activation in discrete brain regions elicited by the perception of a gaze shift. The present research extends prior work by demonstrating that the context in which an eye movement occurs modulates activity in brain regions associated with gaze shift perception. The pattern of results observed in portions of the STS, IPS, and FFG was similar, and thus a single explanation may account for activation in all of these

Acknowledgements

We are grateful to Jeremy Goldstein, Lilly Kinross-Wright, Karen Emberger, Sarah Hart, and Ronald Viola for assistance in data acquisition and analysis and manuscript preparation. We thank Dr. Martin J. McKeown for developing the software used in aligning each subject’s anatomical images to a common space. We thank Dr. Gary Glover of Stanford University for providing source code for the spiral pulse sequence. We thank Drs. Allen Song and James Voyvodic for assistance with several aspects of

References (67)

  • T. Jellema et al.

    Neural representation for the perception of the intentionality of actions

    Brain and Cognition

    (2000)
  • M.H. Johnson et al.

    Newborns’ preferential tracking of face-like stimuli and its subsequent decline

    Cognition

    (1991)
  • S.R. Langton et al.

    Do the eyes have it? Cues to the direction of social attention

    Trends in Cognitive Sciences

    (2000)
  • M.L. Phillips et al.

    Viewing strategies for simple and chimeric faces: an investigation of perceptual bias in normals and schizophrenic patients using visual scan paths

    Brain and Cognition

    (1997)
  • G. Rizzolatti et al.

    Premotor cortex and the recognition of motor actions

    Cognitive Brain Research

    (1996)
  • C. Senior et al.

    The functional neuroanatomy of implicit-motion perception or representational momentum

    Current Biology

    (2000)
  • J.T. Voyvodic

    Real-time fMRI integrating paradigm control, physiology, behavior, and on-line statistical analysis

    NeuroImage

    (1999)
  • B. Wicker et al.

    Brain regions involved in the perception of gaze: a PET study

    NeuroImage

    (1998)
  • T. Allison et al.

    Electrophysiological studies of human face perception. I. Potentials generated in occipitotemporal cortex by face and non-face stimuli

    Cerebral Cortex

    (1999)
  • Argyle M, Cook M. Gaze and mutual gaze. Cambridge University Press: New York;...
  • Baron-Cohen S. Mindblindness: an essay on autism and theory-of-mind. MIT Press: Cambridge (MA);...
  • S. Baron-Cohen et al.

    The “Reading the Mind in the Eyes” test revised version: a study with normal adults, and adults with Asperger syndrome or high-functioning autism

    Journal of Child Psychology and Psychiatry

    (2001)
  • Bassili JN. On-line cognition in person perception. Lawrence Erlbaum: Hillsdale (NJ);...
  • L. Brothers

    The social brain: a project for integrating primate behavior and neurophysiology in a new domain

    Concepts in Neuroscience

    (1990)
  • V. Bruce et al.

    Understanding face recognition

    British Journal of Psychology

    (1986)
  • M. Corbetta et al.

    Human cortical mechanisms of visual attention during orienting and search

    Philosophical Transactions of the Royal Society of London—Series B: Biological Sciences

    (1998)
  • M. Corbetta et al.

    Superior parietal cortex activation during spatial attention shifts and visual feature conjunction

    Science

    (1995)
  • Duvernoy HM. The human brain: surface, three-dimensional sectional anatomy with MRI, and blood supply. Springer-Wien:...
  • Ekman P. Emotion in the human face. Cambridge University Press: New York;...
  • T. Farroni et al.

    Infants’ use of gaze direction to cue attention: the importance of perceived motion

    Visual Cognition

    (2000)
  • C.D. Frith et al.

    Interacting minds—a biological basis

    Science

    (1999)
  • C.C. Goren et al.

    Visual following and pattern discrimination of face-like stimuli by newborn infants

    Pediatrics

    (1975)
  • S.T. Grafton et al.

    Localization of grasp representations in humans by positron emission tomography. 2. Observation compared with imagination

    Experimental Brain Research

    (1996)