BRAIN RESEARCH 1269 (2009) 143–157
available at www.sciencedirect.com
www.elsevier.com/locate/brainres

Research Report

Interference control during recognition of facial affect enhances the processing of expression specific properties — An event-related fMRI study

Sascha Frühholz a,b,⁎, Thorsten Fehr a,b, Manfred Herrmann a,b

a Department of Neuropsychology and Behavioral Neurobiology, Center for Cognitive Sciences, University of Bremen, Germany
b Center for Advanced Imaging (CAI, Bremen), University of Bremen, Germany

ARTICLE INFO

Article history: Accepted 6 March 2009; Available online 21 March 2009
Keywords: Emotional interference; Facial expression; Contextual distraction; Interference resolution; fMRI

ABSTRACT

Though we can almost pre-attentively categorize the valence of facial expressions, we experience emotional ambiguity when confronted with facial expressions in a context with incongruent emotional information. We simultaneously presented interfering background colors during forced-choice categorizations of negative (fear), neutral and positive (happy) expressions. Conflicting information induced strong and differential interference effects on a behavioral level, which were mirrored in comparable activations on a neuronal level. Besides a common fronto-parietal attention network which was activated during interference resolution, we found differential interference effects for facial expressions. Incongruent trials with neutral expressions induced a distinct activation pattern in ventral visual regions particularly involved in deeper analysis of both the task-relevant facial expressions (fusiform (FFA) and occipital face area (OFA)) and the task-irrelevant color (V4). Compared to neutral expressions, incongruent trials including either negative or positive expressions elicited attenuated interference effects. Unlike incongruent trials with positive facial expressions, which showed only sparse activation in frontal cortex, interference resolution during processing of negative facial expressions resulted in specific activations in regions (V3a, MT+, STS) which might be involved in processing of implicit dynamics of negative expressions. Thus, functional activations in visual processing regions might specifically be related to processing demands of different expressions.

© 2009 Elsevier B.V. All rights reserved.

⁎ Corresponding author. Department of Neuropsychology and Behavioral Neurobiology, Center for Cognitive Sciences (ZKW), University of Bremen, Cognium, Hochschulring 18, 28359 Bremen, Germany. Fax: +49 421 218 68759.
E-mail address: [email protected] (S. Frühholz).
0006-8993/$ – see front matter © 2009 Elsevier B.V. All rights reserved.
doi:10.1016/j.brainres.2009.03.017

1. Introduction

From an ecological perspective, facial expressions always appear in a broader temporal and spatial context. Contextual features are often relevant for a unique interpretation of facial expressions, but emotionally incongruent contextual features might also introduce ambiguity during recognition of facial expressions.

Several studies by de Gelder and her colleagues (Meeren et al., 2005; Righart and de Gelder, 2006) demonstrated that different components of event-related potentials during emotional face processing are modulated by contextual stimulus features. These authors, for instance, demonstrated a modulation of the visual P1 component by emotionally incongruent compounds of facial expressions and emotional body postures (Meeren et al., 2005). Additionally, a modulation of the face-



specific N170 component depending on the emotional congruence of facial expression and the content of a background scene has been reported (Righart and de Gelder, 2006). Based on the assumption that the visual P1 component putatively originates in extrastriate cortex (Di Russo et al., 2005) and the visual N170 component in temporo-occipital regions (Pizzagalli et al., 2002; Itier and Taylor, 2004), these data point to a modulation of early visual processing by emotionally conflicting material. Furthermore, examining a patient with a right-sided hemianopsia, de Gelder et al. (2005) observed an effect of congruence for fearful facial expressions presented both in the blind and the intact hemifield, with a signal increase in the amygdala and fusiform gyrus. These data support the view of a distinctive role of both the amygdala and the fusiform gyrus in face-related emotional interference processing.

The studies from de Gelder and colleagues (Meeren et al., 2005; Righart and de Gelder, 2006) suggest a modulatory effect of congruent and incongruent contextual information during categorizing facial expressions which already appears during early processing in extrastriate visual regions. However, modulatory effects of contextual emotional information are not restricted to extrastriate visual processing regions but also seem to include more dorsal brain regions involved in executive control during contextual interference. For example, by introducing task-irrelevant emotional information while maintaining the focus of attention on the non-emotional color of a word (the so-called "emotional Stroop task"), Compton et al. (2003) reported a widely distributed, mostly left-lateralized activation pattern in medial and lateral frontal, inferior parietal, superior and inferior temporal regions as well as in right fusiform gyrus. Furthermore, in a face-word Stroop-like task with conflicting emotional information in both stimulus dimensions, Etkin et al. (2006) separated regions responsible for emotional conflict detection (medial and lateral frontal lobe, amygdala) from the rostral anterior cingulate cortex (rACC). The latter is supposed to resolve emotional conflicts by means of inhibiting amygdala activation. Unfortunately, the functional analysis of this study was restricted to these brain regions, allowing no inferences about attentional control mechanisms in posterior brain regions. Egner et al. (2008) confirmed this inhibitory effect of the rACC on the amygdala during emotional conflict resolution.

Fig. 1 – Temporal cycle of the trial sequence in the second experimental run: each picture was presented for 1400 ms with an ISI of 3600 ms (jitter ± 200 ms). Subjects were asked for a fast and accurate classification of the emotional valence of negative, neutral, and positive expressions in a three-alternative forced-choice categorization. In a preceding run, subjects had to rate the valence and arousal of facial expressions, where negative expressions were systematically combined with a red, neutral expressions with a blue, and positive expressions with a yellow background color. In the second run, these face–color combinations were repeated (congruent trials; examples are shown in the figure) or systematically changed (incongruent trials).

Therefore, contextual interference during recognition of facial affect seems to involve activations in extrastriate visual, frontal as well as limbic brain regions. More specifically, there seems to be a dynamic interplay between frontal areas in relation to both limbic (Etkin et al., 2006; Egner et al., 2008) and visual brain areas (Compton et al., 2003). Especially the latter coupling seems important since it suggests different levels of conflict control. Whereas fronto-parietal regions may provide more general conflict control, modulatory effects in visual regions might reflect regional competition between the processing of different stimulus features (Desimone and Duncan, 1995) and the promotion of task-relevant stimulus features (Egner and Hirsch, 2005). Exploratory analyses in the study by Egner et al. (2008) revealed activations in posterior occipital, temporal and parietal brain regions, but these activations were rather unspecifically associated with emotional conflict detection and conflict resolution during emotional face processing. Interestingly, non-emotional conflict resolution during face perception resulted in a functional connectivity between the left prefrontal cortex and the fusiform face area (FFA; see Kanwisher et al., 1997), suggesting an enhanced processing of facial stimuli in sensory cortex during task-induced interference (see Egner and Hirsch, 2005). This was apparently not the case during emotional conflict resolution. The above-mentioned studies by de Gelder and co-workers, however, suggest specific modulatory effects of emotionally incongruent contextual features on categorizing emotional expression in extrastriate visual cortex. More specifically, valence judgments of facial expression during contextual interference should involve activations in specific visual regions which promote the processing of specific emotional expressions. In this case, functional activations in extrastriate visual regions responsible for processing featural and configural aspects of specific emotional expressions might be specifically rather than unspecifically necessary for a proper judgment of these expressions.

To examine whole brain physiological aspects of processing emotional interference we developed an experimental task design which should allow us to separate contextual interference processing from emotional face processing. More specifically, we were interested in brain regions which are involved in both general interference resolution and specifically in the processing of the task-relevant facial



expressions. Therefore, we introduced a second, low-level-feature emotional stimulus dimension to create a conflict in emotional processing during a categorical judgment of the valence of facial stimuli (see Fig. 1 for stimuli, trial sequence and task instructions). Thus, in the task design of the present study, pictures of negative (fear), neutral and positive (happiness) facial expressions were simultaneously presented in combination with specific background colors. These background colors were implicitly associated with specific emotional valences during a first experimental run. Though in everyday social interactions we do not regularly encounter facial expressions in a colored context, these social situations often contain stimulus features with specific emotional associations. Advertisement, for example, is full of different facial expressions and simple stimulus features, such as colors. Over the life course, we learn to associate colors with specific emotional meanings (Elliot and Maier, 2007), a fact that is often utilized by product placement strategies.
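The pairing logic described above can be sketched in a few lines. The valence-to-color mapping (negative with red, neutral with blue, positive with yellow) follows the first experimental run described in Fig. 1; the function and variable names are our own illustrative choices, not part of the authors' materials:

```python
# Sketch of the face-color pairing used to build congruent and
# incongruent trials (valence-color mapping as induced in run 1).
VALENCE_TO_COLOR = {"negative": "red", "neutral": "blue", "positive": "yellow"}

def make_trial(face_valence, background_color):
    """Label a face-color compound as congruent or incongruent."""
    congruent = VALENCE_TO_COLOR[face_valence] == background_color
    return {"face": face_valence, "color": background_color,
            "condition": "congruent" if congruent else "incongruent"}
```

For example, a fearful face on a red background yields a congruent trial, whereas the same face on a blue or yellow background yields an incongruent trial.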

Therefore, in the present experiment, we assumed that the emotional association as well as the low-level processing character of these background colors would induce substantial interference effects which might even bias competition in early visual processing stages. Furthermore, this activation in early visual areas might critically and differentially depend on the valence of facial expressions. We, therefore, included negative, neutral and positive expressions in the present experiment. The categorization of positive expressions, for example, strongly relies on specific facial features (Williams et al., 2001; Green et al., 2003). Attention to these features might be enhanced when emotional expressions have to be categorized during contextual interference. This enhanced processing of facial features is indicated by the above-mentioned modulatory influences on early visual ERP components (Meeren et al., 2005; Righart and de Gelder, 2006) but allows only a coarse estimation of the spatial distribution of involved visual areas. Functional imaging studies found only unspecific activations in visual areas (Etkin et al., 2006; Egner et al., 2008), but mainly focused on frontal brain regions associated with interference control mechanisms. However, we assumed that these reported frontal areas act specifically in concert with circumscribed visual processing areas.

According to the hypothesis of a distributed interplay between remote brain areas, we examined several hypotheses in the present experiment: (1) we considered two systems for regulating attentional control in contextual

Table 1 – Behavioral data.

                                    Facial expressions
                                    Negative         Neutral          Positive
Reaction time (ms)   Congruent      841 (110)        860 (125)        819 (66)
                     Incongruent    936 (92)         990 (143)        904 (56)
Error (%)            Congruent      11.00 (8.91)     10.16 (13.29)    5.61 (4.40)
                     Incongruent    20.83 (11.98)    26.66 (12.18)    10.83 (10.58)
Type-of-error (%)    Specific       14.00 (1.26)     20.33 (2.31)     7.83 (1.21)
                     Unspecific     6.83 (6.79)      6.33 (.66)       3.00 (1.23)

Mean reaction times and percent error rates for incongruent and congruent trials, and percentage of specific (primed by the background color) and unspecific (unrelated to the background color) errors for incongruent trials. Standard error of the mean (SEM) in brackets.

interference resolution during recognition of facial affect. The first system of attentional control (1a) comprises a fronto-parietal network involved in top-down attentional control by focusing or shifting selective attention to the task-relevant facial expression or inhibiting the processing of the task-irrelevant color information (Banich et al., 2000; Behrmann et al., 2004). More specifically, depending on the processing demands of emotional and neutral facial expressions, we expected a differential activation pattern associated with interference resolution during processing of facial expressions (e.g., Williams et al., 2001; Green et al., 2003; Leppänen and Hietanen, 2004). Emotional expressions have been assumed to be processed pre-attentively (Vuilleumier et al., 2001) and efficiently (Green et al., 2003; Leppänen and Hietanen, 2004). This preference in processing of emotional expressions might attenuate the impact of contextual interference compared to neutral expressions and, consequently, less interference control has to be exerted by fronto-parietal brain regions. The second system of attentional control (1b) covers ventral brain regions, such as the amygdala or the ventromedial prefrontal cortex (VMPFC), which are supposed to detect salient events which could be of task-related importance (e.g., Vuilleumier et al., 2004; Carretié et al., 2005). In particular, the task-irrelevant but emotionally associated background color should convey important emotional information which might attract attention. In case of emotionally incongruent information, the amygdala might be sensitive to emotional inconsistencies (see Whalen, 1999), but also to contextual information when facial expressions signal less clear emotional information (Kim et al., 2004). Furthermore, a bottom-up network comprising the temporo-parietal and inferior frontal cortex has been proposed to detect salient events, which is assumed to act as a circuit breaker of top-down attentional control (Corbetta and Shulman, 2002).

(2) The second hypothesis, referred to as the "local" control hypothesis, is based on electrophysiological studies showing contextual modulation of emotional face processing in extrastriate regions (Meeren et al., 2005; Righart and de Gelder, 2006). Electrophysiological studies, however, provide only poor spatial resolution to locate early effects in visual brain regions. We, therefore, used functional magnetic resonance imaging (fMRI) to investigate specific extrastriate brain regions involved in attentional control for contextual influences during face processing. There is evidence for selective attentional processes which already affect extrastriate visual


Table 2 – Regions of significant activation for the [incongruent>congruent] contrast of each emotional category, exclusively masked by the respective color contrast at the same statistical threshold.

Brain region                 Right/left   BA   Cluster size   T      x     y     z

Negative expressions
Frontal lobe
Middle frontal gyrus         L            6    43             3.89   −36   1     61
                             L            10   23             3.73   −46   49    9
                             L            11   233            5.04   −44   50    −13
                             L            47                  3.91   −50   46    −6
Inferior frontal gyrus       L            9    414            4.25   −53   17    32
                             L            9                   4.03   −46   11    27
                             L            47                  3.62   −42   40    −19
                             R            9    222            4.18   48    13    23
Parietal lobe
Superior parietal lobule     L            7                   5.11   −26   −63   51
                             L            7    5069           5.45   −28   −64   40
Precuneus                    L            7                   4.86   −12   −69   53
Temporal lobe
Inferior temporal gyrus      L            37                  3.29   −55   −55   −2
                             L            37   524            5.03   −44   −58   −4
                             R            20   1516           4.64   57    −41   −10
Fusiform gyrus               L            37                  4.24   −48   −51   −13
                             R            37                  4.55   42    −42   −18
                             R            37                  4.26   50    −63   −14
Occipital lobe
Middle occipital gyrus       L            18   36             3.50   −40   −87   1
Lingual gyrus                R            18   237            4.57   14    −85   −23
Subcortical regions
Thalamus                     L                 19             3.60   −10   −15   6
Cerebellum
Pyramis                      L                 44             3.76   −10   −83   −28
                             L                                3.38   −24   −60   −27
Uvula                        L                 54             3.42   −32   −69   −25

Neutral expressions
Frontal lobe
Medial frontal gyrus         R            8    161            4.15   2     24    45
Superior frontal gyrus       L            11   10             4.32   −24   40    −19
Middle frontal gyrus         L            9    696            6.59   −55   21    30
                             L            11   391            5.40   −42   50    −14
                             R            6    51             4.18   44    8     53
                             R            10   197            5.28   44    56    −4
                             R            11                  3.78   42    56    −13
                             R            11   15             3.85   38    40    −20
                             R            46   638            5.12   59    28    23
Inferior frontal gyrus       L            44                  3.39   −46   9     18
                             L            46                  3.68   −51   39    9
                             L            47   30             3.46   −28   23    −8
                             R            9                   4.49   42    9     25
                             R            45                  3.56   48    14    14
                             R            47                  3.85   51    42    −14
                             R            47   145            3.73   32    23    −6
                             R            47                  3.54   50    23    −15
                             R            47                  3.49   42    24    −20
Parietal lobe
Superior parietal lobule     L            7    2047           5.18   −26   −60   40
                             L            7                   4.86   −10   −73   53
                             R            7    604            4.31   32    −60   44
                             R            7                   4.16   30    −69   55
                             R            7    73             4.00   14    −71   55
Intraparietal sulcus         R            7                   3.97   32    −52   47
Inferior parietal lobule     L            40                  4.87   −48   −37   42
Temporal lobe
Fusiform gyrus               L            22                  3.69   −48   −50   −23
                             L            13                  3.38   −38   −58   −20
Occipital lobe
Lingual gyrus                L            17                  5.52   −18   −84   −1
                             L            18   6324           6.34   0     −83   1
                             R            18                  5.77   10    −82   −4
Cerebellum
Pyramis                      L                 56             4.55   −38   −66   −34
                             R                 69             3.63   12    −79   −25
                             R                 10             3.40   34    −70   −34

Positive expressions
Frontal lobe
Superior frontal gyrus       L            6    18             3.62   −22   −5    65
Middle frontal gyrus         L            6    22             3.50   −30   −1    55

All reported regions are significant at p<.001 (uncorrected) with a spatial extent threshold of k=10. Abbreviations: BA = Brodmann area; x, y, z = coordinates of peak activations in Talairach coordinates.


processing stages by promoting task-relevant instead of task-irrelevant stimulus features (Desimone and Duncan, 1995). Thus, distraction from emotional face processing should be associated with activations in extrastriate regions which indicate the processing of specific stimulus features depending on the processing of specific emotional expressions. Under task conditions of contextual interference, those stimulus features should be biased which allow for a proper valence judgment of facial expressions.

2. Results

2.1. Behavioral data

We compared incongruent and congruent trials (congruence) in each of the three valence categories of facial expressions (emotion: negative, neutral, positive). For each subject we computed mean reaction times (RTs) including only trials with correct responses within a time window of 150–2000 ms (see Table 1). Kolmogorov–Smirnov tests on RTs and error data did not reach significance (all p>.134), thus indicating no violation of the assumption of a normal distribution of the behavioral data.

A 3 (emotion) × 2 (congruence) repeated-measures ANOVA for RTs revealed main effects for both the factor emotion (F(2,38)=3.77, p=.032) and the factor congruence (F(1,19)=68.91, p<.001). Post-hoc Bonferroni-adjusted comparisons showed that RTs for incongruent trials (944 ms, SEM=18) were significantly increased compared to congruent trials (840 ms, SEM=19; p<.001). There was also a significant emotion × congruence interaction (F(1.5,28.6)=4.46, p=.018, Greenhouse–Geisser (GG) corrected), elicited by a stronger increase in RTs for incongruent trials with neutral expressions. Paired t-tests comparing RT differences between the congruent and incongruent conditions in each emotional category confirmed a strong interference effect for each contrast (negative t(19)=6.77, p<.001; neutral t(19)=6.70, p<.001; positive t(19)=6.87, p<.001).
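The paired comparisons reported above amount to testing, per emotion category, whether the per-subject incongruent-minus-congruent RT difference deviates from zero. A minimal stdlib-Python sketch of that test; the function name and the RT values are hypothetical illustrations, not the study's data:

```python
import math
from statistics import mean, stdev

def paired_t(congruent, incongruent):
    """Paired t statistic for the interference effect (incongruent - congruent)."""
    diffs = [i - c for c, i in zip(congruent, incongruent)]
    n = len(diffs)
    # t = mean difference divided by its standard error; df = n - 1.
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical per-subject mean RTs in ms (not the study's data).
congruent_rt = [820, 835, 850, 812, 870]
incongruent_rt = [910, 930, 945, 905, 980]
t = paired_t(congruent_rt, incongruent_rt)  # positive t = interference cost
```

A consistently slower incongruent condition, as in the study, yields a large positive t value.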

An almost identical pattern was obtained for percent error rates for misclassifying facial expressions (see Table 1). A repeated-measures ANOVA showed significant main effects for the factors emotion (F(2,38)=5.22, p=.010) and congruence (F(1,19)=44.38, p<.001), and a significant emotion × congruence interaction (F(2,38)=1.54, p<.001), explained by stronger effects of incongruence with neutral expressions. Paired t-tests showed interference effects for the negative (t(19)=4.51, p<.001), neutral (t(19)=7.39, p<.001), and positive emotional categories (t(19)=2.67, p=.015) when comparing the congruent and incongruent trials in each category.

To specifically address erroneous task performance during contextual interference in incongruent trials, we classified errors in these trials as specific or unspecific. This approach should demonstrate that the emotional association of the background colors, as implicitly induced in the first experimental run, is able to induce specific interference effects, thus resulting in significantly more specific than unspecific errors. Specific errors during incongruent trials are primed by the emotional association of the background colors, whereas unspecific errors are unrelated to this association of the colors. Unspecific errors only indicate a general inattentiveness to the task. In specific errors, however, the background color induced a specific erroneous response due to its emotional association, which is incongruent to the valence of the simultaneously presented facial expression. Specific and unspecific errors during incongruent trials were subjected to a 3 (emotion: negative, neutral, positive) × 2 (type-of-error: specific, unspecific) repeated-measures ANOVA. There was a significant main effect for emotion (F(2,38)=8.30, p=.001) and type-of-error (F(1,19)=76.58, p<.001) as well as a significant emotion × type-of-error interaction (F(2,38)=5.89, p=.006). Post-hoc Bonferroni-adjusted comparison revealed that specific errors (14.06%, SEM=.90) were significantly more frequent than unspecific errors (5.50%, SEM=.62; p<.001). In particular, the increase of specific compared to unspecific errors was stronger during categorizations of neutral expressions (t(19)=7.63, p<.001) than of negative (t(19)=3.56, p=.002) and positive expressions (t(19)=2.93, p=.009; see Table 1).
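The specific/unspecific distinction can be made concrete: on an incongruent trial, an erroneous response counts as specific if it matches the valence implicitly associated with the background color, and unspecific otherwise. A sketch of this classification rule; the color-to-valence mapping follows the first run, while the function name is an illustrative choice of ours:

```python
# Valence implicitly associated with each background color in run 1.
COLOR_TO_VALENCE = {"red": "negative", "blue": "neutral", "yellow": "positive"}

def classify_error(face_valence, background_color, response):
    """Classify an erroneous response on an incongruent trial."""
    assert response != face_valence, "a correct response is not an error"
    if response == COLOR_TO_VALENCE[background_color]:
        return "specific"    # primed by the color's emotional association
    return "unspecific"      # general inattentiveness to the task
```

For example, a neutral face on a red background answered "negative" is a specific error, whereas the same trial answered "positive" is an unspecific error.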

2.2. Imaging data

In a first approach, we analyzed interference effects in each emotional category and compared incongruent against congruent trials for each emotional category, masked by the respective color contrasts (see Table 2 and Fig. 2). Incongruent trials with negative expressions were associated with a significant signal increase in left middle and bilateral inferior

Fig. 2 – Functional activation for the [incongruent>congruent] contrasts for (a) negative, (b) neutral and (c) positive expressions (p<.001, uncorrected, k=10). Only sparse left frontal activations were found for incongruent trials with positive expressions. Incongruent trials with negative and neutral expressions revealed specific activation patterns in extrastriate visual regions (right FG (~FFA)) for both negative and neutral expressions, and in V4 and IOG (~OFA) for neutral expressions. Interaction analyses confirmed these specific activations. Incongruent trials with (d) negative expressions revealed distinct activations in V3a and right hemispheric MOG (~MT+) and MTG (~STS); incongruent trials including (e) neutral expressions showed specific activations in V4 and IOG (~OFA). (f) Percent signal change for right hemispheric ROIs in FG, V4, IOG and MTG (light bars indicate congruent trials, dark bars indicate incongruent trials). Functional contrasts are rendered on the human Colin atlas implemented in the CARET software (Van Essen et al., 2001). Coordinates refer to the Talairach space (Talairach and Tournoux, 1988); error bars indicate the standard error of the mean (SEM). Abbreviations: FFA, fusiform face area; FG, fusiform gyrus; IOG, inferior occipital gyrus; MOG, middle occipital gyrus; MTG, middle temporal gyrus; OFA, occipital face area; STS, superior temporal sulcus.


frontal gyrus. In posterior regions we observed activations in left superior parietal lobule (SPL) extending to the intraparietal sulcus (IPS) and inferior parietal lobule (IPL). Furthermore, we obtained peak activations in ventral (fusiform gyrus) and ventro-lateral temporal regions (inferior temporal gyrus and sulcus) as well as in bilateral brain areas in the antero-medial cuneus.

Incongruent compared to congruent trials with neutral expressions resulted in bilateral activations in middle frontal and right inferior frontal gyrus as well as in right medial and left superior frontal gyrus. Parietal activations were located in bilateral SPL and IPS, as well as left-sided activations in the IPL. Additionally, we observed activations in lateral occipital regions in left middle and right inferior occipital gyrus as well as bilateral activations in lingual and fusiform gyrus.

Contrasting incongruent with congruent trials including positive expressions resulted in left-sided activation foci in superior and middle frontal gyrus.

A conjunction analysis including all [incongruent>congruent] contrasts (see Table 3 and Fig. 3) revealed peak activations in right medial, bilateral middle and inferior frontal gyrus and sulcus, in IPS, SPL and inferior parietal lobule (IPL), as well as in bilateral fusiform (BA 19/37) and left inferior temporal gyrus.

An interaction analysis revealed specific activations for incongruent trials with negative and neutral expressions only (see Table 4 and Fig. 2). For incongruent trials with negative expressions, there was a bilateral occipital activation pattern medially located in cuneus and lingual gyrus and extending

Table 3 – Regions of conjunct activation including all [incongruent>congruent] contrasts.

Brain region                 Right/left   BA   Cluster size   T      x     y     z

Frontal lobe
Medial frontal gyrus         R            8    30             2.39   6     20    47
Middle frontal gyrus         L            9    169            2.83   −53   23    34
                             R            9    166            2.89   40    11    27
Inferior frontal sulcus      L            44   11             3.29   −36   7     29
Inferior frontal gyrus       L            9                   2.14   −50   13    27
                             L            47   36             2.31   −32   21    −6
                             R            47   68             2.49   34    23    −6
Parietal lobe
Superior parietal lobule     L            7                   2.63   −26   −63   57
                             R            7    399            2.77   36    −64   49
Intraparietal sulcus         L            40                  2.73   −44   −38   48
                             L            40   1126           2.80   −34   −45   41
                             R            7                   2.53   34    −64   35
Inferior parietal lobule     R            40                  2.74   36    −54   40
Occipital lobe
Fusiform gyrus               L            19   12             3.37   −44   −68   −8
                             R            19   46             2.53   44    −72   −12
Temporal lobe
Inferior temporal gyrus      L            37   17             3.35   −46   −57   −6
Fusiform gyrus               L            37   57             2.21   −46   −47   −16
Cerebellum
Pyramis                      R                 10             3.57   32    −68   −34

Regions of conjunct activation are significant at p<.001 (uncorrected; T>1.89, intermediate null hypothesis). Spatial extent threshold was k=10.


laterally into the right middle occipital gyrus (MOG). This occipital activation extended ventrally to the right parahippocampal gyrus (BA 19, 30) and dorsally to the right posterior cingulate cortex (BA 30). Additionally, there was a right lateral activation cluster comprising areas of middle and superior temporal gyrus. Incongruent trials including neutral expressions also revealed activation clusters in occipital cortex which were located more ventrally, comprising early visual brain

Fig. 3 – Conjunction analysis (intermediate null, p<.001, uncorrected, T>1.89, k=10). (a) Regions of conjunct activations of all incongruent compared to congruent trials. (b) Percent signal change (PSC) in regions of interest (ROIs) determined by the conjunction analysis. The figure shows bar charts for ROIs in right medial frontal gyrus (MeFG), in left inferior parietal lobule (IPL), in middle frontal gyrus (MFG) and in inferior frontal gyrus (IFG). Functional contrasts are rendered on the human Colin atlas implemented in the CARET software (Van Essen et al., 2001). Coordinates refer to the Talairach space (Talairach and Tournoux, 1988); error bars indicate the standard error of the mean (SEM).

regions of the lingual gyrus and cuneus extending ventrally into the fusiform gyrus and laterally into the MOG (BA 18) and the inferior occipital gyrus (IOG, BA 18). An additional cluster of activation was located in right middle frontal gyrus.

We computed percent signal change for different regions of interest (ROIs) and compared the signal intensity of incongruent and congruent trials in each emotional category (paired t-tests, p<.05). ROIs derived from the conjunction


Table 4 – Regions of distinct activation as revealed by an interaction analysis for incongruent trials including negative and neutral expressions.

Brain region Right/left BA Cluster size T x y z

Negative expressions
Temporal lobe
  Superior temporal gyrus      R     41    34   3.64    40  −30   16
                               R     22    10   3.62    67  −34   18
  Middle temporal gyrus        R     21   147   3.68    57  −60    1
                               R     21         3.42    50  −50    1
Occipital lobe
  Middle occipital gyrus       R     39         3.36    53  −68    9
                               R     19    10   3.47    55  −66   −8
  Lingual gyrus                L     18         4.27   −10  −75    7
  Cuneus                       L     18   553   4.43    −2  −86   19
                               L     19    33   3.96   −30  −86   37
                               R     18         4.33    14  −77   22
Limbic lobe
  Parahippocampal gyrus        R     19    15   3.93    26  −49   −1
                               R     30         3.35    20  −46    8
  Posterior cingulate cortex   R     30    16   3.75    18  −54   14
                               R     30    10   3.53     4  −58   12

Neutral expressions
Frontal lobe
  Middle frontal gyrus         R     10    14   3.65    44   56   −4
Occipital lobe
  Lingual gyrus                L     17         4.50    −8  −89    6
                               R     18         4.58    10  −82   −4
  Cuneus                       R     17  2203   4.63    16  −91    3

Regions of specific activation are significant at p<.001 (uncorrected) with a spatial extent threshold of k=10. Abbreviations: BA = Brodmann area; x, y, z = coordinates of peak activations in Talairach coordinates. Functional activations were masked by the respective color contrasts at the same statistical threshold.


analysis showed a significant signal increase for all incongruent compared to congruent trials in each emotional face category (see Fig. 3). Neutral incongruent trials led to significantly higher signal increases bilaterally in V4, bilaterally in the fusiform face area (FFA) and in the right occipital face area (OFA), whereas negative incongruent trials only showed increased signals in bilateral fusiform gyrus and in right middle temporal gyrus (∼STS; see Fig. 2). Positive incongruent trials failed to show a significant signal increase in right fusiform gyrus (∼FFA, p=.078).
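The ROI comparison described above can be sketched in a few lines: percent signal change is compared between incongruent and congruent trials across subjects with a paired t-test (p<.05). This is an illustrative sketch, not the authors' analysis code; the PSC values are simulated, and only the test logic mirrors the text.

```python
# Illustrative sketch of the ROI analysis: per-subject percent signal change
# (PSC) for incongruent vs. congruent trials, compared with a paired t-test.
# PSC values are SIMULATED; sample size matches Section 4.1 (n = 20).
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(0)
n_subjects = 20

# Simulated PSC per subject: incongruent trials slightly higher on average.
psc_congruent = rng.normal(loc=0.10, scale=0.05, size=n_subjects)
psc_incongruent = psc_congruent + rng.normal(loc=0.06, scale=0.04, size=n_subjects)

t_stat, p_value = ttest_rel(psc_incongruent, psc_congruent)
print(f"t({n_subjects - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```

In the actual study, one such test would be run per ROI (e.g., V4, FFA, OFA) and per emotional category.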

3. Discussion

The present study examined neuronal correlates of interference with emotional face processing by background colors which were associated with different emotional valences. Incongruence of emotional valence between facial expressions and background colors induced significant interference effects. Behavioral data indicate that incongruent trials resulted in increased reaction times and decreased response accuracy. The background colors induced specific erroneous responses during incongruent trials due to their emotional association, as indicated by significantly increased specific compared to unspecific errors. Thus, background colors obviously gained a specific emotional association during the first experimental run and were able to induce specific interference effects during incongruent trials in the second experimental run. Furthermore, in a series of pilot studies (see Section 4.4), iterative valence judgments additionally indicated that these colors actually acquired the emotional associations intended by the present experimental procedure: during the course of the experiment, colors were increasingly categorized according to the intended valence association.

Differential effects of interference resolution were already apparent on a behavioral level, indicated by an enhanced effect of incongruence for incongruent compared to congruent trials with neutral expressions for both RTs and response errors. These differential effects were also reflected in distinct functional correlates of interference resolution during processing of different facial expressions. In accordance with our first hypothesis (1), we found a fronto-parietal network (1a) which was commonly activated by incongruent trials irrespective of the type of facial expression (Section 3.1), but we also revealed differential effects with respect to different facial expressions (Section 3.2). The second hypothesis of an attentional system (1b) which acts as a salience detector was not confirmed by our data (Section 3.1). However, we found differential activation patterns in extrastriate visual regions depending on the valence of facial expressions (Section 3.3). These extrastriate functional activations most likely indicate the processing of stimulus- and emotion-specific features for a proper valence judgment of facial expressions and


correspond to our second hypothesis (2) about local regulatory processes in extrastriate visual brain areas.

3.1. Common brain regions during interference processing

Irrespective of the emotional expression, incongruent compared to congruent trials activated regions of a fronto-parietal network, which corresponds well to our first hypothesis of top-down attentional control. A conjunction analysis revealed frontal activations in the posterior fronto-median wall (BA 8) which might be associated with performance monitoring (Ullsperger and von Cramon, 2004). While regions in dorso-lateral prefrontal cortex (BA 9) have been discussed to be associated with task-related attentional demands (Banich et al., 2000), brain areas in the right ventro-lateral cortex have been shown to support the inhibition of prepotent responses (Aron et al., 2004). Interestingly, incongruent trials did not activate dorsal «cognitive» or rostral «affective» subdivisions of the anterior cingulate cortex (ACC; as proposed by Bush et al., 2000). Recent studies (e.g., Nachev et al., 2005) rather indicate that more superior regions in the medial frontal wall are involved in several functions originally assigned to the dorsal ACC (dACC). This evidence is corroborated by Compton et al. (2003), who did not find ACC activation and who argued for practice effects which might diminish conflict-related activations in the ACC. Though Etkin et al. (2006) reported a crucial involvement of the ventral ACC in resolving emotional conflicts, this activation was only present in the case of «high conflict resolution trials» (incongruent trials following incongruent ones). The present data analysis, however, was based on all incongruent trials independent of the nature of the preceding trial. Nevertheless, the present analysis revealed a similar region in medial frontal gyrus (BA 8) to be activated for emotional conflicts as reported by Etkin et al. (2006). Furthermore, Egner et al. (2008) reported an adjacent region in dACC responsible for emotional as well as non-emotional conflict monitoring.

Conjunct activation patterns in posterior parietal cortex represent the posterior part of the fronto-parietal attention network which directly serves the control of selective attention. Egner and Hirsch (2005) assumed that these parietal regions might regulate information flow in extrastriate visual processing streams, probably by means of amplifying the processing of the task-relevant stimulus feature (facial expression in the present study). This line of argumentation is supported by the present data indicating a significant activation in bilateral fusiform gyrus, particularly in the left hemisphere (BA 37), corresponding to the fusiform face area (FFA). Though the FFA is usually localized in the right hemisphere, there are also studies indicating a more bilateral representation in women (e.g., Proverbio et al., 2006).

We expected to find activations in ventral systems, especially in regions which signal and promote the processing of specific salient stimuli. Unlike peak activations particularly in right inferior frontal gyrus (BA 47), which might underlie the assessment (Nakamura et al., 1999) or mirroring (Lee et al., 2006) of the emotional quality of the depicted faces, we did not find activations in ventral limbic regions. More specifically, we did not find activations in the amygdala, which was shown to promote the processing of task-relevant stimulus properties

in a bottom-up manner (Vuilleumier et al., 2004), in particular when facial expressions convey emotional ambiguity (Kim et al., 2003, 2004). The amygdala, however, shows strong signal suppression during consecutive repetition of emotional faces (Ishai et al., 2004) as well as top-down suppression by ventral frontal regions during enhanced cognitive evaluation of stimuli (Nomura et al., 2004; Etkin et al., 2006; see also Compton et al., 2003). Both of these effects might have contributed to the missing amygdala activation in the present study.

3.2. Distinct frontal brain areas of interference resolution

Incongruent trials with positive facial expressions generated only sparse activations in the superior and middle frontal cortex. These regions in the vicinity of the frontal eye field (BA 6) are supposed to maintain and orient the focus of attention (Serences et al., 2005; Thompson et al., 2005). Positive facial expressions have been shown to be categorized quickly and accurately (Leppänen and Hietanen, 2004) with less demanding visual scan paths. This visual scanning of positive expressions is focused on distinguishing facial features (Williams et al., 2001), which reduces the processing demands of positive expressions and, therefore, their vulnerability to contextual interference (for the relationship of low working memory demand and decreased distractibility, see Lavie, 2005). Presumably, the lower processing demands of positive faces might shift the locus of interference control to later stages of response selection.

In contrast to incongruent trials with positive expressions, incongruent trials with negative and neutral facial expressions generated enhanced interference effects as indicated by both behavioral data and a frontal and posterior parietal activation pattern. A similar activation cluster for both negative and neutral incongruent trials in the left middle frontal gyrus (BA 9) might result from the task-related adjusting of attentional resources (Banich et al., 2000). Increased activation in this region was related to enhanced (and successful) interference control (see Bunge et al., 2001). Particularly, incongruent trials with neutral expressions might result in an increased demand for interference control and resolution, which is also indicated by peak activations in right inferior frontal gyrus. Here, a more caudal activation (BA 47) was associated with the inhibition of prepotent responses due to interfering color information (Aron et al., 2004), and the more rostral part (BA 10, 11) was related to the evaluation of the emotional valence of neutral expressions (Nakamura et al., 1999). The latter argumentation seems plausible since neutral expressions are ambiguous themselves compared to emotional expressions (as also revealed in the pilot studies, where neutral stimuli showed a broader distribution of misclassification). This ambiguity might be even enhanced when presented with background colors containing interfering information.

3.3. Selectively activated visual regions depending on the valence of facial expressions

In addition to the above described conjunct activation pattern in fronto-parietal regions and in accordance with our «local»


hypothesis, interaction analyses revealed regions of distinct activations for both negative and neutral incongruent trials. We found activations in dorsal and ventral visual processing streams, and the most prominent activations were found in extrastriate visual areas. Egner et al. (2008) also reported occipital and temporal brain regions to be activated during emotional conflict detection and resolution, though these activations seemed to be unspecifically related to both emotional as well as non-emotional conflict detection and resolution. In the present study, however, functional activations in extrastriate visual cortex might reflect specific activations related to the distinct processing properties of neutral as well as of emotional expressions, and this seems to be especially the case for processing emotional expressions during contextual interference.

For incongruent trials with neutral expressions we observed an activation pattern which comprises regions in the lingual gyrus. These regions are supposed to support visual object naming (Kiyosawa et al., 1996) and, more specifically, proper face processing (Tempini et al., 1998). Furthermore, neutral incongruent trials also led to a signal enhancement in regions associated with the processing of stimulus properties of the face–color compounds. We observed a significant signal increase in area V4, probably related to the processing of the task-irrelevant color, but also in the mid-lateral fusiform gyrus (BA 37) and the inferior occipital gyrus (BA 18). These fusiform and occipital activations might result from categorical (FFA) and detailed face processing (OFA) (e.g., Kanwisher et al., 1997; Rossion et al., 2003; Schiltz et al., 2006). Egner and Hirsch (2005) also reported a signal increase in FFA during face perception in trials with contextual conflicting information, which according to the authors might reflect a mechanism of selective attentional control.

Besides a feed-forward signal transfer, this activation pattern in ventral visual cortex including activations in V4, OFA and FFA might partly reflect signal re-entry from higher-order visual areas which might serve a more detailed object processing. In addition to a feed-forward signal from OFA to FFA, Schiltz et al. (2006) proposed a reverse signal transfer from FFA (face categorization) to OFA (detailed face processing). Based on the present data for incongruent trials with neutral expressions, we presume a neural scenario with a signal increase in V4 for color processing and FFA for face categorization, but additionally with a biased local top-down suppression from FFA (processing the task-relevant facial expression) to V4 (processing the task-irrelevant background color). Furthermore, we suppose a signal re-entry from FFA to OFA for a proper judgment of emotional face morphology in the case of increased interference. Taken together, since neutral expressions signal less expressional distinctiveness, both the task-relevant and task-irrelevant stimulus features are processed more deeply to find appropriate information for a valid categorization of facial expressions.

In contrast, incongruent trials with negative expressions activated regions centered in bilateral dorsal cuneus with additional activations in the right middle occipital (BA 39) and middle temporal gyrus (BA 37). Processing negative, high arousing emotional stimuli or novel facial displays is often associated with significant activations in (medial) occipital

areas (Lang et al., 1998; Clark et al., 1998). Medial cuneus and precuneus have also been reported to be involved in attention shifting between stimulus features (Le et al., 1998). In addition to these medial occipital activations, we found activations in bilateral dV3/V3a and right V5/MT+ for incongruent trials with negative expressions. These brain areas are assumed to process both real as well as implicit motion induced by static pictures (Vaina et al., 2003; Krekelberg et al., 2005). Some of these brain areas have also been shown to be involved in processing implicit dynamics or changeable stimulus features of facial stimuli (Allison et al., 2000), which might be even extracted during recognition of static emotional expressions as in the present study. According to the «space fragment theory» (Davies and Hoffmann, 2003), even static displays of facial expressions retrieve implicit dynamics of facial morphology which might assist proper face categorization. Furthermore, we found an activation pattern comprising the right middle temporal gyrus and the superior temporal sulcus. The latter area corresponds to the functionally defined region STS (see Allison et al., 2000) comprising the postero-superior and -middle parts of the middle temporal gyrus. The STS region has been discussed to be involved in processing mouth or eye dynamics of emotional expressions (Narumoto et al., 2001). Taken together, processing negative expressions during trials of contextual interference, therefore, seems to activate brain regions which might extract implicit expressional dynamics of negative expressions. Empathic mirroring of these implicit dynamics might help to properly categorize these facial expressions.

3.4. Conclusions

Differential behavioral effects of contextual interference during emotional face processing were mirrored on a brain physiological level. A conjunct activation pattern was found in medial and lateral frontal regions and in posterior parietal cortex which is assumed to subserve conflict detection and conflict resolution. Furthermore, we found specific activation patterns depending on the valence of facial expressions. Contextual interference while processing negative and positive facial expressions was markedly reduced when compared to neutral expressions. It was already shown that emotional expressions can strongly capture attention and are partly processed in a pre-attentive manner (e.g., Vuilleumier et al., 2001) and, therefore, might be less prone to contextual distraction. This was clearly indicated by sparse functional interference effects during processing of positive expressions. Compared to neutral expressions, interference during processing of negative expressions also revealed attenuated effects of interference. Still, negative facial expressions may contain some emotional and motivational ambiguity (Whalen, 1999) which probably determines in-depth processing of implicit dynamics (V3a, MT+, STS) for a proper valence categorization. Neutral expressions, finally, involve a higher emotional uncertainty, making them more vulnerable to interference induced by contextual color information. In order to generate an accurate valence categorization of neutral expressions, visual processing requires an enhanced analysis of facial expressions (FFA, OFA) and background color (V4) in more detail. The observed functional activations in extrastriate


visual areas for negative and neutral expressions might be specifically related to the essential properties of these expressions.

4. Experimental procedures

4.1. Participants

We investigated 23 healthy female students recruited from the Bremen University campus. Only females were included in order to prevent confounding effects of gender differences in emotional processing (e.g., Wager et al., 2003). All subjects were right-handed (Oldfield, 1971), had normal or corrected-to-normal vision and normal color vision (Ishihara color tables; Ishihara, 1974). No subject presented a neurologic or psychiatric history. Three subjects had to be discarded from further analysis because of technical problems or error rates above chance. The final sample consisted of 20 subjects (mean age 22.7 years, SD=2.25, age range 20–29 years). All subjects gave informed and written consent for their participation in accordance with the ethics and data security guidelines of the University of Bremen. The study was approved by the local ethics committee.

4.2. Stimulus material

The stimuli were based on photographs of people showing fearful (negative), happy (positive) and neutral facial expressions which were pictographically reduced to black-and-white colors (see Fig. 1). This pictographic reduction allowed for a better combination of emotional expressions with specific background colors (see below). Though this reduction decreased the ecological validity of these expressions, drawing-like expressions are frequently used in various experiments which deal with facial expression processing. The use of schematic faces revealed similar results on a behavioral level when compared to natural face images (Leppänen and Hietanen, 2004), and such faces have been shown to elicit similar, though slightly reduced, face-specific signals in ventral visual processing regions (Halgren et al., 2000). We only used facial pictures which were correctly classified in at least 50% of cases in a four-alternative forced-choice pre-evaluation study including 73 participants (unpublished diploma thesis, Weitzel, 2006, University of Bremen).

We introduced 105 emotional stimuli (53 female, 52 male) for each of the negative, neutral and positive emotional face categories. Fifteen pictures of each face category were used for the first experimental run and 90 pictures were used in the second experimental run (see below). Faces were combined with a background color for each emotional category (see Fig. 1). Specific colors have been shown to have different emotional associations, but these associations seem to vary between different contexts (Elliot and Maier, 2007). The color «red», for example, seems to signal danger in achievement tasks, but also seems to signal positive social emotions in relational contexts (Elliot and Maier, 2007). Bright colors seem to elicit positive emotions, whereas dark colors elicit negative emotions (Hemphill, 1996) and seem to negatively correlate with arousal intensity (Valdez and Mehrabian, 1994). Whereas

the above mentioned studies tried to associate specific colors with specific emotions, Ou et al. (2004) proposed a three-dimensional emotional color space with the dimensions color activity, color heat and color weight by using a factor-analytic approach. This approach does not associate specific colors with specific emotions, but rather locates specific colors along this three-dimensional emotional color space. Interestingly, this three-dimensional color space is very similar to a two-dimensional model including pleasantness (valence) and activation (arousal) as proposed by Russell (1979). According to this two-dimensional model, negative and positive stimuli are both classified as highly arousing but differ in their pleasantness.

Based on the data of the aforementioned studies on emotional associations of different colors, we chose background colors which had similar valence and arousal properties compared to the emotional expressions used in the present experiment. Two constraints guided the selection of the colors. First, we tried to find colors with similar brightness but with the greatest possible distinctiveness in color hue. In the standard face–color-combinations used in the present study, the red background color (Lab 84, 121, 121; according to the CIELAB color space) simultaneously presented with negative facial expressions (fear) is described as a color with high «color activity» and «color heat» (high arousal) as well as high «color weight» (negative valence) (Ou et al., 2004) and is a powerful color to signal danger (Elliot and Maier, 2007). Neutral expressions were combined with a blue (Lab 84, 20, −126) and positive expressions with a yellow background color (Lab 84, 1, 127). Blue colors are described as less arousing (Valdez and Mehrabian, 1994) and with minimal color heat and average color weight and color activity (Ou et al., 2004), and blue therefore closely resembles emotional neutrality. The color yellow, especially with strong saturation, is described as positively arousing (Valdez and Mehrabian, 1994) and with average color weight but with high color activity and color heat (Ou et al., 2004). At no time during the experiment were participants told that the background color had any relevance for the experiment.

As a result of a series of pilot studies (see Section 4.4) we decided not to counterbalance the color–emotion-association across subjects. These pilot studies revealed that only the emotion–color-combinations as used in the present experiment induced strong color–emotion-associations and consequently reliable interference effects in the second experimental run. Therefore, interference effects in the present experiment are not simply related to the colors themselves but specifically induced by their emotional association.

The stimuli were projected via a JVC video projector using Presentation® software (Neurobehavioral Systems; https://nbs.neuro-bs.com) onto a projection screen positioned at the rear end of the scanner with a viewing distance of about 42 cm. Pictures subtended 15° of visual angle both horizontally and vertically. In the first run, pictures were presented slightly above the middle of the screen and the scales for judging the emotional expression were presented beneath the pictures (1.5° distance, font «Arial» on black background, horizontal 19° and vertical 1°). In the second experimental run, pictures were presented in the center of the screen.


4.3. Trials and sequence

The aim of the first run was to associate the background colors with the emotional valence of the simultaneously presented facial expression. Colors were chosen which closely resemble the valence and arousal properties of facial expressions to facilitate the process of an emotional association (see Section 4.4). Pictures in the first experimental run were presented in a pseudo-randomized block order with blocks of 7–8 pictures of one specific emotional valence. We used 15 pictures for each of the three emotional categories. Each trial started with a picture displayed for 2000 ms. Subjects had to rate each picture with regard to intensity (arousal) of the depicted emotional expression (a scale with «strong», «middle», «weak» appeared underneath the face, with each icon assigned to a response button on a three-button mouse) and, thereafter, with regard to the emotional valence (scale «negative», «neutral», «positive»). Each scale disappeared after 5000 ms or after button press. Trials were presented with an ISI of 4000±200 ms, and between trials a fixation point (.4×.4°) was displayed. After 45 pictures we introduced a short 30 s break. Thereafter, all 45 pictures were presented a second time but with a different order of presentation to reinforce the emotional association of the background colors. Subjects were encouraged to perform as quickly and intuitively as possible without too much deliberation.

For the second experimental run we used 90 pictures of each emotional category. In 90 trials – referred to as congruent trials – we used the same emotion–color-combination as in the first experimental run. In 30 trials we changed the background color (15 pictures per each of the two remaining background colors), thus resulting in a different emotion–color-combination (incongruent trials). There were at least 82 trials between the repetition of the same face included in a congruent and an incongruent trial, thus preventing any memory- or face-identity-related effects.

The second run started with nine practice trials to allow accommodation to the new task condition. Participants were asked to make a fast, but still accurate, forced-choice categorization for each face with regard to its emotional valence (three-button mouse: «negative» — index finger, «neutral» — middle finger and «positive» — ring finger). This run consisted of two blocks with 180 trials each, resulting in 360 trials in total. Each picture was presented for 1400 ms with an ISI of 3600±200 ms (see Fig. 1). Pictures were presented in a randomized order with two constraints: (1) each block started with 12 congruent trials without presenting any incongruent trials, to further allow the participants to accommodate to the task, and (2) congruent trials following incongruent trials did not present an emotional face with the unattended background color of the preceding incongruent trial, in order to avoid confounding effects of negative priming (e.g., Tipper, 1985).
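The two randomization constraints above can be made concrete in a short sketch. This is a hypothetical reconstruction, not the authors' Presentation® script: trials are modeled as (expression, color) pairs, a trial counts as congruent when the color matches the expression's standard color, and a shuffled sequence is built greedily so that each 180-trial block starts with 12 congruent trials and no congruent trial directly follows an incongruent trial with the same background color.

```python
# Hypothetical sketch of the trial-sequence constraints (Section 4.3).
# 90 congruent + 2x15 incongruent trials per category = 360 trials total.
import random

STANDARD = {"negative": "red", "neutral": "blue", "positive": "yellow"}

def build_trials():
    trials = []
    for emo, std_color in STANDARD.items():
        trials += [(emo, std_color)] * 90                 # congruent trials
        for other in STANDARD.values():
            if other != std_color:
                trials += [(emo, other)] * 15             # incongruent trials
    return trials

def is_congruent(trial):
    emo, color = trial
    return STANDARD[emo] == color

def sequence_ok(seq):
    # Constraint 1: each block of 180 starts with 12 congruent trials.
    for block_start in (0, 180):
        if not all(is_congruent(t) for t in seq[block_start:block_start + 12]):
            return False
    # Constraint 2: no congruent trial directly after an incongruent trial
    # with the same background color (negative-priming control).
    for prev, cur in zip(seq, seq[1:]):
        if not is_congruent(prev) and is_congruent(cur) and cur[1] == prev[1]:
            return False
    return True

def randomize(trials, rng, max_attempts=1000):
    # Greedy construction from a shuffled pool; retry on rare dead ends.
    for _ in range(max_attempts):
        pool = trials[:]
        rng.shuffle(pool)
        seq = []
        for i in range(len(trials)):
            choice = None
            for j, t in enumerate(pool):
                if i % 180 < 12 and not is_congruent(t):
                    continue  # block start must be congruent
                if seq and not is_congruent(seq[-1]) and is_congruent(t) \
                        and t[1] == seq[-1][1]:
                    continue  # negative-priming constraint
                choice = j
                break
            if choice is None:
                break  # dead end: reshuffle and retry
            seq.append(pool.pop(choice))
        else:
            return seq
    raise RuntimeError("no valid sequence found within max_attempts")

seq = randomize(build_trials(), random.Random(7))
```

The greedy builder is only one way to satisfy such constraints; rejection sampling over full shuffles is impractical here because the negative-priming constraint is violated somewhere in almost every unconstrained permutation.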

4.4. Pilot studies

We conducted three behavioral pilot studies using similar experimental procedures as described in the former sections. These studies included twenty subjects each and were conducted to find an appropriate experimental design for

the present fMRI study. In these pilot studies we first compared interference effects in experimental designs including either unbalanced (25:75) or balanced (50:50) ratios of incongruent compared to congruent trials in the second run. Interference indices were calculated as the reaction time difference between all incongruent and congruent trials divided by the sum of both measures (INI=[incongruent−congruent] / [incongruent+congruent]). Positive values of this interference index indicate increased reaction times for incongruent compared to congruent trials. Interference indices were slightly but not significantly increased in the experiment with an unbalanced (INI=.036) compared to a balanced (INI=.032) ratio of incongruent trials (t38<1). A one-way ANOVA including interference indices separately calculated for negative, neutral and positive expressions revealed significant differences for the unbalanced (F2,38=3.80, p=.031) but not for the balanced design (F2,38=2.32, p=.112). As we were specifically interested in differential effects of contextual modulation, we decided to choose the unbalanced design for the present study. The remaining proportion of incongruent trials was still sufficient for a valid and comparable averaging of hemodynamic responses compared to congruent trials (see Huettel and McCarthy, 2001).
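The interference index above is a simple normalized difference and can be written directly as a function. The reaction times in the example call are made-up values, chosen only to illustrate that slower incongruent responses yield a positive index on the scale reported in the pilot studies.

```python
# Interference index from the pilot studies:
# INI = (RT_incongruent - RT_congruent) / (RT_incongruent + RT_congruent)

def interference_index(rt_incongruent, rt_congruent):
    """Positive values indicate slower responses on incongruent trials."""
    return (rt_incongruent - rt_congruent) / (rt_incongruent + rt_congruent)

# Example with made-up mean reaction times in milliseconds:
ini = interference_index(rt_incongruent=720.0, rt_congruent=670.0)
# 50 / 1390 ≈ .036, i.e. on the order of the reported pilot indices
```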

Furthermore, a third pilot study was conducted to test whether background colors would become associated with the emotional valence of the facial expression with which they were combined during the first experimental run. Additionally, we were interested in the specificity of our standard face–color-combinations to induce specific emotional associations of the background colors, which we assumed to be facilitated due to the emotional qualities of the colors («color emotion»). Therefore, in a third experiment, we compared the face–color-combinations used in the second pilot study with other face–color-combinations. This experiment included two systematic permutations of the face–color-combinations, with the negative expressions, for example, now combined with the blue or the yellow background color during the first experimental run. We also included iterative valence judgments of the colors before, in-between and after the experiments. Valence ratings of colors were subjected to a 3 (time)×3 (color) ANOVA. The results showed that only for the standard face–color-combinations used in the present experiment did the emotional association of the background colors become significantly stronger during the course of the experiment (factor time: F2,38=22.68, p<.001) when compared to the other face–color-combinations (factor time: F2,18=2.35, p=.125 and F2,18=1.43, p=.265, respectively).

The data of this series of pilot studies showed that only the presently used face–color-combinations induced general and substantial behavioral interference effects in the second run of our experiment. This was indicated by significantly increased interference indices for trials with the standard (INI=.032) compared to permutated color trials (INI=.017; t38=3.75, p=.001). Therefore, we used these standard face–color-combinations instead of counterbalancing face–color-combinations across subjects.

Taken together, the results indicate that only for the standard face–color-combination did the colors gain a strong emotional association as intended by the experimental procedure. Thus, it was this emotional association which


induced interference effects in the second run. Had the colors alone induced interference effects in the second run, we most likely would have found comparable interference effects for all face–color-combinations.

4.5. Image acquisition

Imaging data were obtained on a 3-T SIEMENS Magnetom Allegra® system (Siemens, Erlangen, Germany) using a T2*-weighted gradient echo-planar imaging (EPI) sequence (44 contiguous axial slices aligned to the AC–PC plane, slice thickness 3 mm, no gap, TR=2.5 s, TE=30 ms, FA=90°, 64×64 matrix, FOV 192×192 mm, interleaved acquisition) and a manufacturer-supplied circularly polarized head coil to measure changes in blood oxygenation level-dependent (BOLD) signals.

During the second run of the experiment we obtained 804 volumes in total. The first 28 volumes, containing task instructions and dummy trials, were used to allow the magnetization to reach steady state and were discarded from functional analysis.

4.6. Image analysis

We used the statistical parametric mapping software SPM2 (Wellcome Department of Cognitive Neurology, London, UK) for preprocessing and statistical analysis of functional images and the MarsBar toolbox (http://marsbar.sourceforge.net) for estimating signal change parameters in region of interest (ROI) analyses.

Functional images were first corrected for latency differences in slice acquisition relative to the first slice in each image and, after motion estimation, realigned to the tenth image of each data set. Images were then normalized to the Montreal Neurological Institute (MNI) stereotactic EPI template using non-linear basis functions and resampled to a voxel size of 2×2×2 mm. Normalized images were spatially smoothed with an isotropic Gaussian kernel of 8 mm FWHM to decrease differences in individual structural brain anatomy and to increase the signal-to-noise ratio. Each onset of stimulus presentation was modeled with a canonical hemodynamic response function (HRF) and its temporal derivative. Images were high-pass filtered (128 s) to remove low-frequency signal drifts.
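The HRF modeling step can be sketched as follows. The double-gamma shape and the onset times below are illustrative approximations; SPM2's exact canonical basis parameters are not restated in the text.

```python
import numpy as np
from scipy.stats import gamma

TR = 2.5  # repetition time in seconds, as in the acquisition protocol

def canonical_hrf(tr, duration=32.0):
    """Double-gamma HRF sampled at the TR (an approximation of SPM's canonical HRF)."""
    t = np.arange(0.0, duration, tr)
    peak = gamma.pdf(t, 6)         # positive response peaking around 6 s
    undershoot = gamma.pdf(t, 16)  # late undershoot
    hrf = peak - undershoot / 6.0
    return hrf / hrf.sum()

n_scans = 804 - 28  # volumes retained after discarding the dummies
onsets = np.zeros(n_scans)
onsets[[10, 50, 120]] = 1.0  # hypothetical stimulus onsets, in scan units

hrf = canonical_hrf(TR)
regressor = np.convolve(onsets, hrf)[:n_scans]          # canonical regressor
deriv = np.gradient(hrf, TR)
deriv_regressor = np.convolve(onsets, deriv)[:n_scans]  # temporal derivative
```

The temporal derivative regressor absorbs small subject-specific latency shifts of the hemodynamic response relative to the canonical shape.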

Nine experimental conditions (3 emotional expressions × 3 colors) were entered into a GLM as regressors. Only trials with correct responses were included, that is, trials in which subjects responded correctly with respect to the emotional valence of the displayed expression and within a time window of 150–2000 ms.

We furthermore added one regressor comprising erroneous or missed responses and dummy trials. Additionally, the six motion correction parameters (resulting from the realignment procedure) were included in the design matrix as regressors of no interest to minimize false positive activations due to task-correlated motion (see Johnstone et al., 2006).
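Assembled in full, the design matrix thus contains 9 condition regressors, one error/dummy regressor, and 6 motion parameters (plus a constant). A minimal numpy sketch; the regressors are simulated because the actual stimulus timings and color labels are not restated here.

```python
import numpy as np

rng = np.random.default_rng(1)
n_scans = 776  # 804 volumes minus 28 discarded dummies

# Nine face-color cells (3 expressions x 3 colors); the color labels
# here are placeholders, not the study's actual colors.
conditions = [f"{emo}_{col}"
              for emo in ("negative", "neutral", "positive")
              for col in ("colorA", "colorB", "colorC")]

X_cond = rng.normal(size=(n_scans, len(conditions)))  # HRF-convolved regressors (simulated)
x_error = rng.normal(size=(n_scans, 1))               # errors, misses, dummy trials
motion = rng.normal(size=(n_scans, 6))                # six realignment parameters
intercept = np.ones((n_scans, 1))

X = np.hstack([X_cond, x_error, motion, intercept])   # shape (776, 17)

# Ordinary least squares beta estimates for a single voxel's time course
y = rng.normal(size=n_scans)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```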

For the first-level fixed effects analysis, contrasts were specified to examine effects of incongruence for each emotional category (i.e., [negative incongruent > negative congruent]). Furthermore, from all trials included in the second experimental run we randomly chose trials with identical

background color as the incongruent trials (i.e., trials with the same background colors as incongruent trials including negative expressions) and additionally compared them to congruent trials in each category (i.e., congruent trials including negative expressions). The latter contrasts were used for masking the second-level contrasts to eliminate functional activations that were simply due to different background colors of incongruent compared to congruent trials. Both the contrasts comparing incongruent and congruent trials in each emotional category and the color contrasts were entered into a second-level analysis of variance (ANOVA). Comparisons of incongruent and congruent trials in each emotional category were exclusively masked by the respective color contrast (p<.001, uncorrected, k=10, for both the simple contrasts and the mask). To find regions of distinct activation for each incongruent condition in each emotional category, we performed an interaction analysis (p<.001, uncorrected, k=10) using a subtraction contrast comparing the simple incongruence contrast of one emotional category with the simple contrasts of the two remaining emotional categories (i.e., [negative incongruent > negative congruent] − {[neutral incongruent > neutral congruent] + [positive incongruent > positive congruent]}). Functional activations revealed by the interaction analysis were also exclusively masked by the respective color contrast at the same statistical threshold. An additional conjunction analysis (intermediate null hypothesis, p<.001, uncorrected, k=10) revealed regions of conjoint activation for the incongruence contrasts across all emotional categories.
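The simple and interaction contrasts described above can be expressed as weight vectors over the condition regressors. A sketch; the condition labels and their ordering in the design matrix are assumptions for illustration, and congruency is collapsed into three levels per expression rather than the full 3 × 3 face-color layout.

```python
import numpy as np

# Assumed regressor order: (neg, neu, pos) x (incongruent, congruent, neutral color)
labels = [f"{emo}_{cond}"
          for emo in ("neg", "neu", "pos")
          for cond in ("inc", "con", "neu")]

def contrast(weights):
    """Build a contrast vector from a {label: weight} mapping."""
    c = np.zeros(len(labels))
    for lab, w in weights.items():
        c[labels.index(lab)] = w
    return c

# Simple incongruence contrasts, e.g. [negative incongruent > negative congruent]
c_neg = contrast({"neg_inc": 1, "neg_con": -1})
c_neu = contrast({"neu_inc": 1, "neu_con": -1})
c_pos = contrast({"pos_inc": 1, "pos_con": -1})

# Interaction: incongruence effect for negative faces minus the
# incongruence effects of the two remaining categories
c_interaction = c_neg - (c_neu + c_pos)

# All difference contrasts sum to zero
for c in (c_neg, c_neu, c_pos, c_interaction):
    assert abs(c.sum()) < 1e-12
```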

We additionally performed regions of interest (ROI) analyses for two purposes. First, regions of significant activation in the conjunction analysis were re-entered on the first level as functional ROI clusters to extract percent signal change for each subject. We aimed to confirm a significant increase of percent signal change (p<.05) in each incongruent condition compared to its respective congruent condition on the group level in each functional ROI. Secondly, we were particularly interested in activation profiles in specific regions related to the task-relevant (facial expression) and the task-irrelevant stimulus dimension (color). Thus, we defined ROI spheres (radius 5 mm) around centers of mass. The ROIs resulted from combining ROI boxes corresponding to activation coordinates reported in related studies (see below) with the functional activation clusters of the present study. Hence, we "aligned" coordinates from the literature to our specific activation profile. We built an ROI for the right hemispheric region in occipital area V4 (Talairach coordinates x=29, y=−76, z=−10), which is supposed to predominantly process color information (Bartels and Zeki, 2000). We additionally defined three regions responsible for processing different aspects of emotional expressions: face-specific regions in the right fusiform gyrus ("fusiform face area" (FFA); x=40, y=−57, z=−16) and the right inferior occipital gyrus ("occipital face area" (OFA); x=34, y=−82, z=−11; for both see Kanwisher et al., 1997; Gauthier et al., 2000; Rossion et al., 2003; Grill-Spector et al., 2004; Steeves et al., 2005). Furthermore, we defined a functional region in or around the right superior temporal sulcus (STS; x=48, y=56, z=5) (Puce et al., 1998; Hoffman and Haxby, 2000), which is supposed to process different aspects of emotional face morphology.
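The 5 mm sphere ROIs and the percent-signal-change extraction can be sketched as follows. The grid geometry is deliberately simplified (real images require the NIfTI affine to map mm coordinates to voxel indices), and the 4D data are simulated.

```python
import numpy as np

def sphere_mask(shape, center_mm, radius_mm, voxel_mm=2.0):
    """Boolean mask of voxels within radius_mm of center_mm.
    Simplification: isotropic grid with the origin at voxel (0, 0, 0);
    real data need the image affine for mm-to-voxel mapping."""
    coords = np.indices(shape).reshape(3, -1).T * voxel_mm
    dist = np.linalg.norm(coords - np.asarray(center_mm), axis=1)
    return (dist <= radius_mm).reshape(shape)

shape = (32, 32, 20)
roi = sphere_mask(shape, center_mm=(30.0, 30.0, 20.0), radius_mm=5.0)

# Hypothetical 4D data (x, y, z, time) fluctuating around a baseline of 100
rng = np.random.default_rng(2)
data = rng.normal(100.0, 1.0, size=shape + (20,))

ts = data[roi].mean(axis=0)               # ROI-averaged time course
baseline = ts.mean()
psc = 100.0 * (ts - baseline) / baseline  # percent signal change per scan
```

In practice the condition-wise percent signal change would be computed from the fitted GLM betas (as MarsBar does) rather than from raw time courses, but the masking logic is the same.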


Acknowledgments

We would like to thank Sina A. Trautmann and Charlotte Giessmann for helpful comments on the manuscript and Peter Erhard for helpful comments on data analysis. The study was conducted as part of the BMBF Neuroimaging program (01GO0202) of the Center for Advanced Imaging (CAI) — Magdeburg/Bremen (MH) and supported by the Center for Cognitive Sciences (ISP) at Bremen University (SF).

R E F E R E N C E S

Allison, T., Puce, A., McCarthy, G., 2000. Social perception from visual cues: role of the STS region. Trends Cogn. Sci. 4, 251–291.

Aron, A.R., Robbins, T.W., Poldrack, R.A., 2004. Inhibition and the right inferior frontal cortex. Trends Cogn. Sci. 8, 170–177.

Banich, M.T., Milham, M.P., Atchley, R.A., Cohen, N.J., Webb, A., Wszalek, T., Kramer, A.F., Liang, Z.-P., Barad, V., Gullett, D., Shah, C., Brown, C., 2000. Prefrontal regions play a predominant role in imposing an attentional "set": evidence from fMRI. Cogn. Brain Res. 10, 1–9.

Bartels, A., Zeki, S., 2000. The architecture of the colour centre in the human visual brain: new results and a review. Eur. J. Neurosci. 12, 172–193.

Behrmann, M., Geng, J.J., Shomstein, S., 2004. Parietal cortex and attention. Curr. Opin. Neurobiol. 14, 212–217.

Bunge, S.A., Ochsner, K.N., Desmond, J.E., Glover, G.H., Gabrieli, J.D.E., 2001. Prefrontal regions involved in keeping information in and out of mind. Brain 124, 2074–2086.

Bush, G., Luu, P., Posner, M.I., 2000. Cognitive and emotional influences in anterior cingulate cortex. Trends Cogn. Sci. 4, 215–222.

Carretié, L., Hinojosa, J.A., Mercado, F., Tapia, M., 2005. Cortical response to subjectively unconscious danger. NeuroImage 24, 615–623.

Clark, V.P., Maisog, J.M., Haxby, J.V., 1998. An fMRI study of face perception and memory using random stimulus sequences. J. Neurophysiol. 79, 3257–3265.

Compton, R.J., Banich, M.T., Mohanty, A., Milham, M.P., Herrington, J., Miller, G.A., Scalf, P.E., Webb, A., Heller, W., 2003. Paying attention to emotion: an fMRI investigation of cognitive and emotional Stroop tasks. Cogn. Affect. Behav. Neurosci. 3, 81–96.

Corbetta, M., Shulman, G.L., 2002. Control of goal-directed and stimulus-driven attention in the brain. Nat. Rev. Neurosci. 3, 201–215.

Davies, T.N., Hoffman, D.D., 2003. Facial attention and spacetime fragments. Axiomathes 13, 303–327.

de Gelder, B., Morris, J.S., Dolan, R.J., 2005. Unconscious fear influences emotional awareness of faces and voices. Proc. Natl. Acad. Sci. U.S.A. 102, 18682–18687.

Desimone, R., Duncan, J., 1995. Neural mechanisms of selective visual attention. Annu. Rev. Neurosci. 18, 193–222.

Di Russo, F., Pitzalis, S., Spitoni, G., Aprile, T., Patria, F., Spinelli, D., Hillyard, S.A., 2005. Identification of the neural sources of the pattern-reversal VEP. NeuroImage 24, 874–886.

Egner, T., Hirsch, J., 2005. Cognitive control mechanisms resolve conflict through cortical amplification of task-relevant information. Nat. Neurosci. 8, 1784–1790.

Egner, T., Etkin, A., Gale, S., Hirsch, J., 2008. Dissociable neural systems resolve conflict from emotional versus nonemotional distracters. Cereb. Cortex 18, 1475–1484.

Elliot, A.J., Maier, M.A., 2007. Color and psychological functioning. Curr. Dir. Psychol. Sci. 16, 250–254.

Etkin, A., Egner, T., Peraza, D.M., Kandel, E.R., Hirsch, J., 2006. Resolving emotional conflict: a role for the rostral anterior cingulate cortex in modulating activity in the amygdala. Neuron 51, 1–12.

Gauthier, I., Tarr, M.J., Moylan, J., Skudlarski, P., Gore, J.C., Anderson, W.A., 2000. The fusiform "face area" is part of a network that processes faces at the individual level. J. Cogn. Neurosci. 12, 495–504.

Green, M.J., Williams, L.M., Davidson, D., 2003. In the face of danger: specific viewing strategies for facial expressions of threat? Cogn. Emotion 17, 779–786.

Grill-Spector, K., Knouf, N., Kanwisher, N., 2004. The fusiform face area subserves face perception, not generic within-category identification. Nat. Neurosci. 7, 1–8.

Halgren, E., Raij, T., Marinkovic, K., Jousmäki, V., Hari, R., 2000. Cognitive response profile of the human fusiform face area as determined by MEG. Cereb. Cortex 10, 69–81.

Hemphill, M., 1996. A note on adults' color-emotion associations. J. Genet. Psychol. 157, 275–280.

Hoffman, E.A., Haxby, J.V., 2000. Distinct representations of eye gaze and identity in the distributed neural system for face perception. Nat. Neurosci. 3, 80–84.

Huettel, S., McCarthy, G., 2001. The effects of single-trial averaging upon the spatial extent of fMRI activation. NeuroReport 12, 1–6.

Ishai, A., Pessoa, L., Bikle, P.C., Ungerleider, L.G., 2004. Repetition suppression of faces is modulated by emotion. Proc. Natl. Acad. Sci. U.S.A. 101, 9827–9832.

Ishihara, S., 1974. Tests for Color-Blindness. Kanehara Shuppan Company, Ltd., Tokyo.

Itier, R., Taylor, M.J., 2004. Source analysis of the N170 to faces and objects. NeuroReport 15, 1261–1265.

Johnstone, T., Walsh, K.S.O., Greischar, L.L., Alexander, A.L., Fox, A.S., Davidson, R.J., Oakes, T.R., 2006. Motion correction and the use of motion covariates in multiple-subject fMRI analysis. Hum. Brain Mapp. 27, 779–788.

Kanwisher, N., McDermott, J., Chun, M.M., 1997. The fusiform face area: a module in human extrastriate cortex specialized for face perception. J. Neurosci. 17, 4302–4311.

Kim, H., Somerville, L.H., Johnstone, T., Alexander, A.L., Whalen, P.J., 2003. Inverse amygdala and medial prefrontal cortex responses to surprised faces. NeuroReport 14, 2317–2322.

Kim, H., Somerville, L.H., Johnstone, T., Polis, S., Alexander, A.L., Shin, L.M., Whalen, P.J., 2004. Contextual modulation of amygdala responsivity to surprised faces. J. Cogn. Neurosci. 16, 1730–1745.

Kiyosawa, M., Inoue, C., Kawasaki, T., Tokoro, T., Ishii, K., Ohyama, M., Senda, M., Soma, Y., 1996. Functional neuroanatomy of visual object naming: a PET study. Graef. Arch. Clin. Exp. 234, 110–115.

Krekelberg, B., Vatakis, A., Kourtzi, Z., 2005. Implied motion from form in the human visual cortex. J. Neurophysiol. 94, 4373–4386.

Lang, P.J., Bradley, M.M., Fitzsimmons, J.R., Cuthbert, B.N., Scott, J.D., Moulder, B., Nangia, V., 1998. Emotional arousal and activation of the visual cortex: an fMRI analysis. Psychophysiology 35, 199–210.

Lavie, N., 2005. Distracted and confused? Selective attention under load. Trends Cogn. Sci. 9, 75–82.

Le, T.H., Pardo, J.V., Hu, X., 1998. 4-T-fMRI study of nonspatial shifting of selective attention: cerebellar and parietal contributions. J. Neurophysiol. 79, 1535–1548.

Lee, T.-W., Josephs, O., Dolan, R.J., Critchley, H.D., 2006. Imitating expressions: emotion-specific neural substrates in facial mimicry. SCAN 1, 122–135.

Leppänen, J.M., Hietanen, J.K., 2004. Emotionally positive facial expressions are processed faster than negative facial expressions, but why? Psychol. Res. 69, 22–29.

Meeren, H.K.M., van Heijnsbergen, C.C.R.J., de Gelder, B., 2005. Rapid perceptual integration of facial expression and emotional body language. Proc. Natl. Acad. Sci. U.S.A. 102, 16518–16523.


Nachev, P., Rees, G., Parton, A., Kennard, C., Husain, M., 2005. Volition and conflict in human medial frontal cortex. Curr. Biol. 15, 122–128.

Nakamura, K., Kawashima, R., Ito, K., Sugiura, M., Kato, T., Nakamura, A., Hatano, K., Nagumo, S., Kubota, K., Fukuda, H., Kojima, S., 1999. Activation of the right inferior frontal cortex during assessment of facial emotion. J. Neurophysiol. 82, 1610–1614.

Narumoto, J., Okada, T., Sadato, N., Fukui, K., Yonekura, Y., 2001. Attention to emotion modulates fMRI activity in human right superior temporal sulcus. Brain Res. Cogn. Brain Res. 12, 225–231.

Nomura, M., Ohira, H., Haneda, K., Iidaka, T., Sadato, N., Okada, T., Yonekura, Y., 2004. Functional association of the amygdala and ventral prefrontal cortex during cognitive evaluation of facial expressions primed by masked angry faces: an event-related fMRI study. NeuroImage 21, 352–363.

Oldfield, R.C., 1971. The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9, 97–113.

Ou, L.-C., Luo, M.R., Woodcock, A., Wright, A., 2004. A study of color emotion and color preference. Part I: Color emotions for single colors. Color Res. Appl. 29, 232–240.

Pizzagalli, D.A., Lehmann, D., Hendrick, A.M., Regard, M., Pascual-Marqui, R.D., Davidson, R.J., 2002. Affective judgments of faces modulate early activity (∼160 ms) within the fusiform gyri. NeuroImage 16, 663–677.

Proverbio, A.M., Brignone, V., Matarazzo, S., Del Zotto, M., Zani, A., 2006. Gender differences in hemispheric asymmetry for face processing. BMC Neurosci. 7, 44.

Puce, A., Allison, T., Bentin, S., Gore, J.C., McCarthy, G., 1998. Temporal cortex activation in humans viewing eye and mouth movements. J. Neurosci. 18, 2188–2199.

Righart, R., de Gelder, B., 2006. Context influences early perceptual analysis of faces — an electrophysiological study. Cereb. Cortex 16, 1249–1257.

Rossion, B., Caldara, R., Seghier, M., Schuller, A.-M., Lazeyras, F., Mayer, E., 2003. A network of occipito-temporal face-sensitive areas besides the right middle fusiform gyrus is necessary for normal face processing. Brain 126, 1–15.

Russell, J.A., 1979. Affective space is bipolar. J. Pers. Soc. Psychol. 37, 345–356.

Schiltz, C., Sorger, B., Caldara, R., Ahmed, F., Mayer, E., Goebel, R., Rossion, B., 2006. Impaired face discrimination in acquired prosopagnosia is associated with abnormal response to individual faces in the right middle fusiform gyrus. Cereb. Cortex 16, 574–586.

Serences, J.T., Liu, T., Yantis, S., 2005. Parietal mechanisms of attentional control: locations, features and objects. In: Itti, L., Rees, G., Tsotsos, J. (Eds.), Neurobiology of Attention. Elsevier, San Diego, CA, pp. 35–41.

Steeves, J.K.E., Culham, J., Duchaine, B.C., Cavina-Pratesi, C., Valyear, K.F., Schindler, I., Humphrey, G.K., Milner, A.D., Goodale, M.A., 2005. The fusiform face area is not sufficient for face recognition: evidence from a patient with dense prosopagnosia and no occipital face area. Neuropsychologia 44, 594–609.

Talairach, J., Tournoux, P., 1988. Co-planar Stereotaxic Atlas of the Human Brain. Georg Thieme Verlag, Stuttgart, Germany.

Tempini, M.L., Price, C.J., Josephs, O., Vandenberghe, R., Cappa, S.F., Kapur, N., Frackowiak, R.S., 1998. The neural systems sustaining face and proper-name processing. Brain 121, 2103–2118.

Tipper, S.P., 1985. The negative priming effect: inhibitory priming by ignored objects. Q. J. Exp. Psychol. A 37, 571–590.

Thompson, K.G., Biscoe, K.L., Sato, T.R., 2005. Neuronal basis of covert spatial attention in the frontal eye field. J. Neurosci. 25, 9479–9487.

Ullsperger, M., von Cramon, D.Y., 2004. Neuroimaging of performance monitoring: error detection and beyond. Cortex 40, 593–604.

Vaina, L.M., Gryzwacz, N.M., Saiviroonporn, P., LeMay, M., Bienfang, D.C., Cowey, A., 2003. Can spatial and temporal motion integration compensate for deficits in local motion mechanisms? Neuropsychologia 41, 1817–1836.

Valdez, P., Mehrabian, A., 1994. Effects of color on emotions. J. Exp. Psychol. Gen. 123, 394–409.

Van Essen, D.C., Dickson, J., Harwell, J., Hanlon, D., Anderson, C.H., Drury, H.A., 2001. An integrated software suite for surface-based analyses of cerebral cortex. J. Am. Med. Inform. Assoc. 41, 1359–1378.

Vuilleumier, P., Armony, J.L., Driver, J., Dolan, R.J., 2001. Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron 30, 829–841.

Vuilleumier, P., Richardson, M.P., Armony, J.L., Driver, J., Dolan, R., 2004. Distant influences of amygdala lesion on visual cortical activation during emotional face processing. Nat. Neurosci. 7, 1271–1278.

Wager, T.D., Phan, K.L., Liberzon, I., Taylor, S.F., 2003. Valence, gender, and lateralization of functional brain anatomy in emotion: a meta-analysis of findings from neuroimaging. NeuroImage 19, 513–531.

Whalen, P.J., 1999. Fear, vigilance, and ambiguity: initial neuroimaging studies of the human amygdala. Curr. Dir. Psychol. Sci. 7, 177–187.

Weitzel, V., 2006. Teststatistische Evaluation von Stimuli zur emotionalen Gesichtererkennung [Test-statistical evaluation of stimuli for emotional face recognition]. Unpublished diploma thesis, University of Bremen.

Williams, L.M., Senior, C., David, A.S., Loughland, C.M., Gordon, E., 2001. In search of the 'Duchenne' smile: evidence from eye movements. J. Psychophysiol. 15, 122–127.