
Looking with different eyes: The psychological meaning of categorisation goals moderates facial reactivity to facial expressions



Looking with different eyes: The psychological meaning of categorisation goals moderates facial reactivity to facial expressions

Lotte F. van Dillen1, Lasana T. Harris2, Wilco W. van Dijk1, and Mark Rotteveel3,4

1 Institute of Psychology, Leiden Institute for Brain and Cognition, Leiden University, Leiden, the Netherlands
2 Institute of Psychology, Duke University, Durham, NC, USA
3 Institute of Psychology, University of Amsterdam, Amsterdam, the Netherlands
4 Cognitive Science Center Amsterdam, University of Amsterdam, Amsterdam, the Netherlands

In the present research we examined whether the psychological meaning of people's categorisation goals affects facial muscle activity in response to facial expressions of emotion. We had participants associate eye colour (blue, brown) with either a personality trait (extraversion) or a physical trait (light frequency) and asked them to use these associations in a speeded categorisation task of angry, disgusted, happy and neutral faces while assessing participants' response times and facial muscle activity. We predicted that participants would respond differentially to the emotional faces when the categorisation criteria allowed for inferences about a target's thoughts, feelings or behaviour (i.e., when categorising extraversion), but not when these lacked any social meaning (i.e., when categorising light frequency). Indeed, emotional faces triggered facial reactions when participants categorised extraversion, but not when they categorised light frequency. In line with this, only when categorising extraversion did participants' response times indicate a negativity bias, replicating previous results. Together, these findings provide further evidence for the contextual nature of people's selective responses to the emotions expressed by others.

Keywords: Categorisation goal; Emotional expression; Facial EMG; Reaction times; Mimicry.

Naïve theories of social perception help people form impressions by making available information about what others might be like while allowing missing information to be filled in. If a person is observed to have one particular trait, people assume that this person has other related traits.

Correspondence should be addressed to: Lotte F. van Dillen, Institute of Psychology/Leiden Institute for Brain and Cognition, Leiden University, P.O. Box 9555, 2300 RB Leiden, the Netherlands. E-mail: [email protected].

We thank Erica Hornstein and Thijs Schrama for their assistance during data collection and data analysis, and we thank Elizabeth Phelps for facilitating this research.

This research was supported by the Netherlands Organisation for Scientific Research (NWO) [grant number 400-08-128].

© 2014 Taylor & Francis

COGNITION AND EMOTION, 2014

http://dx.doi.org/10.1080/02699931.2014.982514


These associations help people to categorise others and to predict their behaviour (Hugenberg & Bodenhausen, 2004). Accordingly, social-cognitive processes should be activated when observers of emotional faces use a category for which emotional expressions are psychologically meaningful (e.g., a personality trait, which allows for inferences about a person's thoughts, feelings or behaviour). For non-social categories that do not prompt inferences about people's motivations, such as the strictly physical category of light frequency absorbed by the eye (eye colour), emotional information contained in facial expressions has no informational value and should therefore not readily engage social-cognitive processes. In the present research we tested whether the psychological meaning of categorisation goals moderates differential engagement of a social-cognitive process: facial muscle activity in response to facial displays of emotion. We will refer to this process as facial reactivity.

There is a growing literature on contextual influences on facial reactivity (Cannon, Hayes, & Tipper, 2009; Hess, Philippot, & Blairy, 1998; see Hess & Fischer, 2013, for a review). These findings suggest that facial reactivity serves social functions, such that any factor that manipulates the social implications of the task should influence the extent to which facial reactivity occurs. Indeed, not only motivational factors such as attitude towards the target (Likowski, Mühlberger, Seibt, Pauli, & Weyers, 2008) and group membership of the target (Bourgeois & Hess, 2008), but also whether people explicitly name the emotional expression of target faces (e.g., happy, angry) or their arbitrary colour (e.g., blue, green; Cannon et al., 2009), all modulate the extent to which facial reactivity occurs. The current paper utilises this literature to demonstrate that psychological (social) meaning can modulate facial reactivity even when people base their categorisations on the exact same realistic feature of a target face (i.e., eye colour). We expect facial muscle responses to still vary depending on whether this singular feature is linked to a psychologically meaningful or a merely physical dimension.

Facial expressions can communicate information about a person's intentions and are therefore essential for social cognition (Hess, Sabourin, & Kleck, 2007). When a person displays a certain emotion, they communicate information about their interpretation of an event, their behavioural intentions and even their disposition (Hess & Fischer, 2013). In turn, people have been found to base their judgements of trustworthiness, dominance and competence on facial appearance (Oosterhof & Todorov, 2008; Thornhill & Gangestad, 1999). Through social learning or stereotyping, people may come to associate specific facial expressions with specific categories. For example, people incorporate emotional expressions in their categories of race (Hugenberg & Bodenhausen, 2004), gender (Becker, Kenrick, Neuberg, Blackwell, & Smith, 2007) and personality (Knutson, 1996).

Theories of social monitoring, moreover, suggest that people are constantly vigilant to social cues of rejection and acceptance when scanning their environment (Pickett & Gardner, 2005; Pickett, Gardner, & Knowles, 2004). Consistent with these theories, recent evidence suggests that there is a large overlap in the human brain between the default system (i.e., resting state activity), the system for emotion processing and the system for social cognition (see Schilbach et al., 2012, for a meta-analysis). There may thus exist a shared neural network underlying emotion processing, social cognition and unconstrained cognition that allows for the quick detection and evaluation of social-emotional cues.

For example, when categorising emotional target faces, people's attention is usually, though not always, biased towards processing the threatening expressions of the faces. As a result, participants are slower to respond to threatening compared to neutral expressions in a speeded face-categorisation task (i.e., reflecting a negativity bias; Van Dillen & Derks, 2012; Van Dillen & Koole, 2009; Van Honk, Tuiten, De Haan, Van den Hout, & Stam, 2001). This expression-specific response time interference, moreover, is accompanied by greater amplitudes of attention-specific components of the event-related potential to angry, compared to happy, faces as early as 200 milliseconds following stimulus onset (Van Dillen & Derks, 2012). As such, the negativity bias provides an important indication that people selectively attend to the emotional expression of a target.

Recently, Van Dillen, Lakens, and Van den Bos (2011) used this notion to test the hypothesis that this negativity bias may depend on people's categorisation goals. They based their hypothesis on previous findings suggesting that contextual factors may modulate categorisation responses to emotional targets in line with current task requirements (Dreisbach & Haider, 2009; Van Dillen & Koole, 2009; see also Moriya, Koster, & De Raedt, in press). Although, by default, threatening faces capture and/or hold attention quickly and unintentionally, these expressions need not always bias cognitive and affective processes triggered by the face to the same degree (Zhou & Liu, 2013). Instead, vigilance for threatening faces may be subject to contextual variations in accord with people's current task goals, such that when an activated category becomes more stringent, irrelevant features, social, emotional or otherwise, are less likely to be processed (Dreisbach & Haider, 2009; Wessa, Heissler, Schönfelder, & Kanske, 2013; Zhou & Liu, 2013).

In one study, for example (Van Dillen et al., 2011; Study 3), participants quickly categorised angry and neutral faces on the basis of eye colour. The critical manipulation in this study associated eye colour with either a personality category (extraversion) or a physical category (light frequency). Thus, participants labelled the faces as either introverted or extraverted, or as having eyes absorbing high- or low-light frequency, as quickly as possible. Accordingly, participants always used eye colour to categorise the faces, but this feature either had a psychological meaning (personality) or simply indicated a physical property in the absence of any social meaning. Categorisation responses to angry faces were slower compared to neutral faces when eye colour was associated with a psychologically meaningful category (i.e., the personality category: extraversion), but not when eye colour was associated with a category that was not psychologically meaningful (i.e., the physical category: light frequency). Perhaps, then, the negativity bias only occurs when people engage in social-cognitive processing of the face.

Categorising based on the psychological meaning of the angry face requires mental state inferences about the thoughts, feelings and possible behaviour of the person making that facial display (social-cognitive processing), but categorising based on light frequency does not. People typically incorporate knowledge about facial expressions in their mental concepts of personality traits (e.g., Knutson, 1996), but they typically do not incorporate such knowledge in their mental concept of light frequency. Thus, when viewing target faces with varying expressions during the extraversion categorisation task, people will likely engage social cognition; when viewing those targets during the light frequency categorisation task, they will likely not.

In an extension of the work of Van Dillen et al. (2011), the current study uses facial electromyography (EMG) in addition to participants' response times to the face stimuli to examine the underlying social-cognitive processes further. Psychophysiological measurement provides insight into overt and covert facial expressions of emotion (see Fridlund & Izard, 1983, for a review). As such, it allows access to affective responses below conscious awareness that participants may not be able or willing to report. Facial EMG reflects the firing of motor units, enabling this measure to detect minute muscle movements (Tassinary, Cacioppo, & Vanman, 2007), and thereby providing emotion-specific information about the underlying process that could not be obtained with reaction time measurements (see Cacioppo & Petty, 1979, 1981; Dimberg, 1982, for early examples). Most importantly for now, facial reactivity, as measured via EMG, can be seen as an index of social cognition, as it has been found to relate positively to emotion recognition (Balconi, Bortolotti, & Gonzaga, 2011) and dispositional empathy (Dimberg, Andréasson, & Thunberg, 2011; Harrison, Morgan, & Critchley, 2010), as well as to both physiological indices of emotion (de Sousa et al., 2011; Heller et al., 2011) and subjective reports (Dimberg et al., 2011).


In contrast, response time measures represent the cumulative outcome of a set of cognitive (and other) processes performed on stimuli of interest, but are not themselves direct reflections of those processes (Bartholow, 2010). Although previous response time findings suggest that people selectively engage social cognition in response to facial expressions of emotion when categorising a personality trait versus a physical trait, it is unclear whether this implies that people process these facial expressions more readily when the goal of the task is psychologically meaningful than when it is not. For example, it could well be that people respond more slowly to negative (e.g., angry) expressions compared to neutral expressions during the categorisation of a psychologically meaningful dimension because they elaborate more on whether these expressions match the proposed category than during the categorisation of a physical dimension such as light frequency. This issue is, at least in part, circumvented by facial EMG methodology, which allows for the direct measurement of facial muscle activity in response to specific targets. By integrating facial EMG into our paradigm, we aim to obtain additional evidence of social-cognitive processing as reflected in facial reactivity.

In addition, we now include as emotional stimuli not only angry faces but also disgusted and happy faces, as these expressions can be linked to distinct facial muscle activity patterns (Dimberg, 1982; Vrana, 1993). Several studies, moreover, point to the notion that expressions of both anger and disgust communicate physical and/or social threat (e.g., Burklund, Eisenberger, & Lieberman, 2007). Including both anger and disgust faces allows us to examine whether the previously observed effects of categorisation goals on the negativity bias are driven by emotion-specific responses to facial expressions (as reflected by facial muscle reactivity to facial displays of anger and disgust) or simply reflect greater elaboration on the facial stimuli during the application of a personality category compared to a physical category (in which case, we should observe enhanced corrugator supercilii activity in response to the faces, regardless of emotional expression; see Yartz & Hawk, 2002, for a similar approach).

We compared facial muscle reactivity during the personality and physical categorisation tasks with facial muscle reactivity during a control condition in which participants passively viewed the emotional faces. Previous research has found that under conditions of passive viewing, people automatically display facial muscle responses to facial displays of emotion (Dimberg, 1982). As addressed above, several findings point to the notion that people by default engage in the processing of social information such as emotional cues (Pickett & Gardner, 2005; Pickett et al., 2004; Schilbach et al., 2012), unless specific task situations explicitly demand otherwise. It has been suggested, moreover, that people's facial reactions are important for assessing the emotional significance of certain information (Decety & Chaminade, 2003; Niedenthal, Winkielman, Mondillon, & Vermeulen, 2009; Schilbach, Eickhoff, Mojzisch, & Vogeley, 2008). Therefore, we expected facial reactivity to depend on the activated category, with muscle reactivity to facial displays of emotion during the personality categorisation task comparable to that during the passive viewing task, but no facial muscle reactivity, in the absence of social-cognitive processes, during the physical categorisation task. We thus compared facial reactivity during the categorisation tasks with a passive viewing control condition, which we considered the default condition. We then tested whether the two categorisation tasks would result in similar, enhanced or reduced facial muscle responses to emotional target faces compared to this control condition.

As an indication of facial reactivity, we expected, compared to the other target expressions, greater levator labii superioris activity in response to disgusted faces (Vrana, 1993), greater activity of the corrugator supercilii brow muscle in response to angry faces and greater activation of the zygomaticus major cheek muscle in response to happy faces (Dimberg & Thunberg, 1998). Based on our previous findings (Van Dillen et al., 2011), we expected this facial reactivity to occur when participants categorised these faces on the psychologically and socially meaningful dimension of extraversion, or during the passive viewing control condition, but not when they categorised these faces on the non-social and psychologically meaningless dimension of light frequency.

METHOD

Participants and design

Seventeen volunteers at New York University (eight females; Mage = 22 years, SDage = 1.80) participated in the experiment in exchange for payment ($10). The experimental design was a 4 (expression: angry, disgust, happy, neutral) × 2 (categorisation goal: personality, physical) within-participants design, with facial muscle activity (levator labii, corrugator supercilii, zygomaticus major) as an additional factor for the EMG data analyses. In addition, for the EMG data analyses, we included a passive viewing control condition that functioned as a baseline comparison, to examine spontaneous facial muscle reactivity. The main dependent variables were response times on the categorisation task and facial EMG activity in response to the target faces.

A power analysis using the G*Power computer program (Faul, Erdfelder, Lang, & Buchner, 2007) indicated that a total sample of 16 participants would be needed to detect small effects (d = .15, comparable to the effects observed in Van Dillen et al., 2011) with 85% power, using the proposed repeated measures analyses of variance with alpha set at .05. Due to technical problems, we had to discard the data of three participants from our original sample of 20, resulting in a final sample size of 17. We report how we determined all data exclusions, all manipulations and all measures in the study description provided below.
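The reported calculation was done in G*Power's repeated measures ANOVA procedure. The same logic can be illustrated with a simulation-based sketch for the simplest within-participants case, a single paired contrast; the effect size metric, noise model and simulation count below are illustrative assumptions and will not reproduce G*Power's exact output.

```python
import numpy as np
from scipy import stats

def simulated_power(n=16, dz=0.15, alpha=0.05, n_sims=2000, seed=0):
    """Monte Carlo power of a one-sample t-test on within-participant
    difference scores of standardised size dz (an illustrative stand-in
    for the repeated measures ANOVA used in the paper)."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        # per-participant difference scores with mean dz and SD 1
        diffs = rng.normal(loc=dz, scale=1.0, size=n)
        t, p = stats.ttest_1samp(diffs, 0.0)
        hits += p < alpha
    return hits / n_sims

print(simulated_power())
```

Note that a single paired contrast of this size is severely underpowered at n = 16; the 85% figure reported above rests on the full repeated measures structure (multiple correlated measurements per participant), which a one-contrast simulation does not capture.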

Procedure and equipment

To provide a meaningful context for our experiment, participants were first asked to read an online text that would be relevant for the subsequent task. The text described that people's eye colour indicates their personality, suggesting among other things that blue-eyed individuals are more introverted, whereas brown-eyed individuals are more extraverted (see for the full text: http://www.medindia.net/news/healthwatch/eye-colour-and-personality-29590-1.htm). After participants read the online text, the relation between extraversion and eye colour was repeated to ensure that they had accurately perceived this information. Following the procedure of Van Dillen et al. (2011), participants then read a Wikipedia page informing them that eye colour reflects the light frequency being absorbed, with blue eyes absorbing long frequencies and brown eyes absorbing short frequencies (see http://en.wikipedia.org/wiki/Eye_color).

Next, participants were asked to use the above information in a categorisation task of blue-eyed and brown-eyed faces displaying an angry, disgusted, happy or neutral expression. Unlike in Van Dillen et al. (2011), where the nature of the categorisation task varied between participants, all participants categorised the faces on the basis of both extraversion (personality categorisation goal) and light frequency (physical categorisation goal). In a third, control block, all participants passively viewed the faces, allowing us to compare facial muscle responses during both categorisation tasks with baseline reactivity. The experiment thus consisted of three blocks (personality, physical, passive viewing). We randomised the order of the three blocks and the stimulus presentation within each block (angry, disgusted, happy and neutral faces). At the beginning of each block, participants were informed about the nature of the task they were going to perform with the instructions: 'now you will decide for each face you see whether it is extroverted or introverted / its eyes absorb long or short light frequency / simply watch the faces'. Participants could start each block at their own pace by pressing the ENTER key.
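The block and trial randomisation described above can be sketched as follows. The identity codes and the assignment of eye colour to identities are hypothetical placeholders; the actual stimulus set is described in the Stimuli paragraph below.

```python
import random

EXPRESSIONS = ["angry", "disgusted", "happy", "neutral"]
BLOCKS = ["personality", "physical", "passive"]

def build_session(seed=0):
    """Three blocks in random order; within each block, the full
    80-picture set (20 identities x 4 expressions) in random order."""
    rng = random.Random(seed)
    # hypothetical identity codes; half arbitrarily assigned blue eyes
    # (an assumption -- the paper's actual assignment may differ)
    identities = [f"id{i:02d}" for i in range(20)]
    eye_colour = {iden: ("blue" if i < 10 else "brown")
                  for i, iden in enumerate(identities)}
    block_order = BLOCKS[:]
    rng.shuffle(block_order)
    session = []
    for block in block_order:
        trials = [(iden, eye_colour[iden], expr)
                  for iden in identities for expr in EXPRESSIONS]
        rng.shuffle(trials)
        session.append((block, trials))
    return session

session = build_session()
```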

Each block consisted of 80 trials, and each trial was announced by a row of asterisks (****) in the centre of the screen for one second. During each trial, a picture of a blue-eyed or brown-eyed angry, disgusted, happy or neutral face was displayed for two seconds. In the personality categorisation task, participants decided via a keyboard response ('a' and 'b' keys; counterbalanced between participants) whether the face was extraverted or introverted. In the physical categorisation task, participants used the same keys (again counterbalanced between participants) to decide whether the eyes of the face absorbed high- or low-light frequency. Participants' responses and response times were recorded unobtrusively.

At the end of the experiment, participants were thanked, debriefed and paid. As part of the debriefing, we posed a series of questions that comprised our manipulation check, to test whether participants had successfully associated the different eye colours with extraversion and light frequency. Participants read four statements about the information provided at the beginning of the experiment and indicated whether these were 'True' or 'False'. These statements were: People with green eyes are more likely to live in cities (False); Madame Curie won the Nobel Prize and had blue eyes (True); and the critical statements: Individuals with brown eyes are considered to be more extraverted than individuals with blue eyes (True) and Blue eyes absorb more short-frequency light, whereas brown eyes absorb more long-frequency light (False). We next explained to the participants that whereas the provided information was real, it was only partially supported by scientific research: there is indeed a relationship between eye colour and absorbed light frequency, but the relationship between eye colour and personality, and extraversion more specifically, as described in the online article, is not supported by scientific evidence.

Stimulus faces were drawn from the Karolinska Directed Emotional Faces (KDEF) database (Lundqvist, Flykt, & Öhman, 1998), which contains pictures of facial expressions posed by actors. The database has been widely used and validated (Goeleven, De Raedt, Leyman, & Verschuere, 2008). We selected pictures of 20 faces (10 males and 10 females) looking directly into the camera and displaying an angry, disgusted, happy or neutral expression. Of these faces, we created a blue-eyed and a brown-eyed version by overlaying a semi-transparent filter matching the contour of the eyes (blue-eyed: RGB = 0, 102, 204; brown-eyed: RGB = 204, 102, 0; transparency 15%; see Supplementary material). Accordingly, the total set consisted of 80 pictures: 10 male and 10 female faces with either blue or brown eyes, displaying each of the four expressions. The set was displayed three times (once during each block).
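The semi-transparent eye-colour overlay amounts to standard alpha blending at 15% opacity inside the eye region. A minimal numpy sketch, with a hypothetical rectangular mask standing in for the actual eye contour:

```python
import numpy as np

def tint_eyes(image, mask, rgb, opacity=0.15):
    """Alpha-blend a flat colour into the masked (eye) region:
    out = opacity * colour + (1 - opacity) * original."""
    out = image.astype(float).copy()
    colour = np.array(rgb, dtype=float)
    out[mask] = opacity * colour + (1.0 - opacity) * out[mask]
    return out.astype(np.uint8)

# toy 4x4 grey "face" with a 2x2 "eye" region
face = np.full((4, 4, 3), 128, dtype=np.uint8)
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True

blue_eyed = tint_eyes(face, mask, rgb=(0, 102, 204))   # blue-eyed version
brown_eyed = tint_eyes(face, mask, rgb=(204, 102, 0))  # brown-eyed version
```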

EMG recordings

Facial muscle activity was recorded from the levator labii superioris (adjacent to the nose), the corrugator supercilii (brow) and the zygomaticus major (cheek) regions, using bipolar Ag/AgCl electrodes on the left side of the face (cf. Fridlund & Cacioppo, 1986).

EMG activity was recorded using a BIOPAC system at a sampling frequency of 2000 Hz. Raw EMG data were filtered online with a high-pass filter at 10 Hz, a low-pass filter at 500 Hz and a notch filter at 50 Hz, and were amplified 5000×. These data were then integrated, rectified and recorded onto the hard disk for offline analysis. Data were converted into z-scores per participant and muscle site. Change scores were calculated using the final 500 ms of the fixation screen as a baseline, which was averaged and subtracted from the trial data. In addition, we explored the time course of the EMG responses by dividing the response during stimulus presentation (2000 ms) into four epochs of 500 ms. We first examined time as a factor and then conducted our main analyses separately for each epoch. We did not observe any effects of time (main or interaction), and analyses of the first 500 ms following stimulus onset revealed no effects of our manipulations of categorisation goal and target expression (see also Figure 1). Hence, we collapsed average EMG activity across the other three epochs and focused our analyses of the effects of categorisation goal and target expression on the remaining time window of stimulus presentation (500–2000 ms).
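The offline reduction steps described above (z-scoring per participant and site, baseline change scores from the final 500 ms of fixation and four 500 ms epochs over the 2 s of stimulus presentation) can be sketched as below. The raw-trace layout is an assumption, and the online filter settings belong to the acquisition hardware and are not reproduced here.

```python
import numpy as np

FS = 2000  # sampling rate in Hz

def reduce_trial(rectified, baseline_ms=500, epoch_ms=500):
    """Reduce one rectified single-trial EMG trace to baseline-corrected
    epoch means. `rectified` is assumed to hold the baseline samples
    followed by 2 s of stimulus presentation."""
    n_base = int(baseline_ms / 1000 * FS)
    baseline = rectified[:n_base].mean()
    stim = rectified[n_base:] - baseline          # change from baseline
    n_epoch = int(epoch_ms / 1000 * FS)
    epochs = stim.reshape(-1, n_epoch).mean(axis=1)
    return epochs

def zscore(x):
    """z-score within one participant and muscle site."""
    return (x - x.mean()) / x.std()

# toy trace: 500 ms baseline + 2000 ms stimulus at 2000 Hz
rng = np.random.default_rng(1)
trial = np.abs(rng.normal(size=int(2.5 * FS)))
epoch_means = reduce_trial(zscore(trial))
print(epoch_means.shape)  # four 500 ms epochs
```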


Our decision to exclude the first 500 ms was based on the observation, both in the literature (e.g., Dimberg, 1996; Dimberg & Petterson, 2000) and in our own data, that the corrugator supercilii muscle displays an orientation response when a stimulus appears on screen, which is comparable across target stimuli and which masks emotion-specific responses (see Figure 1(b) for an illustration of this effect). Because we wanted to analyse the three muscles in one model, we excluded the first 500 ms for the levator labii and zygomaticus major muscles as well (as has been done previously; Dimberg & Petterson, 2000; Dimberg & Thunberg, 1998). Importantly, although the effects were generally smaller, including the first 500 ms did not affect our pattern of results.

RESULTS

Manipulation check

All participants judged all four statements about the provided information correctly, suggesting that they had accurately associated eye colour with extraversion and light frequency.

Categorisation responses

Analysis of error rates (M = 5%, SD = 2%) did not yield any effects of expression or categorisation goal. Thus, there was no indication of a speed-accuracy trade-off in the responses. Rather, participants accurately associated blue eyes with introversion and the absorption of long-light frequency, and brown eyes with extraversion and the absorption of short-light frequency.

We excluded from further analyses reaction times (4%) that exceeded three standard deviations from the individual means. Response times were not skewed. Greenhouse-Geisser corrections were applied where applicable.
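The trimming rule (discard response times more than three standard deviations from a participant's own mean) can be sketched as:

```python
import numpy as np

def trim_rts(rts, criterion=3.0):
    """Keep response times within `criterion` SDs of this
    participant's own mean."""
    rts = np.asarray(rts, dtype=float)
    mu, sd = rts.mean(), rts.std()
    return rts[np.abs(rts - mu) <= criterion * sd]

# 20 plausible RTs (ms) plus one implausibly slow trial
rts = np.r_[np.linspace(760.0, 840.0, 20), 3200.0]
clean = trim_rts(rts)
print(len(rts) - len(clean))  # number of trials removed
```

Note that with very few trials a single extreme value inflates the standard deviation enough to survive the criterion, which is why the rule is applied over a participant's full trial set.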

We hypothesised that although, typically, people are slower when categorising threatening compared to happy or neutral faces, this negativity bias depends on people's categorisation goals. Categorisation responses to angry and disgusted faces should hence be slower compared to neutral or happy faces when eye colour is associated with a psychologically meaningful category (i.e., the personality category: extraversion), but not when eye colour is associated with a psychologically meaningless category (i.e., the physical category: light frequency).

To support this hypothesis, we should find a significant two-way interaction between expression and categorisation goal. A General Linear Model (GLM) repeated measures analysis of response times with the within-participants factors expression (angry, disgust, happy, neutral) and categorisation goal (personality, physical) indeed yielded a significant two-way interaction between expression and categorisation goal, F(3, 13) = 5.48, p < .02, ηp² = .26. In line with our expectations, multiple comparisons of the response times to each expression within each categorisation task revealed a negativity bias. Participants were significantly slower to categorise a face as extraverted or introverted when it displayed an angry or disgusted expression than when it displayed a happy or neutral expression (cf. Van Dillen et al., 2011). Moreover, and as predicted, there were no significant differences in response times to the different face stimuli when participants categorised these along the dimension of light frequency, ps > .28 (see Table 1). In addition, results revealed a marginal main effect of categorisation goal, F(1, 16) = 4.14, p = .06, ηp² = .21. Participants were slower during the personality categorisation task (M = 801, SD = 68) than during the physical categorisation task (M = 758, SD = 45).

Table 1. Mean response times on the face-categorisation task as a function of expression and categorisation task (standard deviations within brackets)

                  Categorisation task
Expression    Personality    Physical
Angry         907 (210)a     735 (187)c
Disgust       919 (195)a     758 (194)c
Happy         823 (189)b     751 (190)c
Neutral       839 (185)b     753 (210)c

Note: Response times are in milliseconds. Means that do not share subscripts differ within rows and columns at p < .05.


We observed no effects on response times of eye colour or target gender (ps > .150).

EMG activity

We hypothesised that facial muscle reactivity would depend on the activated category: facial muscle reactivity should occur during the psychologically meaningful personality categorisation task (concerning extraversion), but not during the psychologically meaningless physical categorisation task (concerning light frequency). In addition, we implemented a control condition, consisting of a passive viewing task, to verify that facial muscle reactivity occurred in response to our specific face stimuli. Hence, we predicted that during this passive viewing task, participants would also display facial reactions to facial expressions.

In short, we predicted facial muscle reactivity during the personality categorisation task and during passive viewing: greater levator labii activity in response to disgusted faces, greater activity of the corrugator supercilii brow muscle in response to angry faces, and greater activation of the zygomaticus major cheek muscle in response to happy faces. During the physical categorisation task, however, we did not predict facial muscle activity to differ as a function of emotional expression. To see whether categorisation goals indeed moderate facial muscle reactivity, following the overall analyses we examined the effects of expression on muscle activity separately for each categorisation goal. An overview of the time courses of the facial muscle activity (i.e., levator responses to disgust faces, corrugator responses to angry faces and zygomaticus responses to happy faces) across categorisation tasks is depicted in Figure 1. See Figure 2 for an overview of the relevant means and standard deviations across expressions and categorisation tasks.

Overall analyses

To examine the moderating influence of categorisation goal on facial muscle reactivity, we first conducted a full-factorial GLM repeated measures analysis of participants' standardised EMG recordings with the within-participants factors muscle (levator, corrugator, zygomaticus), expression (angry, disgusted, happy, neutral) and categorisation goal (personality, physical, control). This analysis yielded a marginal main effect of categorisation goal, F(2, 32) = 3.18, p = .055, ηp2 = .18, with pairwise comparisons revealing significantly reduced facial muscle activity during the physical categorisation task (M = −.016, SE = .01; p = .017) compared to the control condition (M = .013, SE = .01). Facial muscle activity in the control condition and the personality categorisation task (M = −.008, SE = .01) did not differ significantly, p = .185. The full-factorial repeated measures analysis, moreover, yielded significant two-way interactions between muscle and condition, F(4, 64) = 2.62, p = .043, ηp2 = .14, and between muscle and expression, F(6, 96) = 3.35, p < .001, ηp2 = .17, which were further qualified by a significant three-way interaction between muscle, expression and categorisation goal, F(12, 192) = 1.88, p = .039, ηp2 = .11.

To understand this overall three-way interaction, we proceeded by comparing facial muscle activity during each of the categorisation blocks separately with the control condition. Note that if facial muscle activity differed as a function of the to-be-compared conditions, this would again yield a three-way interaction between categorisation goal, expression and muscle. This was indeed the case when we compared responses during the physical categorisation task and the personality categorisation task, F(6, 96) = 2.12, p = .058, ηp2 = .12. A similar three-way interaction emerged when we compared responses during the physical categorisation task and responses during the passive viewing control condition, F(6, 96) = 2.52, p = .026, ηp2 = .14, suggesting that facial muscle activity in response to the varying emotional expressions differed during the physical categorisation task compared to both the personality categorisation task and the passive viewing control condition. Importantly, no three-way interaction was observed when we compared EMG responses during the personality categorisation task with passive viewing, F(6, 96) = 1.06, p = .395, ηp2 = .06. The latter analysis did, however, yield a significant two-way interaction between expression and muscle, F(6, 96) = 3.87, p = .002, ηp2 = .20, suggesting that muscle activity varied as a function of expression, but that this pattern was similar during the personality categorisation task and the passive viewing task.

Figure 1. Time course of EMG activity during the 2000 ms stimulus presentation window for the levator labii muscle response to disgust faces (row a), the corrugator supercilii muscle response to anger faces (row b) and the zygomaticus major muscle response to happy faces (row c) as a function of categorisation goal (passive viewing, personality, physical).

Figure 2. Standardised EMG scores for the levator labii (row a), corrugator supercilii (row b) and zygomaticus major muscle (row c) according to categorisation goal (passive viewing, personality, physical) and expression (anger, disgust, happiness, neutral). Error bars reflect standard errors.

Thus, the results of the above-mentioned analyses provide evidence that facial EMG activity in response to the emotional expressions was indeed modulated by the psychological relevance of the categorisation goal, with qualitatively different effects of target expression on facial muscle activity during the physical categorisation task compared to both the personality categorisation task and the passive viewing control condition, but no differentiation between the latter two conditions. To obtain a better understanding of how exactly the categorisation goal moderated facial muscle activity, we next analysed the effects of target expression and muscle site within each categorisation task.

Passive viewing

To establish that our target faces spontaneously elicited facial muscle activity during the passive viewing control condition, we first conducted a 3 (muscle) × 4 (expression) GLM repeated measures analysis of variance of participants' EMG recordings. This yielded a significant interaction of expression and muscle, F(6, 96) = 2.63, p = .021, ηp2 = .14.

Pairwise comparisons of the effects of expression for each muscle separately revealed that levator labii activity in response to disgust expressions was marginally greater than to both happy expressions, t(16) = 1.91, p = .070, and neutral expressions, t(16) = 1.99, p = .061. Angry expressions, moreover, elicited significantly stronger corrugator responses than happy expressions, t(16) = 2.10, p = .050. For the zygomaticus muscle, happy expressions elicited greater responses than both angry expressions, t(16) = 2.14, p = .048, and neutral expressions, t(16) = 2.34, p = .035. Thus, in line with previous findings (e.g., Dimberg, 1982), passive viewing of facial expressions of emotions indeed appeared to spontaneously elicit congruent facial muscle activity.

Personality categorisation task

To examine facial muscle activity during the personality categorisation task, we first conducted a GLM repeated measures analysis of participants' EMG recordings with muscle and expression as factors. As for the passive viewing condition, this analysis yielded a significant interaction effect of expression and muscle, F(6, 96) = 4.80, p = .040, ηp2 = .13. To understand this interaction, we next conducted pairwise comparisons of the effects of expression for each muscle separately.

Levator labii activity was greatest in response to disgust expressions: significantly greater than in response to happy expressions, t(16) = 2.68, p = .017, and neutral expressions, t(16) = 3.78, p = .002, and marginally greater than in response to angry expressions, t(16) = 1.86, p = .085. Angry expressions, moreover, elicited the strongest response in the corrugator supercilii muscle, with activity significantly greater in response to angry than to happy expressions, t(16) = 2.61, p = .019. Finally, zygomaticus major muscle activity was significantly greater in response to happy expressions than in response to angry expressions, t(16) = 2.58, p = .020. Note, though, that this difference was primarily driven by reduced muscle activity relative to baseline in response to angry faces (see Cannon et al., 2009, for a similar pattern of results). Pairwise comparisons revealed no other significant differences.

Together, the above findings thus provide evidence for facial muscle reactivity to facial expressions during the personality categorisation task comparable to that observed during the passive viewing task, especially for activity in the levator and corrugator muscles.

Physical categorisation task

As predicted, a GLM repeated measures analysis of participants' EMG recordings with muscle and expression as factors revealed no significant main or interaction effects on facial EMG activity during the physical categorisation task (ps > .20), which suggests that no facial muscle reactivity occurred. Moreover, although simple contrast analyses are not required in the absence of any higher-order effects, they did not yield any significant differences in facial muscle activity as a function of target expression during the physical categorisation task (ps > .20).


Comparisons of facial muscle activity across categorisation tasks

Another way to examine the effects of categorisation goals on facial reactions to facial expressions is by comparing specific muscle responses between categorisation goals. This was done in an additional set of t-tests.

These comparisons revealed that levator labii activity in response to disgust expressions was reduced during the physical categorisation task compared to the personality categorisation task, t(16) = 2.99, p = .009, and was marginally reduced compared to the passive viewing condition, t(16) = 1.92, p = .072, whereas activity during the latter two conditions did not differ.

Similarly, corrugator supercilii activity in response to angry expressions was significantly reduced during the physical categorisation task compared to the personality categorisation task, t(16) = 2.57, p = .020, and was marginally reduced compared to the passive viewing condition, t(16) = 1.82, p = .085, whereas, again, no significant difference was found between the latter two conditions.

Paired t-tests of zygomaticus major activity in response to happy expressions across the different conditions revealed enhanced activity during the passive viewing condition compared to both the physical categorisation task, t(16) = 2.21, p = .042, and the personality categorisation task, t(16) = 1.85, p = .082, whereas the latter two conditions did not differ.

Thus, these findings provide further evidence that categorisation goals modulated facial muscle responses to facial expressions, especially to disgusted and angry target expressions, whereas zygomaticus activity in response to happy faces was enhanced only in the passive viewing condition.

Exploratory analyses

Relationship between categorisation responses and facial EMG

We reasoned that if people's facial reactions reflect facial reactivity, and if the slowing of response times to anger and disgust faces is driven by distraction by negative emotional cues (i.e., reflects a negativity bias), there should also be a positive relation between facial muscle responses to facial expressions (e.g., corrugator activity to angry faces, and levator activity to disgust faces) and response times to these faces during the personality categorisation task, but not during the physical categorisation task. To examine whether facial reactivity predicted performance on the categorisation task, we conducted multiple regression analyses on response times with muscle activity (levator labii, corrugator supercilii, zygomaticus major) as predictors for each categorisation task and expression. During the personality categorisation task, levator labii activity predicted response times to disgust expressions, β = .51, t(16) = 2.12, p < .05. Similarly, greater corrugator supercilii activity predicted response times to angry expressions, β = .60, t(16) = 2.82, p < .02. Hence, the greater participants' muscle activity in response to the anger and disgust faces, the slower their responses to these faces (i.e., the greater their negativity bias) during the personality categorisation task. The relationship between zygomaticus activity and response times to happy expressions, however, was non-significant, and even slightly reversed, β = −.42, t(16) = 1.81, p = .10.

Importantly, we observed no other significant relations between facial reactivity and the negativity bias (e.g., between muscle activity in response to disgust faces and response times to angry faces). It thus seems that the relation between facial muscle responses and reaction times on the categorisation task was emotion specific. Additionally, we observed no significant relations between facial muscle activity and response times during the physical categorisation task (ps > .75), which further supports our notion that participants did not process the emotional relevance of the facial stimuli during this psychologically meaningless categorisation task.
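The standardized coefficients (β) reported in these regressions are OLS slopes computed on z-scored variables; for a single predictor, the standardized beta equals Pearson's r. A minimal sketch with made-up data and a hypothetical function name (not the study's analysis code):

```python
import numpy as np

def standardized_beta(y, x):
    """Standardized regression coefficient for a single predictor:
    the OLS slope after z-scoring both outcome and predictor."""
    z = lambda a: (a - a.mean()) / a.std(ddof=1)
    yz, xz = z(np.asarray(y, float)), z(np.asarray(x, float))
    return float(np.dot(xz, yz) / np.dot(xz, xz))
```

A positive beta here would correspond to the reported pattern: greater muscle activity predicting slower responses to the same faces.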

Possible effects of task duration

Note that we observed a marginal main effect of categorisation goal on participants' response times.


Although we did not observe any interactions with time as a factor in our initial analyses, and participants always viewed the target faces for two seconds regardless of their response latencies, one possibility is that because participants took longer to respond during the personality categorisation task than the physical categorisation task, they were more likely to incorporate emotional information in their decisions during this task. Hence, to rule out that any of the effects we observed could have been due to differences in task duration, we repeated all above analyses with the average response time difference between the two tasks (meanRTpersonality − meanRTphysical) as a covariate. These analyses did not reveal any additional effects.

DISCUSSION

The present research tested the notion that people's facial muscle reactivity to the emotions expressed by others is subject to contextual constraints. Specifically, emotional cues such as angry, disgusted and happy faces communicate crucial socially and psychologically meaningful information for adaptive interpersonal behaviour and thus play a central role in social cognition. We predicted that people would differentially respond to these emotional facial features when they are perceived as psychologically meaningful. However, we predicted that people would be unresponsive to the emotional expression of target faces when categorising them along a dimension for which no psychological assessment needed to be made.

In an experimental set-up adopted from Van Dillen et al. (2011), participants learned to associate eye colour with a personality category (extraversion) versus a physical category (light frequency) and used these associations in a subsequent speeded categorisation task of blue-eyed and brown-eyed faces displaying various emotional expressions. Importantly, in both categorisation tasks, participants based their responses on the same facial feature, namely eye colour. This feature, however, either communicated psychologically meaningful information about how the target might behave (whether the target was more likely to be extraverted versus introverted) or merely indicated another physical property (whether the target's irises absorbed long- or short-frequency light). Thus, whereas the visual information displayed was identical in both cases, the psychological meaning of this information differed. Using behavioural measures (response times) and facial EMG, our analyses revealed that categorisation tasks moderate participants' responsiveness to facial expressions of emotion.

As observed in previous research (Van Dillen et al., 2011), anger and disgust expressions slowed down participants' categorisation latencies compared to neutral or happy expressions on the personality categorisation task, but not on the physical categorisation task, providing further evidence for the context-sensitivity of the negativity bias. The current research, however, extended these findings in important ways by looking at emotion-specific responses more directly, using facial EMG. These facial muscle activity recordings revealed that participants responded to anger, disgust and, to a lesser extent, happy expressions with facial muscle activity (i.e., increased corrugator activity to angry faces, decreased corrugator activity to happy faces and increased levator activity to disgust faces) during both the passive viewing task and when implementing the personality category, whereas facial muscle responses to these facial expressions were completely absent during the light frequency categorisation task.

Importantly, in our current experiment we included disgusted and happy faces as emotional target stimuli in addition to angry faces. This allowed us to examine whether the previously reported effects of categorisation goals on the negativity bias (Van Dillen et al., 2011) are driven by emotion-specific responses to facial expressions (as reflected by facial muscle reactivity) or by greater elaboration (in which case, we should have observed enhanced corrugator supercilii activity during both categorisation tasks compared to the passive viewing condition, regardless of emotional expression; see Yartz & Hawk, 2002, for a similar approach). The findings seem to underline the first explanation, as we observed greater levator labii activity specifically in response to disgust faces during the personality categorisation task, and greater corrugator supercilii activity especially in response to angry faces. Moreover, facial muscle activity was comparable between the personality categorisation task and the passive viewing condition, whereas facial muscle activity was generally reduced during the physical categorisation task, providing further evidence that social cognition was less engaged when participants implemented a psychologically meaningless categorisation goal. The results of exploratory analyses, finally, pointed to a relationship between the magnitude of people's facial muscle activity and response times to disgust faces and angry faces specifically, which further suggests that participants' response latencies indeed reflected differential social-cognitive processing of the emotional relevance of the target faces.

Note, though, that whereas we found evidence for categorisation goals modulating activity of the levator labii and corrugator supercilii muscles to disgust and anger expressions, respectively, findings were less straightforward for activity of the zygomaticus major muscle. Although the same pattern of zygomaticus major activity emerged in response to the various target faces during the personality categorisation task as during the passive viewing task (i.e., relatively greater responses to happy faces compared to angry faces), activity was generally suppressed during the personality categorisation task. These findings parallel those of Cannon and colleagues (2009), who compared zygomaticus responses to angry and happy faces while participants named the target's expression or the target's colour. As in our passive viewing condition, the authors observed enhanced zygomaticus activity to happy, compared to angry, faces. Comparable to the pattern observed during our personality categorisation task, zygomaticus activity was still significantly greater in response to happy compared to angry faces during the colour-naming task, but this time the effect was driven by reduced activity relative to baseline in response to the angry faces and not by enhanced activity in response to happy faces. In addition, recent findings (Topolinski, Likowski, Weyers, & Strack, 2009) suggest that zygomaticus activity may be suppressed when the to-be-processed information is less fluent (which was probably the case during both our categorisation tasks compared to the passive viewing condition). This explanation is, however, less compatible with the selective enhancement of corrugator supercilii activity that we observed in response to angry faces during both the personality categorisation task and the passive viewing condition, but not during the physical categorisation task. Clearly, more research is needed to examine the specific effects of varying task parameters on zygomaticus major activity in response to emotional target expressions.

Facial muscle activity is often considered to reflect an automatic response to facial displays of emotion (e.g., increase in corrugator to angry faces, decrease in corrugator to happy faces, increase in zygomaticus to happy faces). Indeed, previous research has demonstrated convincingly that even subliminal presentation of emotional faces may trigger congruent muscle reactivity, suggesting that these effects may occur quickly and unintentionally (Dimberg, Thunberg, & Elmehed, 2000; Dimberg, Thunberg, & Grunedal, 2002; Rotteveel, de Groot, Geutskens, & Phaf, 2001). In line with these earlier findings, we also observed that the passive viewing of target faces of varying emotional expressions spontaneously elicited facial muscle activity. Our findings thus support the notion that, by default, people engage social cognition when processing social-emotional cues from their environment (Pickett & Gardner, 2005; Pickett et al., 2004; Schilbach et al., 2012).

However, despite their automatic nature, we also found evidence that facial muscle responses are still sensitive to contextual aspects. This is in line with attention research on the interplay of bottom-up and top-down attention (e.g., Dreisbach & Haider, 2009; Folk, Remington, & Johnston, 1992; Knudsen, 2007; Koivisto & Revonsuo, 2007). Bottom-up attention involves information selection on the basis of salient stimulus features (e.g., novelty, predictability and threat) that are likely to be important for adaptive behaviour (Bradley, 2009), whereas top-down attention involves selective processing of information relevant for the task at hand (Corbetta & Shulman, 2002; Knudsen, 2007; Lavie & De Fockert, 2005). Increasing evidence suggests that top-down attentional processes modulate more automatic bottom-up attention to emotional cues (see for a review, Rauss, Schwartz, & Pourtois, 2011; Zhou & Liu, 2013). Accordingly, people should respond to emotional faces especially in the absence of a specific task, as during our passive viewing condition, or when the activated category representation prompts attentional processing of facial expressions, as during our personality categorisation task. When the task, however, directs attention in such a way that emotional features need not be processed anymore, these automatic emotion effects should not occur, which is what we observed in the light frequency categorisation task.

Along similar lines, our findings contribute to a growing literature on the contextual influences on facial reactions to facial expressions (Cannon et al., 2009; Hess et al., 1998; see Hess & Fischer, 2013, for a review). From these findings, it becomes clear that facial reactivity oftentimes serves social functions, such that any factor that renders the nature of the task more or less social should influence the extent to which it occurs. Indeed, not only motivational factors such as attitude towards the target (Likowski et al., 2008), the association of facial targets with monetary rewards (Sims, Van Reekum, Johnstone, & Chakrabarti, 2012) and group membership of the target (Bourgeois & Hess, 2008), but also whether people explicitly name the emotional expression of target faces (e.g., happy, angry) or their (arbitrary) colour (e.g., blue, green; Cannon et al., 2009), all modulate the extent to which facial reactivity occurs. Our findings add to this literature by showing that even when people base their categorisations on the exact same realistic feature of a target face (i.e., eye colour), their facial muscle reactions to facial displays of emotion still vary depending on whether this feature is linked to a psychologically meaningful or a mere physical dimension.

Whereas participants in our study displayed facial reactivity during the personality categorisation task, as evidenced by EMG activity, they did not appear to make specific associations between certain expressions and extraversion or introversion. Note, though, that in our experiment we (1) explicitly linked extraversion to eye colour and (2) carefully balanced the expressions across the different categories. It is quite possible that in the absence of such a concrete decision rule, participants will use facial expressions as a cue for categorising personality. An important question for future research, therefore, is what would happen if participants were to categorise personality on the basis of eye colour but the frequencies with which a certain emotional expression is associated with a certain personality trait were not equally distributed across blue-eyed and brown-eyed targets. Perhaps, under these circumstances, participants would learn to associate certain personality traits with certain expressions (e.g., angry with extraversion) and base their future categorisation decisions on the presence of these expressions, especially when more reliable cues are no longer available (e.g., when eye colour cannot be determined).

To examine our hypotheses, in the current study we used static face stimuli as targets. Whereas our decision to do so was based on experimental constraints, several findings demonstrate that using dynamic rather than static faces affects facial reactivity to these faces. Dynamic compared to static faces, for example, result in better recognition rates and higher intensity and realism ratings, and trigger stronger expression-specific EMG responsivity (Weyers, Mühlberger, Hefele, & Pauli, 2006). Moreover, it can be argued that dynamic faces more closely resemble face stimuli in naturalistic settings, and their use may therefore be more ecologically valid (Hess & Blairy, 2001; Kilts, Egan, Gideon, Ely, & Hoffman, 2003). If categorisation goals moderate facial reactions to dynamic faces in a manner comparable to static faces, this would provide even stronger evidence for the contextual nature of processing facial expressions of emotion.

We should note that although we argue for a so-called 'attentional moderation' account of emotional face processing, we do not want to claim that under certain conditions facial expressions of emotion are not processed at all. Providing such evidence is difficult, and all depends on one's operationalisation of emotion processing, the power of one's facial stimuli, and the kind of measurement one implements. A recent study (Zhou & Liu, 2013), for example, examined whether attention modulated processing of emotion using a stimulus–response compatibility (SRC) paradigm. During the task, the target face was flanked by two distractor faces, which were similar or different in the dimension participants were instructed to categorise the target face on (task relevant), or in an unrelated dimension (task irrelevant). By comparing response time differences to targets flanked by similar versus dissimilar distractors, SRC effects were computed, reflecting the degree of conflict. This was done for the categorisation dimensions of colour, gender and emotion of the target faces, and as a function of task relevance. In two such flanker experiments, SRC effects were larger when the conflicts were task relevant (i.e., when the categorisation dimension was gender and the distractors differed in gender from the target face) than when they were task irrelevant (i.e., when the categorisation dimension was gender and the distractors differed in emotional expression from the target face), and this was the case for all three dimensions. Thus, in line with our current findings, conflict processing of emotion was modulated by task relevance. However, task modulation of the colour SRC effect was still significantly greater than that of the gender or emotion SRC effect, indicating that processing of salient social and/or emotional information was modulated by attention to a lesser degree than processing of non-emotional information. These findings thus seem to suggest that emotion processing can be influenced by attentional control, but that, at the same time, the salience of social and/or emotional information may still bias towards bottom-up processing, at least under conditions of conflict. Thus, combining different methods and experimental designs is needed to gain more insight into when and how top-down and bottom-up factors influence the extent to which emotional information is processed as a function of one's categorisation goal.
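An SRC effect in such a flanker design reduces to a mean response-time difference between incompatible- and compatible-flanker trials. The toy computation below (invented numbers and a hypothetical function name, not Zhou & Liu's data or code) makes this concrete:

```python
def src_effect(rt_incompatible, rt_compatible):
    """Stimulus-response compatibility (SRC) effect: mean RT on trials
    with incompatible (dissimilar) flankers minus mean RT on trials
    with compatible (similar) flankers; larger values = more conflict."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(rt_incompatible) - mean(rt_compatible)
```

A task-relevance modulation would then show up as a larger SRC effect for the task-relevant than for the task-irrelevant dimension.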

Although measuring facial reactivity thus does not allow us to assess emotion processing directly, it does allow us to establish whether a socially meaningful differentiation between emotional expressions has occurred. If facial reactivity merely reflected copying of another person's actions with no accompanying inferences about the other person's mental states (social cognition), moderation need not occur, and people would mimic emotional faces under all circumstances. If this were the case, we would probably have observed neither an effect of categorisation goal on facial reactivity nor a relation between facial EMG activity and categorisation response latencies that was restricted to the social categorisation task. Given that we did observe modulation of facial muscle responses to facial expressions, as well as converging evidence from our facial EMG and response latency measures, we think that in our study facial reactivity can be interpreted as a sign of selective processing of facial expressions of emotion, which explains the observed negativity bias. This is further supported by the cognitive psychology literature on face processing: faces are processed globally and holistically (Kimchi, 1992). Perhaps our physical (non-social) condition promoted local processing instead of the global processing witnessed by default in the passive viewing and personality (social) tasks. Future eye-tracking studies could help resolve this debate, but these are beyond the focus of the current manuscript. It is thus not our intention to settle a long-running debate here. Rather, in line with a growing literature, we argue for a moderation account, in that under some conditions, such as our physical categorisation task, facial expressions of emotion may be processed to a lesser extent.


CONCLUSION

By definition, any encounter between people occurs in a social context, and this context determines which information is extracted from the environment. Building on theories of social perception, we show in the present research that when the activated categorisation goal is socially and psychologically meaningful (i.e., allows for inferences about another person's intentions, thoughts and feelings), people pick up on the (negative) facial displays of emotion. When there is no psychologically meaningful relation between the activated categorisation goal and the emotional cues present, however, people discard emotional cues just like any other task-irrelevant information and show little or no facial muscle reactivity. The present research thus provides further evidence for the goal-dependence of selective responses to the (negative) emotions expressed by others.

Manuscript received 25 August 2014
Revised manuscript received 25 October 2014
Manuscript accepted 28 October 2014
First published online 01 December 2014

REFERENCES

Balconi, M., Bortolotti, A., & Gonzaga, L. (2011). Emotional face recognition, EMG response, and medial prefrontal activity in empathic behaviour. Neuroscience Research, 71, 251–259.

Bartholow, B. D. (2010). On the role of conflict and control in social cognition: Event-related brain potential investigations. Psychophysiology, 47, 201–212. doi:10.1111/j.1469-8986.2009.00955.x

Becker, D. V., Kenrick, D. T., Neuberg, S. L., Blackwell, K. C., & Smith, D. M. (2007). The confounded nature of angry men and happy women. Journal of Personality and Social Psychology, 92, 179–190. doi:10.1037/0022-3514.92.2.179

Bourgeois, P., & Hess, U. (2008). The impact of social context on mimicry. Biological Psychology, 77, 343–352. doi:10.1016/j.biopsycho.2007.11.008

Bradley, M. M. (2009). Natural selective attention: Orienting and emotion. Psychophysiology, 46, 1–11.

Burklund, L. J., Eisenberger, N. I., & Lieberman, M. D. (2007). The face of rejection: Rejection sensitivity moderates dorsal anterior cingulate activity to disapproving facial expressions. Social Neuroscience, 2, 238–253. doi:10.1080/17470910701391711

Cacioppo, J. T., & Petty, R. E. (1979). Attitudes and cognitive response: An electrophysiological approach. Journal of Personality and Social Psychology, 37, 2181–2199. doi:10.1037/0022-3514.37.12.2181

Cacioppo, J. T., & Petty, R. E. (1981). Electromyographic specificity during covert information processing. Psychophysiology, 18, 518–523. doi:10.1111/j.1469-8986.1981.tb01819.x

Cannon, P. R., Hayes, A. E., & Tipper, S. P. (2009). An electromyographic investigation of the impact of task relevance on facial mimicry. Cognition and Emotion, 23, 918–929. doi:10.1080/02699930802234864

Corbetta, M., & Shulman, G. L. (2002). Control of goal-directed and stimulus-driven attention in the brain. Nature Reviews Neuroscience, 3, 201–215.

Decety, J., & Chaminade, T. (2003). When the self represents the other: A new cognitive neuroscience view on psychological identification. Consciousness and Cognition, 12, 577–596. doi:10.1016/S1053-8100(03)00076-X

de Sousa, A., McDonald, S., Rushby, J., Li, S., Dimoska, A., & James, C. (2011). Understanding deficits in empathy after traumatic brain injury: The role of affective responsivity. Cortex, 47, 526–535.

Dimberg, U. (1982). Facial reactions to facial expressions. Psychophysiology, 19, 643–647. doi:10.1111/j.1469-8986.1982.tb02516.x

Dimberg, U. (1996). Facial EMG and the orienting response. Psychophysiology, 33, 34.

Dimberg, U., Andréasson, P., & Thunberg, M. (2011). Emotional empathy and facial reactions to facial expressions. Journal of Psychophysiology, 25, 26–31.

Dimberg, U., & Petterson, M. (2000). Facial reactions to happy and angry facial expressions: Evidence for right hemisphere dominance. Psychophysiology, 37, 693–696. doi:10.1111/1469-8986.3750693

Dimberg, U., & Thunberg, M. (1998). Rapid facial reactions to emotional facial expressions. Scandinavian Journal of Psychology, 39, 39–45. doi:10.1111/1467-9450.00054

Dimberg, U., Thunberg, M., & Elmehed, K. (2000). Unconscious facial reactions to emotional facial expressions. Psychological Science, 11, 86–89. doi:10.1111/1467-9280.00221

Dimberg, U., Thunberg, M., & Grunedal, S. (2002). Facial reactions to emotional stimuli: Automatically controlled emotional responses. Cognition and Emotion, 16, 449–471. doi:10.1080/02699930143000356

Dreisbach, G., & Haider, H. (2009). How task representations guide attention: Further evidence for the shielding function of task sets. Journal of Experimental Psychology: Learning, Memory, and Cognition, 35, 477–486. doi:10.1037/a0014647

Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175–191. doi:10.3758/BF03193146

Folk, C. L., Remington, R. W., & Johnston, J. C. (1992). Involuntary covert orienting is contingent on attentional control settings. Journal of Experimental Psychology: Human Perception and Performance, 18, 1030–1044. doi:10.1037/0096-1523.18.4.1030

Fridlund, A. J., & Cacioppo, J. T. (1986). Guidelines for human electromyographic research. Psychophysiology, 23, 567–589. doi:10.1111/j.1469-8986.1986.tb00676.x

Fridlund, A. J., & Izard, C. E. (1983). Electromyographic studies of facial expressions of emotions and patterns of emotions. In J. T. Cacioppo & R. E. Petty (Eds.), Social psychophysiology: A sourcebook (pp. 248–286). New York, NY: Guilford Press.

Goeleven, E., De Raedt, R., Leyman, L., & Verschuere, B. (2008). The Karolinska Directed Emotional Faces: A validation study. Cognition and Emotion, 22, 1094–1118. doi:10.1080/02699930701626582

Harrison, N., Morgan, R., & Critchley, H. (2010). From facial mimicry to emotional empathy: A role for norepinephrine? Social Neuroscience, 5, 393–400.

Heller, A. S., Greischar, L. L., Honor, A., Anderle, M. J., & Davidson, R. J. (2011). Simultaneous acquisition of corrugator electromyography and functional magnetic resonance imaging: A new method for objectively measuring affect and neural activity concurrently. NeuroImage, 58, 930–934.

Hess, U., & Blairy, S. (2001). Facial mimicry and emotional contagion to dynamic emotional facial expressions and their influence on decoding accuracy. International Journal of Psychophysiology, 40, 129–141. doi:10.1016/S0167-8760(00)00161-6

Hess, U., & Fischer, A. (2013). Emotional mimicry as social regulation. Personality and Social Psychology Review, 17, 142–157. doi:10.1177/1088868312472607

Hess, U., Philippot, P., & Blairy, S. (1998). Facial reactions to emotional facial expressions: Affect or cognition? Cognition and Emotion, 12, 509–531. doi:10.1080/026999398379547

Hess, U., Sabourin, G., & Kleck, R. E. (2007). Postauricular and eyeblink startle responses to facial expressions. Psychophysiology, 44, 431–435. doi:10.1111/j.1469-8986.2007.00516.x

Hugenberg, K., & Bodenhausen, G. V. (2004). Category membership moderates the inhibition of social identities. Journal of Experimental Social Psychology, 40, 233–238. doi:10.1016/S0022-1031(03)00096-9

Kilts, C. D., Egan, G., Gideon, D. A., Ely, T. D., & Hoffman, J. M. (2003). Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. NeuroImage, 18, 156–168. doi:10.1006/nimg.2002.1323

Kimchi, R. (1992). Primacy of wholistic processing and global/local paradigm: A critical review. Psychological Bulletin, 112, 24–38. doi:10.1037/0033-2909.112.1.24

Knudsen, E. I. (2007). Fundamental components of attention. Annual Review of Neuroscience, 30, 57–78. doi:10.1146/annurev.neuro.30.051606.094256

Knutson, B. (1996). Facial expressions of emotion influence interpersonal trait inferences. Journal of Nonverbal Behavior, 20, 165–182. doi:10.1007/BF02281954

Koivisto, M., & Revonsuo, A. (2007). How meaning shapes seeing. Psychological Science, 18, 845–849. doi:10.1111/j.1467-9280.2007.01989.x

Lavie, N., & De Fockert, J. (2005). The role of working memory in attentional capture. Psychonomic Bulletin & Review, 12, 669–674. doi:10.3758/BF03196756

Likowski, K. U., Mühlberger, A., Seibt, B., Pauli, P., & Weyers, P. (2008). Modulation of facial mimicry by attitudes. Journal of Experimental Social Psychology, 44, 1065–1072. doi:10.1016/j.jesp.2007.10.007

Lundqvist, D., Flykt, A., & Öhman, A. (1998). The Karolinska Directed Emotional Faces (KDEF). Stockholm: Department of Neurosciences, Karolinska Hospital.

Moriya, J., Koster, E. H. W., & De Raedt, R. (in press). The influence of working memory on the anger superiority effect. Cognition and Emotion.

Niedenthal, P. M., Winkielman, P., Mondillon, L., & Vermeulen, N. (2009). Embodied emotion concepts. Journal of Personality and Social Psychology, 96, 1120–1136. doi:10.1037/a0015574

Oosterhof, N., & Todorov, A. (2008). The functional basis of face evaluation. Proceedings of the National Academy of Sciences, 105, 11087–11092. doi:10.1073/pnas.0805664105


Pickett, C. L., & Gardner, W. L. (2005). The social monitoring system: Enhanced sensitivity to social cues as an adaptive response to social exclusion. In K. Williams, J. Forgas, & W. von Hippel (Eds.), The social outcast: Ostracism, social exclusion, rejection, and bullying (pp. 213–226). New York, NY: Psychology Press.

Pickett, C. L., Gardner, W. L., & Knowles, M. (2004). Getting a cue: The need to belong and enhanced sensitivity to social cues. Personality and Social Psychology Bulletin, 30, 1095–1107. doi:10.1177/0146167203262085

Rauss, K., Schwartz, S., & Pourtois, G. (2011). Top-down effects on early visual processing in humans: A predictive coding framework. Neuroscience and Biobehavioral Reviews, 35, 1237–1253. doi:10.1016/j.neubiorev.2010.12.011

Rotteveel, M., de Groot, P., Geutskens, A., & Phaf, R. H. (2001). Stronger suboptimal than optimal affective priming? Emotion, 1, 348–364. doi:10.1037/1528-3542.1.4.348

Schilbach, L., Bzdok, D., Timmermans, B., Fox, P. T., Laird, A. R., Vogeley, K., & Eickhoff, S. B. (2012). Introspective minds: Using ALE meta-analyses to investigate commonalities in the neural correlates of socio-emotional processing and unconstrained cognition. PLoS ONE, 7, e30920. doi:10.1371/journal.pone.0030920.t006

Schilbach, L., Eickhoff, S. B., Mojzisch, A., & Vogeley, K. (2008). What's in a smile? Neural correlates of facial embodiment during social interaction. Social Neuroscience, 3, 37–50. doi:10.1080/17470910701563228

Sims, T. B., Van Reekum, C. M., Johnstone, T., & Chakrabarti, B. (2012). How reward modulates mimicry: EMG evidence of greater automatic facial mimicry of more rewarding faces. Psychophysiology, 49, 998–1004. doi:10.1111/j.1469-8986.2012.01377.x

Tassinary, L. G., Cacioppo, J. T., & Vanman, E. J. (2007). The skeletomotor system: Surface electromyography. In J. T. Cacioppo & L. G. Tassinary (Eds.), Handbook of psychophysiology (pp. 267–302). New York, NY: Cambridge University Press.

Thornhill, R., & Gangestad, S. W. (1999). Facial attractiveness. Trends in Cognitive Sciences, 3, 452–460. doi:10.1016/S1364-6613(99)01403-5

Topolinski, S., Likowski, K. U., Weyers, P., & Strack, F. (2009). The face of fluency: Semantic coherence automatically elicits a specific pattern of facial muscle reactions. Cognition and Emotion, 23, 260–271. doi:10.1080/02699930801994112

Van Dillen, L. F., & Derks, B. (2012). Working memory load reduces facilitated processing of threatening faces: An ERP study. Emotion, 12, 1340–1349. doi:10.1037/a0028624

Van Dillen, L. F., & Koole, S. L. (2009). How automatic is "automatic vigilance"? The role of working memory in attentional interference of negative information. Cognition and Emotion, 23, 1106–1117. doi:10.1080/02699930802338178

Van Dillen, L. F., Lakens, D., & Van den Bos, K. (2011). At face value: Categorization goals modulate vigilance for angry faces. Journal of Experimental Social Psychology, 47, 235–240. doi:10.1016/j.jesp.2010.10.002

Van Honk, J., Tuiten, A., De Haan, E., Van den Hout, M., & Stam, H. (2001). Attentional biases for angry faces: Relationships to trait anger and anxiety. Cognition and Emotion, 15, 279–297. doi:10.1080/0269993004200222

Vrana, S. R. (1993). The psychophysiology of disgust: Differentiating negative emotional context with facial EMG. Psychophysiology, 30, 279–286. doi:10.1111/j.1469-8986.1993.tb03354.x

Wessa, M., Heissler, J., Schönfelder, S., & Kanske, P. (2013). Goal-directed behavior under emotional distraction is preserved by enhanced task-specific activation. Social Cognitive and Affective Neuroscience, 8, 305–312. doi:10.1093/scan/nsr098

Weyers, P., Mühlberger, A., Hefele, C., & Pauli, P. (2006). Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology, 43, 450–453. doi:10.1111/j.1469-8986.2006.00451.x

Yartz, A. R., & Hawk, L. W. (2002). Addressing the specificity of affective startle modulation: Fear versus disgust. Biological Psychology, 59, 55–68. doi:10.1016/S0301-0511(01)00121-1

Zhou, P., & Liu, X. (2013). Attentional modulation of emotional conflict processing with flanker tasks. PLoS ONE, 8, e60548.
