
Featural processing in recognition of emotional facial expressions



Cognition and Emotion. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/pcem20

Featural processing in recognition of emotional facial expressions. Olivia Beaudry (a), Annie Roy-Charland (a), Melanie Perron (a), Isabelle Cormier (b) & Roxane Tapp (a)

(a) Department of Psychology, Laurentian University, Sudbury, Ontario, Canada; (b) Ecole de Psychologie, Université de Moncton, Moncton, New Brunswick, Canada. Published online: 18 Sep 2013.

To cite this article: Olivia Beaudry, Annie Roy-Charland, Melanie Perron, Isabelle Cormier & Roxane Tapp (2014) Featural processing in recognition of emotional facial expressions, Cognition and Emotion, 28:3, 416-432, DOI: 10.1080/02699931.2013.833500

To link to this article: http://dx.doi.org/10.1080/02699931.2013.833500


Featural processing in recognition of emotional facial expressions

Olivia Beaudry (1), Annie Roy-Charland (1), Melanie Perron (1), Isabelle Cormier (2), and Roxane Tapp (1)

(1) Department of Psychology, Laurentian University, Sudbury, Ontario, Canada; (2) Ecole de Psychologie, Université de Moncton, Moncton, New Brunswick, Canada

The present study aimed to clarify the role played by the eye/brow and mouth areas in the recognition of the six basic emotions. In Experiment 1, accuracy was examined while participants viewed partial and full facial expressions; in Experiment 2, participants viewed full facial expressions while their eye movements were recorded. Recognition rates were consistent with previous research: happiness was highest and fear was lowest. The mouth and eye/brow areas were not equally important for the recognition of all emotions. More precisely, while the mouth was revealed to be important in the recognition of happiness and the eye/brow area in that of sadness, results were not as consistent for the other emotions. In Experiment 2, consistent with previous studies, the eyes/brows were fixated for longer periods than the mouth for all emotions. Again, variations occurred as a function of the emotion, the mouth having an important role in happiness and the eyes/brows in sadness. The general pattern of results for the other four emotions was inconsistent between the experiments as well as across different measures. The complexity of the results suggests that the recognition of emotional facial expressions cannot be reduced to simple featural or holistic processing for all emotions.

Keywords: Facial expressions; Recognition task; Eye-tracking; Featural processing.

Numerous papers have documented the functional value of the face in transmitting information about the emotional state of individuals (Ekman, 2003; Russell & Fernández-Dols, 1997). Available data reveal that adults recognise emotions such as happiness, anger, sadness, disgust, fear and surprise at levels superior to those attributable to chance (e.g., Ekman, 2003; Ekman & Friesen, 1971; Izard, 1971). Furthermore, the recognition of these emotions is accomplished rapidly, efficiently and with minimal cognitive resources (Tracy & Robins, 2008). However, recognition rates vary as a function of the emotion, happiness being typically identified more accurately, followed by expressions of anger and surprise, whereas the expressions of disgust, sadness and fear seem to be more difficult to recognise (e.g., Ekman & Friesen, 1971, 1986; Gosselin & Kirouac, 1995; Izard, 1994; Tracy & Robins, 2008; Vassallo, Cooper, & Douglas, 2009).

Correspondence should be addressed to: Olivia Beaudry, Department of Psychology, Laurentian University, 935 Ramsey Lake Road, Sudbury, Ontario, Canada, P3E 2C6. E-mail: [email protected]

This research was supported by a CNFS grant to MP and AR-C and by an NSERC Discovery grant to AR-C.

We thank Krystle Lee-Turgeon, Kaylee Eady and Macha Roy for their assistance in running participants and in data coding.

A major issue that has yet to be elucidated is how emotional facial expression recognition is governed by perceptual mechanisms. On the one hand, some research has suggested that the information processing during emotional recognition implies a holistic processing of the expressive face, such as the processing of the structural relationships between features (e.g., Calder, Young, Keane, & Dean, 2000; Derntl, Seidel, Kainz, & Carbon, 2009; Prkachin, 2003). On the other hand, research has also supported the idea that facial expression processing relies on individual features (e.g., Calvo, Nummenmaa, & Avero, 2010; Jack, Blais, Scheepers, Schyns, & Caldara, 2009; Martin, Slessor, Allen, Phillips, & Darling, 2012; Smith, Cottrell, Gosselin, & Schyns, 2005). More precisely, certain areas of the face (e.g., the mouth area) would attract attention more than other areas. This ability to attract attention explains why certain expressions are differentiated more efficiently and rapidly within an array of neutral stimuli (e.g., Calvo & Nummenmaa, 2008). However, featural processing within the context of emotional recognition is not well documented. For instance, while some studies provide support for the importance of featural processing in recognition of emotions in ageing (Murphy & Isaacowitz, 2010; Sullivan, Ruffman, & Hutton, 2007; Wong, Cronin-Golomb, & Neargarder, 2005) and in clinical populations (Baron-Cohen, Wheelwright, & Jolliffe, 1997; Horley, Williams, Gonsalvez, & Gordon, 2004; Williams, Loughland, Gordon, & Davidson, 1999), studies with regard to non-clinical populations suffer from limitations that do not allow a clear understanding of the role of such processing as a function of the emotions. For example, because not all basic emotions are presented and, typically, the omitted emotions are those that are involved in confusions, results do not provide a clear picture of differences in processing between emotions (e.g., Eisenbarth & Alpers, 2011; Nusseck, Cunningham, Wallraven, & Bülthoff, 2008). Within the present study, the main goal was to examine the involvement of perceptual processes in the recognition of emotional facial expressions and thus examine to what extent certain regions of the face are important in the recognition of each emotion.

Featural and holistic accounts

According to the holistic account, structural relations established between different components of the face (e.g., the relative shape and position of the mouth to the eyes) are important in the identification of emotional facial expressions (e.g., Derntl et al., 2009). For instance, Derntl et al. (2009) compared accuracy and response time obtained when facial expressions were presented in their usual direction and when they were presented upside down. This inversion preserves the specific features, but prevents processing of the entire configuration. Derntl et al.'s (2009) study revealed an inversion effect for anger, disgust and sadness. Confusions were found during the recognition task in the inverted condition, except for happiness. Thus, identifying facial emotional expressions is affected by inversion, suggesting a holistic processing. In line with these findings, Calvo et al. (2010) observed an inversion effect for all basic emotions; however, happiness, surprise and disgust were less affected by the inversion. A general observation from the above-mentioned studies is that decoding of the different emotional expressions is influenced unevenly by inversion, suggesting that featural processing of facial expressions also plays a role.

According to the featural account, the effectiveness of processing a facial expression is based on single facial features, such as the pulling of the corners of the mouth or the lowering of the eyebrows (Calvo & Nummenmaa, 2008). The shape of a specific feature is particularly distinctive in certain expressions. It is thus possible that this feature has acquired the emotional value of the facial expression in which it typically appears (see Martin et al., 2012). For example, it is suggested that the shape of the mouth as a smile, because it is uniquely associated with expressions of happiness (Calvo & Marrero, 2009; Calvo et al., 2010), could be used as a cue to reduce the time necessary to identify the expression (Leppänen & Hietanen, 2007; Martin et al., 2012). The objective of the current studies is to clarify the role of certain areas of the face containing emotionally relevant features with respect to the recognition process of the six basic emotions.

Role of specific areas

The mouth and eye/brow areas have been documented as relevant with regard to emotional information (Ekman & Friesen, 1969). Theoretical proposals of emotional facial expressions suggest that both of these regions comprise numerous muscular activations conveying emotional content (Ekman & Friesen, 1978). For example, fear is associated with the opening of the eyes, the raising of the inner and outer parts of the eyebrows and the contracting of the brows. Activation of other muscles not directly in the eye/brow region can also produce appearance changes in this region. For example, the wrinkling of the nose in disgust will often passively lower the brows. The lifting of the cheeks in happiness narrows the eyes, creates bulges underneath the eyes and crow's feet at their outer corners. The mouth region is also associated with extensive appearance changes associated with emotional expressions. For fear, changes include the opening of the mouth as well as the stretching of the corners of the mouth outward; and for anger, the tightening of the lips and the contraction of the chin produce changes in the mouth area as well.

The importance of the mouth and eye/brow areas in attracting attention has been documented in studies using a visual search paradigm. In this paradigm, the stimulus displays involve one discrepant emotional facial expression among six neutral expressions and participants are asked to confirm the presence of an expressive face without specifying the emotion expressed. To determine the sufficiency of the mouth and eye/brow areas, Calvo and Nummenmaa (2008) presented only the eye/brow or only the mouth area and compared it to the full-face condition. An area was considered sufficient if its individual presentation was as efficient and accurate as the presentation of the entire face. Participants were more accurate in differentiating anger, disgust and fear, followed by sadness and surprise; and accuracy was poorest for happiness when the eye/brow area was presented alone. For the sufficiency criteria, participants were less effective in differentiating emotional facial expressions when the eye/brow area was presented alone compared to the entire face, with happiness, surprise and fear being the most affected. These results suggest that the eye/brow area is not sufficient to detect emotional facial expressions, and happiness seems to be the most affected when this area is presented alone. When the mouth was the only area presented, the accuracy rate was highest for happiness, surprise, disgust and fear compared to anger, and accuracy was the worst for sadness. The mouth was as effective as the whole face in the detection of all emotions but sadness.

Calvo and Nummenmaa (2008) also explored the necessity of the mouth and eye/brow areas. To determine whether a specific area is necessary for differentiating an emotional expression, they presented faces with either the eye/brow or the mouth area hidden. If an area is necessary, its removal should affect accuracy, compared to when the entire face is presented. Results revealed that by hiding the mouth, the advantage of detection for happiness and surprise disappeared, suggesting that the mouth is necessary for the detection of these emotional expressions. When the eye/brow area was hidden, results revealed that it was necessary for differentiating sadness. For the other emotions, hiding the eyes/brows seemed to have little impact. It therefore appears that the mouth area is of greater importance compared to the eye/brow area, being not only sufficient but necessary as well. Although interesting, these results are limited to the visual search task. The properties of particular areas to attract attention may not generalise to the recognition of the specific emotional facial expression.

With respect to the recognition of emotional facial expressions, the sufficiency and necessity of the mouth and eye/brow areas are not as clearly documented as a function of the emotions. Presentation of the eye/brow and the mouth areas as well as the full-face condition was used in Sullivan et al.'s (2007) study in order to explore their roles in a recognition task. However, results are difficult to interpret with regard to the role of regions in recognising specific emotions because the main purpose was to compare young adults with older adults. Nevertheless, their findings indicated that the eyes/brows were more important than the mouth in the recognition of anger, fear and sadness. The mouth was more important than the eyes/brows in recognising disgust. Both of these areas were equally important in recognising happiness and surprise. Although this study was informative on the role of the eye/brow and mouth areas in a recognition task, no analyses were conducted in order to compare these areas to the full-face condition, meaning that the sufficiency of these areas was not fully explored.

Calder et al. (2000) explored the sufficiency criterion by presenting the complete, upper half and lower half of facial stimuli during a recognition task. The upper half of the face was sufficient for decoding anger, fear and sadness, while the lower half was sufficient for decoding happiness and disgust. The two halves were sufficient for decoding surprise. The processing of both halves of the face thus seems to show a different sufficiency importance as a function of the emotion. Furthermore, differences were observed in the results between this study and that of Sullivan et al. (2007) for certain emotions, specifically for happiness. Consequently, variations between the studies do not allow a clear understanding of the impact of the eye/brow and mouth areas in the recognition process. In addition, these authors did not explore the necessity of these same areas.

To the best of our knowledge, only a single study has explored the necessity criterion in recognition. Nusseck et al. (2008) investigated the necessity of the mouth and eye/brow areas by using dynamic facial expressions for which these regions were neutralised (frozen in order to prevent their movement). The mouth was necessary for the recognition of happiness and surprise, and the eyes/brows and the mouth were both important in recognising sadness and disgust.

However, this study had important limitations. First, it did not use all six basic emotions, which is significant since the two emotions omitted were anger and fear, which are involved in confusions (Gosselin & Kirouac, 1995). Most importantly, their methodology might also have obscured the recognition process: rather than hiding an area of the face, they prevented the movement of its parts. While this strategy was used to increase the ecological value of the stimuli, neutralising an area could have transmitted other information rather than simply limiting the information available. In fact, this strategy could have looked, from a decoder's point of view, like an attempt to control emotional expressions (e.g., hiding negative emotion; Ekman, Friesen, & O'Sullivan, 1988; Porter & ten Brinke, 2008). Because the literature suggests that, in general, adults seem to have difficulty identifying the emotion expressed when a person tries to conceal an emotion (e.g., Ekman, O'Sullivan, & Frank, 1999), this could be another explanation for the low accuracy in their study. It thus remains unclear whether the results are in fact attributable to the necessity of these regions or to the complexity of the task associated with the nature of their stimuli.

In sum, available data suffer from limitations that do not allow a systematic examination of the role of the mouth and eye/brow areas in recognition, as was done in visual search. Regarding the sufficiency of these areas, inconsistencies in the results concerning each emotion require a more comprehensive examination. Further studies are also needed to explore the necessity of these regions in the recognition of facial expressions as a function of the six emotions while clearly hiding information. The main goal of the present study is to clarify the role of the mouth and eyes/brows in an emotional facial recognition task.

EXPERIMENT 1

In order to explore the sufficiency of the mouth and eye/brow areas in a recognition task, stimuli containing only these areas were presented to participants who were instructed to identify the emotion expressed. Based on visual search data (Calvo & Nummenmaa, 2008), a sufficiency criterion could be expected for the mouth area for most emotions. Participants should remain accurate at identifying most emotions when the mouth area is presented alone, but not when the eye/brow area is presented alone. However, based on Calder et al.'s (2000) results, it can be predicted that the mouth is sufficient for the recognition of happiness and disgust and the eyes/brows are sufficient to identify sadness, anger and fear. No difference is to be expected between the mouth and eye/brow regions for surprise. Sullivan et al. (2007) observed a similar pattern of results, with the exception of the importance of the mouth for happiness.

To examine the necessity of an area, stimuli with the areas hidden were compared to the full-face presentations. If one part of the face is important for recognising the emotion, its removal would impair the recognition process. Predictions with regard to necessity are also mostly based on previous research in visual search (Calvo & Nummenmaa, 2008). It was observed that the mouth was necessary for all emotions but sadness, for which the eyes/brows had that role. In recognition, Nusseck et al. (2008) also found that the mouth was necessary to recognise happiness and surprise and that both the mouth and eyes/brows were necessary to recognise sadness and disgust.

In the current experiment, both the presentation of a single feature and its removal tap into featural processing, but they also allow an examination of holistic processing. In fact, when the whole-face presentation is modified, either by presenting a single feature or by removing a feature, we prevent holistic processing (Calder et al., 2000; Calder & Jansen, 2005; Calvo & Nummenmaa, 2008; Maurer, Le Grand, & Mondloch, 2002). Consequently, if emotional facial expression recognition is governed by holistic processing, we stipulate that neither of the areas would be sufficient for the recognition of the emotions but both would be necessary for all.
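The two criteria can be summarised as decision rules. The following is a minimal sketch, not taken from the paper: the function names are ours, and `significantly_lower` stands in for the simple main effect tests reported in the Results, approximated here by a two-sample t-test purely for illustration.

```python
from scipy import stats

def significantly_lower(scores_a, scores_b, alpha=0.05):
    """True if accuracy in scores_a is reliably below scores_b (illustrative test)."""
    t, p = stats.ttest_ind(scores_a, scores_b)
    return t < 0 and p < alpha

def is_sufficient(acc_area_alone, acc_full_face):
    # Sufficient: presenting the area alone costs no accuracy
    # relative to the entire face.
    return not significantly_lower(acc_area_alone, acc_full_face)

def is_necessary(acc_area_hidden, acc_full_face):
    # Necessary: hiding the area lowers accuracy
    # relative to the entire face.
    return significantly_lower(acc_area_hidden, acc_full_face)
```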

Method

Participants. Thirty-two students (Mage = 20 years; 28 women) participated in the first part of this experiment. Participants were exposed to the four conditions: the mouth only, the eyes/brows only, the eyes/brows hidden and the mouth hidden. A second group of 32 participants (Mage = 22 years; 29 women) was presented with the entire face condition.

Material. Stimuli were 24 Caucasian emotional facial expressions from the Japanese and Caucasian Facial Expressions of Emotion (JACFEE; Matsumoto & Ekman, 1989). The six basic emotions (anger, disgust, fear, surprise, sadness and happiness) were represented twice for each gender (four occurrences of each emotion). For the entire face condition, participants were presented with these 24 pictures. The original pictures were modified to obtain the other four conditions: eyes/brows only, mouth only, eyes/brows hidden and mouth hidden. The hidden areas and the single areas presented were exactly the same size. The original position was respected for the images showing only the eye/brow or mouth area (see Figure 1 for examples of stimuli). Condition order was counterbalanced between participants using a Latin square. Within a condition, the pictures were randomised.
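As a hypothetical illustration of this counterbalancing step (the paper does not report which Latin square was used), a cyclic square assigns each successive participant a rotated condition order, with picture order randomised within each condition:

```python
import random

conditions = ["mouth only", "eyes/brows only", "eyes/brows hidden", "mouth hidden"]

def latin_square_order(participant_index, conditions):
    """Row of a cyclic 4 x 4 Latin square: each condition appears once
    in every row and once in every serial position across four rows."""
    n = len(conditions)
    shift = participant_index % n
    return [conditions[(shift + i) % n] for i in range(n)]

def trial_list(pictures, order):
    """Randomise picture order within each condition, as described above."""
    return [(cond, random.sample(pictures, len(pictures))) for cond in order]

for p in range(4):
    print(p, latin_square_order(p, conditions))
```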

Procedure. Participants were seated 60 cm from a computer monitor. For every condition, they were informed that they would be exposed to a total of 24 pictures and were required to identify the emotion. The pictures were presented one at a time and participants controlled viewing time. Once they pressed the mouse button, a white screen appeared. They were then asked to select an emotion from a list of ten emotions (happiness, disgust, anger, surprise, sadness, fear, contempt, guilt, shame and interest). The participants also had the option to propose an emotion that was not in the list in order to avoid forced attribution to the six proposed emotions (Russell, 1993). After the response was recorded, a new picture was presented.


Results

For all analyses, an alpha level of .05 was used unless otherwise indicated. In all conditions, accuracy was computed by dividing the number of correct responses by the number of occurrences of the target emotions. It should be noted that, in addition to the analyses presented in this section, individual accuracy levels were compared to levels attributable to chance. The chance level used in the analyses was a proportion of .10 (one of the ten response options). Since all analyses revealed levels significantly higher than chance, for the sake of brevity, they will not be included in the results section. When comparisons were made between the individual areas (mouth only and eyes/brows only) as well as when a specific area was hidden (mouth hidden and eyes/brows hidden), 2 × 6 repeated-measures analyses of variance (ANOVAs) were computed. In effect, the same participants were exposed to these four conditions and to the six basic emotions. However, when comparisons were made with the full face, 2 × 6 mixed-model ANOVAs were used because different participants were exposed to the full-face condition.
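As an illustrative sketch of this accuracy measure and the chance comparison (the naming is ours, and the paper does not specify which test against chance was used, so scipy's binomial test serves only as a stand-in):

```python
from scipy.stats import binomtest

def accuracy(n_correct, n_occurrences=4):
    """Proportion correct; each emotion occurred four times per condition."""
    return n_correct / n_occurrences

# Hypothetical participant: 3 of 4 happiness trials correct in one condition.
print(accuracy(3))  # 0.75

# Is 21 correct out of 24 trials in a condition above the chance proportion of .10?
result = binomtest(21, n=24, p=0.10, alternative="greater")
print(result.pvalue)
```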

Sufficiency criteria. For the comparison between the eye/brow area only and the entire face (Table 1), results revealed a main effect for the Emotion, F(5, 310) = 12.07, ηp² = .16, and a main effect for the Area Condition, F(1, 62) = 50.63, ηp² = .45. The interaction between the Area Condition and the Emotion was also significant, F(5, 310) = 3.47, ηp² = .05. For simple main effect tests, Dunn's correction was applied to alpha levels; thus, to be considered significant, the p-value had to be smaller than .01. Simple main effect tests revealed higher accuracy at identifying fear, F(1, 372) = 39.83, disgust, F(1, 372) = 27.85, happiness, F(1, 372) = 20.22, anger, F(1, 372) = 9.47, and surprise, F(1, 372) = 6.78, when the entire face was presented, compared to when the eye/brow area was presented, but not for sadness, F(1, 372) = 3.58, p = .06. Both for the entire face condition and for the eye/brow area only, significant differences were also obtained between emotions, F(5, 310) = 3.15 and F(5, 310) = 12.40, respectively. For the entire face condition, post hoc tests (Tukey Honestly Significant Difference test; HSD) revealed that the proportion of accurate responses for happiness was significantly higher than for fear. For the eye/brow area condition, post hoc tests (Tukey HSD) revealed that the proportion of accurate responses for fear was significantly lower than for all emotions, except disgust.

When comparing the mouth only to the entire face (Table 1), results revealed a main effect for the Emotion, F(5, 310) = 27.09, ηp² = .30, a main effect for the Area Condition, F(1, 62) = 59.85, ηp² = .49, and a significant interaction, F(5, 310) = 9.89, ηp² = .14. Again, for simple main effects, Dunn's correction was applied to alpha levels (α = .01). Simple main effect tests revealed a higher accuracy in identifying fear, F(1, 372) = 62.98, sadness, F(1, 372) = 41.11, surprise, F(1, 372) = 15.99, and anger, F(1, 372) = 3.89, when the entire face was presented compared to when the mouth area was presented. No difference between conditions was observed for disgust, F(1, 372) = 1.16, p = .28, or for happiness, F < 1. Furthermore, for the mouth only condition, significant differences were observed between emotions, F(5, 310) = 33.36. Post hoc tests (Tukey HSD) revealed that the proportion of accurate responses for happiness was significantly higher than for all other emotions. The proportion of accurate responses for disgust, anger and surprise was also significantly higher than for sadness and fear.

Figure 1. Examples of stimuli. JACFEE. © [David Matsumoto]. Reproduced by permission of David Matsumoto.

Finally, when comparing the eye/brow only to the mouth only condition (Table 1), results revealed a main effect for the Emotion, F(5, 155) = 30.04, ηp² = .49, and a significant interaction, F(5, 155) = 12.29, ηp² = .28. However, the main effect for the Area Condition was not significant, F(1, 31) = 2.32, p = .14. Simple main effect tests (α = .01) revealed that accuracy was significantly higher for the mouth only than for the eye/brow only area for disgust, F(1, 186) = 20.40, and happiness, F(1, 186) = 18.87, and significantly higher for the eye/brow only than for the mouth only area for sadness, F(1, 186) = 15.27. No other difference was observed (for the sake of brevity: all Fs < 1.96, ps > .16).

Necessity criteria. For the comparison between the face without the eye/brow area and the entire face (Table 1), results revealed a main effect for the Emotion, F(5, 310) = 27.19, ηp² = .31, a main effect for the Area Condition, F(1, 62) = 36.14, ηp² = .37, and a significant interaction, F(5, 310) = 11.20, ηp² = .15. Simple main effect tests (α = .01) revealed that accuracy was higher in the entire face condition than for the face without the eye/brow area for sadness, F(1, 372) = 68.92, fear, F(1, 372) = 17.78, and anger, F(1, 372) = 11.74. No significant differences were observed for the other emotions (for the sake of brevity: all Fs < 1). Furthermore, a significant difference was obtained between emotions, F(5, 310) = 34.95. Post hoc tests (Tukey HSD) revealed that, when the eye/brow area was removed, the proportion of accurate responses was significantly lower for sadness than for all emotions except fear, and the accuracy level for fear and anger was also significantly lower than for surprise, disgust and happiness.

When comparing the face without the mouth to the full face (Table 1), results revealed a main effect for the Emotion, F(5, 310) = 12.32, ηp² = .17, a main effect for the Area Condition, F(1, 62) = 44.37, ηp² = .42, and a significant interaction, F(5, 310) = 2.29, ηp² = .04. Simple main effect tests (α = .01) revealed that accuracy was significantly higher for the entire face condition than for the face without the mouth area for fear, F(1, 372) = 26.51, disgust, F(1, 372) = 21.73, sadness, F(1, 372) = 12.45, and happiness, F(1, 372) = 11.57. However, no significant difference was observed for anger, F(1, 372) = 3.35, p = .07, or for surprise, F(1, 372) = 2.68, p = .10. Furthermore, a significant difference was found between emotions when the mouth was hidden, F(5, 310) = 11.28. Post hoc tests (Tukey HSD) revealed that the proportion of accurate responses for surprise was significantly higher than for sadness, disgust and fear. The proportion of accurate responses for happiness and anger was also significantly higher than for disgust and fear.

Table 1. The proportion of correct responses (standard deviations in parentheses) as a function of the emotion and of the entire face, the area presented or the hidden area for Experiment 1, and the entire face with intensity level collapsed for Experiment 2

Area presented       Anger        Disgust      Happiness    Fear         Surprise     Sadness

Experiment 1
Eyes/brows           0.59 (0.32)  0.43 (0.30)  0.66 (0.31)  0.33 (0.26)  0.70 (0.27)  0.67 (0.30)
Mouth                0.68 (0.24)  0.72 (0.24)  0.93 (0.13)  0.28 (0.24)  0.63 (0.24)  0.42 (0.29)
Without eyes/brows   0.59 (0.32)  0.77 (0.21)  0.96 (0.09)  0.49 (0.32)  0.83 (0.18)  0.30 (0.26)
Without mouth        0.68 (0.28)  0.49 (0.28)  0.74 (0.23)  0.42 (0.28)  0.77 (0.22)  0.57 (0.29)
Entire face          0.80 (0.25)  0.78 (0.29)  0.95 (0.12)  0.74 (0.28)  0.87 (0.18)  0.80 (0.21)

Experiment 2
Entire face          0.51 (0.20)  0.54 (0.21)  0.75 (0.12)  0.32 (0.12)  0.51 (0.12)  0.49 (0.14)

Finally, when comparing the face without the eye/brow area to the face without the mouth area (Table 1), the results revealed a main effect for the Emotion, F(5, 155) = 28.42, ηp² = .48, and a significant interaction, F(5, 155) = 13.56, ηp² = .30, but the main effect for the Area Condition was not significant, F(1, 31) = 3.70, p = .06. Simple main effect tests (α = .01) revealed that accuracy was significantly higher when the face without the eye/brow area was presented than the face without the mouth area for disgust, F(1, 186) = 25.68, and happiness, F(1, 186) = 15.53. For sadness, F(1, 186) = 24.27, the opposite pattern was observed: accuracy was higher when the mouth area was removed than when the eye/brow area was removed. No difference was observed for the other emotions (for the sake of brevity: all Fs < 2.60, ps > .11).

Discussion

The main objective of Experiment 1 was to examine the sufficiency and necessity of the eye/brow and mouth areas in the recognition of emotional facial expressions. An area is considered sufficient if accuracy is similar to that obtained with the entire face presentation. Similar to previous results in visual search (Calvo & Nummenmaa, 2008) and to Calder et al.'s (2000) study in recognition, the mouth area was sufficient in the recognition of happiness and disgust and the eye/brow area was sufficient for sadness. Furthermore, while Calder et al. (2000) observed that the upper part of the face was sufficient for fear and anger, current results do not suggest that the eye/brow area is sufficient for the recognition of these two emotions.

Rather than comparing areas to the full face, Sullivan et al. (2007) compared the recognition of emotions for the mouth area to the eye/brow area and vice versa. We also examined this question. The mouth area was more important than the eye/brow area for happiness and disgust and the eye/brow area was more important for sadness. While they observed that the eye/brow area was more important than the mouth area for fear and anger, we did not observe this difference.

An area is considered necessary if accuracy is significantly lower when it is removed. The results provide a more complex pattern for this criterion than for sufficiency. With regard to happiness and disgust, results for necessity were in line with those for the sufficiency criterion. When the mouth was hidden, the accuracy rate was significantly lower for these emotions compared to the full-face presentation. Furthermore, results showed that the removal of the mouth produced lower accuracy levels than the removal of the eye/brow area. These results are partially compatible with Nusseck et al. (2008), who examined necessity by neutralising the areas. They observed that the mouth was necessary for the recognition of happiness, but for disgust, both the mouth and eyes/brows were necessary. Here, the eye/brow area did not show this importance for disgust.

The removal of the eye/brow area had a significant impact on the recognition of sadness. Calvo and Nummenmaa (2008) had observed a similar pattern in a visual search task. However, the current results also show that the mouth area was necessary for sadness as well. Nusseck et al. (2008) had shown the necessity of both these areas for the recognition of sadness. Nevertheless, results revealed that hiding the eye/brow area was associated with lower accuracy levels than hiding the mouth, again highlighting the importance of the eye/brow area in the recognition of sadness over and above the role of the mouth area.

The removal of the eye/brow area also impaired the recognition of anger and fear. While the eye/brow area was not sufficient to recognise these emotions, it was necessary, as its removal impacted recognition rates. However, we also observed that the mouth area was a necessary component for recognition of fear but not for anger. Furthermore, when comparing the impact of removing the eye/brow area to removing the mouth area, no difference was found. Thus, the current results do not provide support for the importance of one area over the other for these emotions. Finally, for surprise, neither of the areas was sufficient or necessary.

The current study exemplifies the complexity of emotional facial expression recognition. The results do not simplify the recognition process as being uniquely featural for all emotions, at least with regard to the eye/brow and mouth areas. In effect, for emotions such as happiness, disgust and, to some extent, sadness, it could be argued that their recognition relies on featural processing of one of these areas. However, for fear, anger and surprise, the same conclusions cannot be drawn. Since results do not suggest that either the eyes/brows or the mouth is the sole cue in the recognition of these emotions, one might stipulate a holistic processing for these emotions. This explanation is still not fully satisfactory in those cases because it would be predicted that neither of the areas would be sufficient and both would be necessary for recognition according to a holistic account. While neither of the areas was sufficient for the recognition of surprise, anger and fear, neither of these areas was necessary to recognise surprise, and the mouth was not necessary to recognise anger. The only emotion for which a holistic account might be called upon was fear. Further theoretical implications of these results will be provided in the general discussion.

EXPERIMENT 2

Although the results of Experiment 1 provide insight on the use of the mouth and eye/brow areas in emotional facial recognition, further examinations are necessary to support the generalisability of the results to a more ecological context. In fact, in the previous experiment, we showed the importance of these features as cues that are as important as the whole face in the recognition of certain emotions. However, in order to fully examine their importance, the use of these features in recognition should be demonstrated when all information is available. For example, do individuals rely on the mouth area to recognise happiness when the entire face is presented? The goal of Experiment 2 is to extend previous results by examining the importance of the mouth and eye/brow areas in the recognition of emotional facial expression when the entire face is presented, by using eye-movement recording.

Previous studies using eye-movement monitoring revealed that participants make significantly more and longer fixations on the eye/brow area than on the mouth for all emotions. However, the scope of these results is limited by a variety of factors. For instance, in most circumstances the researchers did not clearly compare the emotions (see Sullivan et al., 2007). This comparison seems to be an important issue since we have shown differences in the use of specific regions as a function of the emotion. In other circumstances, the authors did not include all of the basic emotions (Eisenbarth & Alpers, 2011; Ellison & Massaro, 1997; Murphy & Isaacowitz, 2010), or the studies were completed with other objectives such as studying ageing (Murphy & Isaacowitz, 2010; Sullivan et al., 2007; Wong et al., 2005) or clinical populations (Baron-Cohen et al., 1997; Horley et al., 2004; Williams et al., 1999). In some studies, authors were not interested in recognition of specific emotions but in the effect of valence or arousal (e.g., Eisenbarth & Alpers, 2011). By recording eye movements in a recognition task, the current experiment provided a systematic examination of the use of the eye/brow and mouth areas for all six basic emotions when all information was available. The current study also extends previous research by using stimuli of varying intensity levels in order to reduce ceiling effects in the recognition rates of some emotions (see, e.g., Sullivan et al., 2007; Wong et al., 2005) and to enhance the ecological value (Gosselin & Pélissier, 1996). Thus, we also examined the use of the eye/brow and mouth areas in emotional facial recognition using more ecologically valid stimuli.


Method

Participants. Thirty-two undergraduate students (Mage = 22 years; 29 women) participated in the experiment. All reported normal or corrected-to-normal vision.

Materials and procedure. The same 24 pictures from the JACFEE used in Experiment 1 were used here (Matsumoto & Ekman, 1989). The pictures were modified using the Morpheus 7.0 program. The emotional expressions were altered in order to obtain four levels of varying intensities (20%, 30%, 50% and 100%) for each emotion, which produced a total of 96 stimuli (see the illustrative sketch below). The procedure was identical to that used in Experiment 1, except for a brief period prior to the experimentation that was used to set up the eye-tracking apparatus.

Apparatus. Eye movements were recorded with the EyeLink II system (SR Research Ltd, Mississauga, ON, Canada). This apparatus is a highly accurate system (<0.5°) and also has a high sampling rate (500 Hz). The apparatus has two cameras located under the participant's eyes and an infrared sensor located on the forehead. The forehead sensor allows head tracking for head-movement compensation. One pupil was tracked in the current study and eye selection was determined by the most accurate calibration between the two pupils. After calibration was established, participants were exposed to the stimuli on a 21-inch ViewSonic monitor; at the same time, the experimenter's monitor displayed the participant's gaze position. The gaze position was displayed by a gaze cursor one degree in diameter, which allows examination of the system's accuracy.
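The intensity manipulation itself was done in Morpheus, which warps feature geometry as well as texture. As a rough, hypothetical illustration only, a linear pixel blend between a neutral face and the 100% expression approximates what an intensity-reduced stimulus conveys; the array names and the simplification are ours, not the authors' method:

```python
import numpy as np

def blend_expression(neutral, apex, intensity):
    """Blend two aligned grayscale face images (H x W arrays).
    intensity=0.0 gives the neutral face, 1.0 the full expression."""
    return (1 - intensity) * neutral + intensity * apex

intensities = [0.20, 0.30, 0.50, 1.00]  # the four levels used in Experiment 2

# 24 apex pictures x 4 intensity levels = 96 stimuli, as reported above.
neutral = np.zeros((256, 256))          # placeholder images for illustration
apex = np.ones((256, 256))
stimuli = [blend_expression(neutral, apex, i) for i in intensities]
```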

Results

Accuracy. For each emotion (anger, disgust, happiness, fear, surprise and sadness), the proportion of accurate responses was computed by dividing the number of correct responses by the number of occurrences of the stimuli (Table 1). Data for intensity level was collapsed. A repeated-measures ANOVA revealed a significant effect of Emotion, F(5, 31) = 31.38, ηp² = .50. Post hoc tests (Tukey HSD) revealed that the accuracy for happiness was significantly higher than for all the other emotions and accuracy for fear was significantly lower than for all the other emotions. No other difference was significant.

Eye movements. Eye movements were scored with the EyeLink Dataviewer, which superimposes the fixations on the presented stimuli. Two eye-movement measures were examined. First, the timing of the initial orientation in the mouth and in the eye/brow areas was computed from the onset of the picture presentation until the participant's eye fixated in the specified zone for the first time. Furthermore, the time spent on the eye/brow and mouth areas was examined by computing the proportion of time spent in these zones. The proportion of time spent in a zone (mouth and eyes/brows) was measured by dividing the time spent in the zone by the total dwell time on the stimulus. This measure was selected instead of total dwell time because presentation time was controlled by participants. Participants pressed the mouse key when they were ready to give their response; therefore dwell time varied from trial to trial, with some trials yielding longer and others shorter presentation times. The proportion of time in zones has been used in previous studies in the emotional facial expression field to control for the above-mentioned factors (Boraston, Corden, Miles, Skuse, & Blakemore, 2008).
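A minimal sketch of the two measures follows; the data layout is our own assumption, with each fixation carrying an onset time relative to picture onset, a duration, and the zone it landed in:

```python
def initial_orientation_time(fixations, zone):
    """Latency (ms) from picture onset to the first fixation in `zone`."""
    for fix in fixations:                 # fixations in chronological order
        if fix["zone"] == zone:
            return fix["onset"]
    return None                           # zone never fixated on this trial

def proportion_of_time(fixations, zone):
    """Dwell time in `zone` divided by total dwell time on the stimulus,
    normalising for participant-controlled presentation times."""
    total = sum(fix["duration"] for fix in fixations)
    in_zone = sum(fix["duration"] for fix in fixations if fix["zone"] == zone)
    return in_zone / total if total else 0.0

# Example trial: two fixations on the eyes/brows, one on the mouth.
trial = [
    {"onset": 250, "duration": 300, "zone": "eyes/brows"},
    {"onset": 600, "duration": 200, "zone": "mouth"},
    {"onset": 850, "duration": 250, "zone": "eyes/brows"},
]
print(initial_orientation_time(trial, "mouth"))   # 600
print(proportion_of_time(trial, "eyes/brows"))    # 550/750 = 0.733
```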

Analyses were computed with the Zone (mouth area and eye/brow area), the Accuracy Level (accurate or inaccurate responses) and the Emotion (anger, disgust, happiness, fear, surprise and sadness) as within-subject factors, with intensity levels collapsed. All analyses including these factors were based on 24 of the 32 participants because of empty cells due to participants either only producing accurate responses or only inaccurate responses for some of the conditions. For initial orientation (Table 2), the 2 × 2 × 6 ANOVA revealed a main effect of Emotion, F(5, 115) = 5.17, ηp² = .18, a main effect of Zone, F(1, 23) = 6.86, ηp² = .23, and a main effect of Accuracy, F(1, 23) = 4.99, ηp² = .18. There was also a significant interaction between Emotion and Zone, F(5, 115) = 5.29, ηp² = .19, and between Accuracy and Zone, F(1, 23) = 5.47, ηp² = .19. The other interactions did not reach significance, Fs < 1. For the interaction between Emotion and Zone, simple main effect tests (α = .01) revealed that participants looked more rapidly in the eye/brow area than in the mouth area for disgust, F(1, 138) = 13.99, sadness, F(1, 138) = 13.88, and fear, F(1, 138) = 3.97, but that was not the case for the other emotions (for the sake of brevity: all Fs < 3.39, ps > .07). For the mouth area, a significant difference was observed between emotions, F(5, 230) = 10.50, whereas no significant differences were found between emotions for the eye/brow area, F < 1. For the mouth area, post hoc tests (Tukey HSD) revealed that the initial orientation time was faster for surprise and happiness than for disgust and sadness. For the interaction between Accuracy and Zone, simple main effect tests (α = .03) revealed that for inaccurate responses initial orientation was faster for the eye/brow area than for the mouth area, F(1, 46) = 10.05, but that was not the case for accurate responses, F(1, 46) = 2.95, p = .09. Furthermore, for the mouth area, initial orientation was faster for accurate than inaccurate responses, F(1, 46) = 12.04. For the eye/brow area, no significant difference was observed between accurate and inaccurate responses, F < 1.

For the proportion of time spent in the zone (Table 2), the 2 × 2 × 6 ANOVA revealed a main effect of the Emotion, F(5, 115) = 2.87, ηp² = .11, a main effect of Zone, F(1, 23) = 95.09, ηp² = .81, and a main effect of Accuracy, F(1, 23) = 11.67, ηp² = .34. A significant interaction was observed between Emotion and Zone, F(5, 115) = 5.28, ηp² = .19, but the other interactions did not reach significance, Fs < 2.01, ps > .08. Simple main effect tests (α = .01) revealed that participants spent more time in the eye/brow area than the mouth area for all emotions: sadness, F(1, 138) = 104.56, fear, F(1, 138) = 88.75, disgust, F(1, 138) = 79.29, anger, F(1, 138) = 78.14, surprise, F(1, 138) = 73.83, and happiness, F(1, 138) = 53.20. For the mouth area and the eye/brow area, significant differences were observed between emotions, F(5, 230) = 7.60 and F(5, 230) = 3.23, respectively.

Table 2. Means and standard deviations for the initial orientation time (ms) and the proportion of time as a function of the emotion, the zone (mouth area and eye/brow area) and the accuracy level (accurate vs. inaccurate)

Initial orientation time (ms)
Area        Accuracy    Anger        Disgust        Happiness    Fear         Surprise     Sadness
Eyes/brows  Accurate    754 (426)    704 (334)      685 (295)    680 (487)    763 (618)    695 (497)
Eyes/brows  Inaccurate  706 (462)    725 (465)      652 (450)    681 (409)    799 (454)    662 (411)
Mouth       Accurate    1,137 (943)  1,278 (1,121)  806 (529)    759 (386)    839 (570)    1,213 (710)
Mouth       Inaccurate  1,037 (727)  1,613 (1,145)  983 (817)    1,198 (656)  920 (481)    1,450 (1,471)

Proportion of time
Area        Accuracy    Anger        Disgust        Happiness    Fear         Surprise     Sadness
Eyes/brows  Accurate    0.44 (0.14)  0.42 (0.14)    0.44 (0.13)  0.45 (0.17)  0.47 (0.15)  0.48 (0.13)
Eyes/brows  Inaccurate  0.42 (0.11)  0.45 (0.14)    0.42 (0.14)  0.45 (0.13)  0.42 (0.09)  0.46 (0.14)
Mouth       Accurate    0.16 (0.06)  0.17 (0.06)    0.22 (0.10)  0.16 (0.07)  0.19 (0.07)  0.15 (0.08)
Mouth       Inaccurate  0.15 (0.07)  0.14 (0.06)    0.19 (0.09)  0.16 (0.06)  0.17 (0.07)  0.15 (0.18)


Post hoc tests (Tukey HSD) revealed that the proportion of time spent in the mouth area for happiness was significantly longer than for all the other emotions. Post hoc tests (Tukey HSD) also revealed that the proportion of time spent in the eye/brow area was longer for sadness than for happiness, disgust and anger.

Discussion

The results of Experiment 2, in terms of accuracy, are consistent both with those of Experiment 1 and with previous research (e.g., Gosselin & Kirouac, 1995; Gosselin & Pélissier, 1996). Results again show an advantage of happiness over the other emotions and a disadvantage for fear. The proportion of time spent in the eye/brow area compared to the mouth area is also consistent with previous findings. In effect, while in visual search paradigms (Calvo & Nummenmaa, 2008) results have typically shown an advantage of the mouth over and above the role of the eyes/brows in attracting attention, results in recognition typically show that the eyes/brows are fixated for longer periods (see, e.g., Sullivan et al., 2007). Such findings suggest that the cognitive processes involved in a visual search paradigm and in the recognition process might be different.

More importantly, eye-movement recordings support the differential role of the eyes/brows and mouth in the recognition of facial expressions, as a function of the emotion, when the entire face is available. The pattern of results for happiness is robust. Beyond its advantage with regard to accuracy, the mouth area is consistently important, showing both sufficiency and necessity; more importantly, the effect extends to its ability to attract attention and the time spent in this area. A similar pattern is observed in the importance of the eye/brow area in the recognition of sadness. When looking at a sad face, the eyes/brows tend to attract attention more quickly and more time is also spent gazing at this area. Results for the other emotions are not as consistent. The findings will be discussed in more detail subsequently.

GENERAL DISCUSSION

The present study is consistent with the literature, as patterns of recognition as a function of the emotion reproduce basic effects that have been observed (Ekman & Friesen, 1971, 1986; Gosselin & Kirouac, 1995; Izard, 1994; Tracy & Robins, 2008; Vassallo et al., 2009; Widen & Russell, 2003, 2008). Happiness held an advantage over all other emotions, while fear was less accurately identified. Surprise and sadness also seemed to hold their status, surprise being one of the most accurately identified emotions and sadness one of the least (see, e.g., Wiggers, 1982). In sum, this study found that in recognition of emotional facial expressions, the target emotion is an important factor to be considered, and future research should explore the reasons underlying these differences.

The role of the mouth and eye/brow areas in emotional facial recognition

In order to explain the visual processing of an emotional facial expression, two accounts have been put forward, the holistic and the featural (Calder et al., 2000; Calvo et al., 2010; Derntl et al., 2009; Prkachin, 2003). The holistic account emphasises the importance of structural relationships established between different components of the face in the identification of emotional facial expressions. The featural account proposes that the processing of a facial expression could rely on a single facial feature that might have acquired the emotional value of the facial expression in which it typically appears. The objective of the current research was to clarify the role of the mouth and eye/brow areas in the recognition of the six basic emotions. Thus, we systematically examined the sufficiency and the necessity of these areas as well as the eye-movement pattern used in this process.

The current results expose the complexity of the visual processes associated with emotional facial expression recognition. The processing cannot be simplified as being uniquely attributable to single-feature identification, nor does it always imply a holistic treatment for all emotions. Furthermore, neither can we suggest that one of the areas, the mouth or the eye/brow, holds a preferential value over the other for all emotions. Results will be discussed as a function of the emotion in order to offer a comprehensive picture of the importance of each or both of these regions in the recognition process.

The current study provides clear support for the role of the mouth in the recognition of happiness. In Experiment 1, the mouth area alone was sufficient for its recognition. The removal of this area affected the recognition process, thus also supporting its necessity. In Experiment 2, the mouth area attracted attention more quickly and participants spent more time in this zone than in the eye/brow area, and more time in the mouth area for happiness than for the other emotions. Regardless of slight differences in patterns of results in the literature, the results of the current study are in line with those of previous research with regard to the advantage of the mouth area for happiness (Calvo & Nummenmaa, 2008; Calder et al., 2000; Nusseck et al., 2008; Sullivan et al., 2007). However, the results expanded on previous observations by providing a clear link between sufficiency, necessity and the patterns of eye movements required in the processing. The time spent in a zone could have been indicative of difficulty in the processing of this area, or it could have represented the importance of this same area in the recognition process. Results from the two experiments suggested that the time spent in an area is representative of its importance. For happiness, since the mouth area was both necessary and sufficient and participants spent more time in this zone, we can conclude that this feature is important and may serve as a shortcut to access the meaning of happiness. The lip corner puller, producing a smile, could be employed as a cue in order to diminish the time required in recognising this emotion (Calvo & Marrero, 2009; Calvo et al., 2010).

Another contribution of this study is the finding that the eye/brow area has a significant role in identifying sadness. Results have shown that the eye/brow area is sufficient to recognise sadness and that its removal negatively affects its recognition, indicating its necessity. However, the removal of the mouth area also produced lower accuracy levels in recognising sadness, thus suggesting the necessity of the mouth as well. Nevertheless, when the removals of these areas were directly compared, removing the eye/brow area affected the recognition process more than removing the mouth area. Furthermore, eye-movement patterns showed that the eye/brow area attracted attention more quickly for sadness, and more time was spent in this area than in the mouth area, as well as more than for most of the other emotions. Thus, the eye/brow area was sufficient and necessary to recognise sadness and the eye-movement pattern supported its importance.

Although the results are in line with previous research (e.g., Eisenbarth & Alpers, 2011), the reasons underlying featural processing seem less evident for sadness than for happiness. While the shape of the mouth as a smile is unique and represents a distinctive feature for happiness, the eye/brow area does not contain a unique feature of sadness. Facial movements (e.g., inner brow raiser and brow lowerer) involved in sadness are also found in fear, surprise and anger (Ekman & Friesen, 1978). It should be noted that the entire combination of the action units involved is slightly different for sadness, fear, surprise and anger. Nevertheless, it is more difficult to justify the importance of the eye/brow area based on the uniqueness of activation associated with this particular emotion. It also remains unclear why the eye/brow area is important for sadness compared to the other emotions even when similar activation is found. Future studies are required in order to examine the processing of this emotion.

The pattern for disgust is intriguing, as the two experiments provide different patterns of results. In Experiment 1, similar to happiness, the mouth area was both sufficient and necessary for the recognition of disgust. However, Experiment 2 showed that the mouth did not attract more attention. In fact, participants took more time before they gazed at the mouth area for disgust compared to the other emotions. Furthermore, they did not spend more time looking at the mouth than at the eyes/brows, nor more time in the mouth area for disgust than for the other emotions. Based on Experiment 1, the mouth would be the important feature in the identification of disgust because it was shown to be both sufficient and necessary. In terms of muscle activation, the mouth area comprised the upper lip raiser, which could have been used as a cue to infer disgust. As with the activations observed in the eye/brow area for sadness, the upper lip raiser is not unique to disgust, as it is also shared with anger. However, disgust expressions do contain a unique cue, the nose wrinkle. Consequently, in Experiment 2, when all information was available, participants might have focused their attention more on this unique cue. Unfortunately, eye-movement recordings were only analysed for the mouth and eye/brow areas, thus not providing an explicit examination of the nose. Nevertheless, the results of Experiment 1 provide some insight on this question by showing that when the mouth is removed (while the nose is still accessible as a cue), accuracy is still diminished, demonstrating the necessity of the mouth even in the presence of the nose. In sum, the two experiments show that the identification of disgust might be explained by the use of more than one feature.

Given these results, one might think that the recognition of disgust relies on holistic processing. In Experiment 1, both the presentation of a single feature and its removal prevent holistic processing (Calder et al., 2000; Calvo & Nummenmaa, 2008) because relationships between features are disrupted (Calder & Jansen, 2005; Maurer et al., 2002). Thus, if emotional facial expression recognition relied on holistic processing, this would be reflected not only in the insufficiency of the mouth and eyes/brows but also in the necessity of both of these areas. The results for disgust do not satisfy these criteria, since the mouth was sufficient and necessary but the same could not be said for the eyes/brows, making the holistic account unsatisfactory.

The same reasoning with regard to holistic processing can be applied to anger and surprise. Experiment 1 showed that, for surprise, neither the mouth nor the eye/brow area was sufficient, and neither of the two areas was necessary for its recognition. For anger, neither of the areas was sufficient; furthermore, while the eye/brow area was necessary for its recognition, the mouth was not. Thus, the recognition process could not be explained by the holistic account. Nevertheless, in terms of the featural account, neither of the two areas explored was clearly the important factor for the recognition of these emotions, and eye movements support this claim, since these areas did not attract attention more quickly and more time was not spent in them. Fear is the only emotion for which a holistic account could be a plausible explanation. In fact, results revealed that both the mouth and eye/brow areas were necessary for its recognition, while neither was sufficient. In other words, the combination of the eyes/brows and the mouth seems to be more important for the recognition of this emotion.
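The criterion applied in the last two paragraphs can be stated compactly: under the holistic account tested here, neither area should be sufficient on its own, yet both should be necessary. A minimal sketch, restating in code the sufficiency/necessity pattern of Experiment 1 as discussed above (fields not explicitly reported in this passage are set to False for illustration; they do not affect which emotions pass the test):

    # (mouth sufficient, mouth necessary, eye/brow sufficient, eye/brow necessary)
    # per emotion, as discussed in the text for Experiment 1.
    results = {
        "happiness": (True,  True,  False, False),
        "sadness":   (False, True,  True,  True),
        "disgust":   (True,  True,  False, False),
        "surprise":  (False, False, False, False),
        "anger":     (False, False, False, True),
        "fear":      (False, True,  False, True),
    }

    def consistent_with_holistic(mouth_suff, mouth_nec, eye_suff, eye_nec):
        # Holistic prediction: no single area sufficient, both areas necessary.
        return (not mouth_suff) and (not eye_suff) and mouth_nec and eye_nec

    print([e for e, p in results.items() if consistent_with_holistic(*p)])
    # -> ['fear']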

Interestingly, for all emotions in the current study, including sadness and happiness, appearance changes are observed in both the eye/brow and mouth areas. However, how these areas are processed and used differs as a function of the emotion. It seems that the nature of the changes in appearance influences the processing that is undertaken. For instance, the muscle actions produced in the eye/brow area for sadness are the inner brow raiser and the brow lowerer (AUs 1 and 4). Fear also includes these same two muscle actions but with the addition of the outer brow raiser (AU 2), while anger includes the activation of the brow lowerer (AU 4) without the inner brow raiser (AU 1). The activation of additional muscles influences the overall configuration of the brows, and thus a different appearance change is produced in the eye/brow area for each of the three emotions. Even when the action units involved are the same and in the same area, it seems that they qualitatively affect the processing being completed. In addition, the changes that are produced in the mouth area, in combination with those in the eyes/brows, also influence the type of processing that is undertaken.
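Written out as sets, the brow-area overlap described above is immediate. A minimal sketch using only the action units named in the preceding paragraph (Ekman & Friesen, 1978):

    # Brow-area action units (AUs) per emotion, as listed above:
    # AU 1 = inner brow raiser, AU 2 = outer brow raiser, AU 4 = brow lowerer.
    brow_aus = {"sadness": {1, 4}, "fear": {1, 2, 4}, "anger": {4}}

    # Every AU in the sad brow also appears in fear, and AU 4 in anger as well:
    # the muscle actions recur, yet each overall brow configuration differs.
    for a in brow_aus:
        for b in brow_aus:
            if a < b:
                print(a, b, "share AUs:", sorted(brow_aus[a] & brow_aus[b]))
    # anger/fear share [4]; anger/sadness share [4]; fear/sadness share [1, 4]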

Finally, regardless of the emotion, participants spent more time looking at the eye/brow area than at the mouth area, a pattern that is consistent in the literature (see Sullivan et al., 2007). However, it is interesting to observe that when participants look at the eyes/brows faster, this is associated with higher rates of inaccurate responses, whereas when they look at the mouth more quickly, this is associated with higher recognition rates. This result adds to the complexity of the recognition process of emotional facial expressions.

In sum, the data reported here contribute to our understanding of the recognition of the six basic emotions and shed light on how facial movements seem to influence decoding mechanisms. These data also lead us to explore other possible explanations for the recognition of emotional facial expressions, since this processing cannot be explained simply by appealing to featural or holistic accounts for all emotions. For instance, Ellison and Massaro (1997) suggested that the processing of emotional facial expressions is neither a simple feature extraction nor a holistic process, but that information is simultaneously extracted from various features and integrated in order to recognise the emotion. However, in eye-tracking studies, including the current one, scan paths show that participants move their eyes from one region to another and explore multiple areas before producing their response. Even if participants look more rapidly at one region, or spend more time in one region than in another, for all prototypes they explore more than a single feature. Consequently, we hypothesise that features are processed sequentially and then combined to extract meaning. Furthermore, a fixation in one region could influence the location of a subsequent fixation. The results from previous studies and the present one guide the examination of hypothetical models of the mechanisms of perceptual processing of emotional facial expressions.
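The sequential hypothesis advanced here can be outlined in code. The sketch below is purely hypothetical, our reading of the verbal proposal rather than a specified model: regions are sampled one at a time, each yields featural evidence, and the evidence is then combined into a single interpretation. The cue lexicon and the voting rule are illustrative assumptions, and the fixation-guides-next-fixation component is omitted for brevity.

    # Hypothetical sketch of sequential featural sampling followed by
    # integration. All names and the toy cue lexicon are illustrative only.
    CUE_TO_EMOTION = {
        "lip_corner_puller": "happiness",   # the smile cue
        "inner_brow_raiser": "sadness",
        "upper_lip_raiser":  "disgust",
        "nose_wrinkle":      "disgust",     # the unique disgust cue
    }

    def recognise(face, scan_order=("eyes_brows", "mouth", "nose")):
        votes = {}
        for region in scan_order:           # features processed sequentially...
            emotion = CUE_TO_EMOTION.get(face.get(region))
            if emotion:
                votes[emotion] = votes.get(emotion, 0) + 1
        # ...then combined to extract the meaning of the expression
        return max(votes, key=votes.get) if votes else "unknown"

    print(recognise({"mouth": "upper_lip_raiser", "nose": "nose_wrinkle"}))
    # -> disgust (two converging cues, echoing the multi-feature account above)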

Limitations

The sample size could be viewed as a potential limitation of the current study. While experiments in emotional facial expression recognition tend to have larger samples, the current sample size is typical for eye-movement research. For example, Sullivan et al. (2007) studied groups of 30 participants, which is similar to our sample. Nevertheless, future research should consider larger samples. From a more theoretical perspective, a limitation of the current study is that only the mouth and eye/brow areas were explored. This choice was made based on the fact that both of these regions contain emotional information for all six basic emotions. It would be particularly important to examine other areas of the face, since some emotions contain unique cues in other areas (e.g., disgust).

Conclusion

The present study aimed to clarify the roles played by the eye/brow and mouth areas in the recognition of the six basic emotions. Results revealed that the mouth was an important feature for the recognition of happiness, and the eyes/brows for sadness. Furthermore, fear was the only emotion for which holistic processing could be called upon. For the other emotions, the results are supported by neither a featural nor a holistic account. Future research should explore other processing mechanisms.

Manuscript received 18 June 2012

Revised manuscript received 19 July 2013

Manuscript accepted 6 August 2013

First published online 17 September 2013

REFERENCES

Baron-Cohen, S., Wheelwright, S., & Jolliffe, T. (1997). Is there a "language of the eyes"? Evidence from normal adults, and adults with autism or Asperger syndrome. Visual Cognition, 4, 311–331. doi:10.1080/713756761

Boraston, Z. L., Corden, B., Miles, L. K., Skuse, D. H., & Blakemore, S.-J. (2008). Brief report: Perception of genuine and posed smiles by individuals with autism. Journal of Autism and Developmental Disorders, 38, 574–580. doi:10.1007/s10803-007-0421-1

Calder, A. J., & Jansen, J. (2005). Configural coding of facial expressions: The impact of inversion and photographic negative. Visual Cognition, 12, 495–518. doi:10.1080/13506280444000418

Calder, A. J., Young, A. W., Keane, J., & Dean, M. (2000). Configural information in facial expression perception. Journal of Experimental Psychology: Human Perception and Performance, 26, 527–551. doi:10.1037/0096-1523.26.2.527

Calvo, M. G., & Marrero, H. (2009). Visual search of emotional faces: The role of affective content and featural distinctiveness. Cognition and Emotion, 23, 782–806. doi:10.1080/02699930802151654

Calvo, M. G., & Nummenmaa, L. (2008). Detection of emotional faces: Salient physical features guide effective visual search. Journal of Experimental Psychology: General, 137, 471–494. doi:10.1037/a0012771

Calvo, M. G., Nummenmaa, L., & Avero, P. (2010). Recognition advantage of happy faces in extrafoveal vision: Featural and affective processing. Visual Cognition, 18, 1274–1297. doi:10.1080/13506285.2010.481867

Derntl, B., Seidel, E.-M., Kainz, E., & Carbon, C.-C. (2009). Recognition of emotional expressions is affected by inversion and presentation time. Perception, 38, 1849–1862. doi:10.1068/p6448

Eisenbarth, H., & Alpers, G. W. (2011). Happy mouth and sad eyes: Scanning emotional facial expressions. Emotion, 11, 860–865. doi:10.1037/a0022758

Ekman, P. (2003). Emotions revealed: Recognizing faces and feelings to improve communication and emotional life. New York, NY: Times Books.

Ekman, P., & Friesen, W. V. (1969). Nonverbal leakage and clues to deception. Psychiatry: Journal for the Study of Interpersonal Processes, 32, 88–106.

Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17, 124–129. doi:10.1037/h0030377

Ekman, P., & Friesen, W. V. (1978). Facial Action Coding System: Investigator's guide. Palo Alto, CA: Consulting Psychologists Press.

Ekman, P., & Friesen, W. V. (1986). A new pan-cultural facial expression of emotion. Motivation and Emotion, 10, 159–168. doi:10.1007/BF00992253

Ekman, P., Friesen, W. V., & O'Sullivan, M. (1988). Smiles when lying. Journal of Personality and Social Psychology, 54, 414–420. doi:10.1037/0022-3514.54.3.414

Ekman, P., O'Sullivan, M., & Frank, M. G. (1999). A few can catch a liar. Psychological Science, 10, 263–266. doi:10.1111/1467-9280.00147

Ellison, J. W., & Massaro, D. W. (1997). Featural evaluation, integration, and judgement of facial affect. Journal of Experimental Psychology: Human Perception and Performance, 23, 213–226. doi:10.1037/0096-1523.23.1.213

Gosselin, P., & Kirouac, G. (1995). Le décodage de prototypes émotionnels faciaux. Revue Canadienne de Psychologie Expérimentale, 49, 313–329. doi:10.1037/1196-1961.49.3.313

Gosselin, P., & Pélissier, D. (1996). Effet de l'intensité sur la catégorisation des prototypes émotionnels faciaux chez l'enfant et l'adulte. International Journal of Psychology, 31, 225–234. doi:10.1080/002075996401007

Horley, K., Williams, L. M., Gonsalvez, C., & Gordon, E. (2004). Face to face: Visual scanpath evidence for abnormal processing of facial expressions in social phobia. Psychiatry Research, 127, 43–53. doi:10.1016/j.psychres.2004.02.016

Izard, C. E. (1971). The face of emotion. New York, NY: Appleton-Century-Crofts.

Izard, C. E. (1994). Innate and universal facial expressions: Evidence from developmental and cross-cultural research. Psychological Bulletin, 115, 288–299. doi:10.1037/0033-2909.115.2.288

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural confusions show that facial expressions are not universal. Current Biology, 19, 1543–1548. doi:10.1016/j.cub.2009.07.051

Leppänen, J. M., & Hietanen, J. K. (2007). Is there more in a happy face than just a big smile? Visual Cognition, 15, 468–490. doi:10.1080/13506280600765333

Martin, D., Slessor, G., Allen, R., Phillips, L. H., & Darling, S. (2012). Processing orientation and emotion recognition. Emotion, 12, 39–43. doi:10.1037/a0024775

Matsumoto, D., & Ekman, P. (1988). Japanese and Caucasian Facial Expressions of Emotion (JACFEE) [Slides]. San Francisco, CA: Intercultural and Emotion Research Laboratory, Department of Psychology, San Francisco State University.

Maurer, D., Le Grand, R., & Mondloch, C. J. (2002). The many faces of configural processing. Trends in Cognitive Sciences, 6, 255–260. doi:10.1016/S1364-6613(02)01903-4

Murphy, N. A., & Isaacowitz, D. M. (2010). Age effects and gaze patterns in recognizing emotional expressions: An in-depth look at gaze measures and covariates. Cognition and Emotion, 24, 436–452. doi:10.1080/02699930802664623

Nusseck, M., Cunningham, D. W., Wallraven, C., & Bülthoff, H. H. (2008). The contribution of different facial regions to the recognition of conversational expressions. Journal of Vision, 8, 1–23. doi:10.1167/8.8.1

Porter, S., & ten Brinke, L. (2008). Reading between the lies: Identifying concealed and falsified emotions in universal facial expressions. Psychological Science, 19, 508–514. doi:10.1111/j.1467-9280.2008.02116.x

Prkachin, G. C. (2003). The effect of orientation on detection and identification of facial expressions of emotion. British Journal of Psychology, 94, 45–62. doi:10.1348/000712603762842093

Russell, J. A. (1993). Forced-choice response format in the study of facial expression. Motivation and Emotion, 17, 41–51. doi:10.1007/BF00995206

Russell, J. A., & Fernández-Dols, J. M. (1997). The psychology of facial expression. Paris: Cambridge University Press.

Smith, M. L., Cottrell, G. W., Gosselin, F., & Schyns, P. G. (2005). Transmitting and decoding facial expressions. Psychological Science, 16, 184–189. doi:10.1111/j.0956-7976.2005.00801.x

Sullivan, S., Ruffman, T., & Hutton, S. B. (2007). Age differences in emotion recognition skills and visual scanning of emotion faces. The Journals of Gerontology: Series B: Psychological Sciences and Social Sciences, 62, 53–60. doi:10.1093/geronb/62.1.P53

Tracy, J. L., & Robins, R. W. (2008). The automaticity of emotion recognition. Emotion, 8, 81–95. doi:10.1037/1528-3542.8.1.81

Vassallo, S., Cooper, S. L., & Douglas, J. M. (2009). Visual scanning in the recognition of facial affect: Is there an observer sex difference? Journal of Vision, 9, 1–10. doi:10.1167/9.3.11

Widen, S. C., & Russell, J. A. (2003). A closer look at preschoolers' freely produced labels for facial expressions. Developmental Psychology, 39, 114–128. doi:10.1037/0012-1649.39.1.114

Widen, S. C., & Russell, J. A. (2008). Children acquire emotion categories gradually. Cognitive Development, 23, 291–312. doi:10.1016/j.cogdev.2008.01.002

Wiggers, M. (1982). Judgments of facial expressions of emotion predicted from facial behaviour. Journal of Nonverbal Behavior, 7, 101–116. doi:10.1007/BF00986872

Williams, L. M., Loughland, C. M., Gordon, E., & Davidson, D. (1999). Visual scanpaths in schizophrenia: Is there a deficit in face recognition? Schizophrenia Research, 40, 189–199. doi:10.1016/S0920-9964(99)00056-0

Wong, B., Cronin-Golomb, A., & Neargarder, S. (2005). Patterns of visual scanning as predictors of emotion identification in normal aging. Neuropsychology, 19, 739–749. doi:10.1037/0894-4105.19.6.739
