


Running head: A Study on Difference in Empathy and Relation Commitment According to Nonverbal Expressions of Emojis


A Study on Difference in Empathy and Relation Commitment According to Nonverbal Expressions of Emojis: Focusing on the Emojis of “K” Company

Jeon Hye-Jin

Gumi University

Author Note

Jeon Hye-Jin, Assistant Professor, School of Visual Game Contents, Gumi University,

37, Yaeun-ro, Gumi-si, Gyeongsangbuk-do, 39213, South Korea

e-mail: [email protected]

Acknowledgements: not applicable

Funding: not applicable

Conflicts of Interest: The author declares that there is no conflict of interest

A Study on Difference in Empathy and Relation Commitment According to Nonverbal Expressions of Emojis: Focusing on the Emojis of “K” Company

Abstract

To analyze differences in empathy and relation commitment according to the nonverbal expressions of emojis, this study divided nonverbal expressions into four types according to the presence or absence of movement and contextual information added to static body language. The effects of the emojis were further analyzed by emotional type (positive, negative) and degree of bodily expression (upper body, whole body). The results showed that both the movement type and the combined movement-plus-contextual-information type had a positive (+) effect on empathy and relation commitment regardless of emotional type and degree of bodily expression. The contextual information type had a positive (+) effect on empathy only for the upper body; for relation commitment, it had a positive (+) effect for positive emotions, upper body, and whole body, but not for negative emotions. The nonverbal expression type producing the highest empathy and relation commitment was the combined movement-plus-contextual-information type; however, empathy showed a large difference between emotional types, whereas relation commitment differed little across emotional types and degrees of bodily expression.

Keywords: Emoji, empathy, relation commitment, nonverbal expression, CMC

A Study on Difference in Empathy and Relation Commitment According to Nonverbal Expressions of Emojis: Focusing on the Emojis of “K” Company

In present-day CMC[endnoteRef:1], emojis act as an element that not only maximizes the effect of communication but also strengthens human relations, because they can express a person's emotional interiority. Watzlawick, Bavelas, and Jackson (1967) stated that verbal messages constitute the content of communication while nonverbal messages are involved in forming relationships between communicators. According to Quintly (2017)[endnoteRef:2], the use of emojis increased interaction between communicators by an average of 47.7% compared with when they were not used. Quintly explained this phenomenon by noting that experts use emojis to make themselves look friendlier, and that enterprises use emojis strategically to make their images more familiar and intimate to consumers through storytelling. This shows that emojis today are used not only by individuals but also by enterprises as an important means of relationship marketing, not merely as an interesting decoration or a transient trend. In South Korea, this phenomenon is more evident in mobile messengers, where two-way interactions occur in real time, than in social media. Therefore, when designing emojis, it is necessary to treat the emoji not as an element that simply reinforces the content of a message, but as a symbol that directly constructs the relationship between communicators and functions as a medium inducing relation commitment beyond empathy. To this end, this study identifies nonverbal expressions that are effective for relation commitment by examining differences in empathy and relation commitment according to the nonverbal expressions of emojis. [1: In this study, computer-mediated communication (CMC) refers to text-based interaction that is performed or promoted through computer-based digital technologies such as cell phones, tablets, and PCs (Spitzberg, 2006).] [2: New Instagram Emoji Study H1 (2017), https://www.quintly.com/blog/instagram-emoji-study]

Theoretical Background

Relationship between Empathy and Relation Commitment through Emojis

In current CMC, emojis are regarded as nonverbal surrogates (Walther, 2006). Nonverbal expression is generally reported to induce emotional expression more effectively than language.[endnoteRef:3] In other words, because emojis are more emotionally recognizable than text in CMC, they help communicators respond actively to the other party, thereby continuously promoting two-way communication. As a result, communicators can empathize with each other and form the emotional intimacy that facilitates a bond between them and, furthermore, can reach relation commitment. When this process is examined using the Hierarchy of Effects Model[endnoteRef:4], empathy and relation commitment through emojis have a causal relationship of cognitive-emotional-behavioral responses that occur in the recipients' flow of consciousness when interacting through the medium of emojis. Here, empathy is a cognitive-emotional response, as it involves the ability not only to understand others' emotional states and experiences but also to share them and respond appropriately to their expressions (Blair, 2005). Relation commitment is an emotional-behavioral response, a desire and attitude to maintain a relationship with a particular object after feeling emotional attachment to it (Jaros, Jermier, Koehler, & Sincich, 1993); among the various types of commitment, it corresponds to affective commitment. [3: It was proven that nonverbal expressions induce emotional expressions more effectively because a nonverbal survey tool (PrEmo) showed much higher scores than language during a user satisfaction survey (Shin & Eune, 2017).] [4: The Hierarchy of Effects Model explains that the process of communication effect is sequential and hierarchical. In other words, Lavidge and Steiner (1961) categorized the process leading to the purchase of products into the sequential stages of awareness → knowledge → liking → preference → conviction → purchase when consumers are exposed to persuasive advertising communication. The awareness and knowledge stages are where consumers recognize something based on their knowledge, and these stages are accompanied by thinking processes. The liking and preference stages are the emotional processes consumers experience on the basis of those thinking processes. The last stages, conviction and purchase, are behavioral responses based on those emotional processes.]

As such, empathy and relation commitment through emojis are the causal result of the recipients' process of recognizing emojis, interpreting their meaning, and experiencing them. Therefore, the forms of emoji expression that influence this process are important. This study discusses the expressive factors of emojis that influence empathy and relation commitment, focusing on body language, based on the fundamental property of emojis as emotional, nonverbal expressions.

Signification of Body Language in F2F (Face to Face)

Within the category of nonverbal expressions, a key element of communication is body language. Body language is generally regarded as encompassing hand gestures and facial expressions, as well as the domain of attitude and movement that can convey emotions and intents through the body (Xinhua, 2000). In other words, body language conveys emotional states and the meaning of actions through bodily expressions such as specific postures and behaviors, so movement is an important factor in the signifying process of body language. In this respect, Knapp (1978) described body language as a nonverbal message conveyed by physical movements.

In addition, body language does not occur independently and in isolation, but in a specific context[endnoteRef:5] of communication (de Gelder et al., 2006). Like the signifying process of spoken language, body language is primarily embodied through the body and physical movements instead of language in the “spoken space”[endnoteRef:6], and is also signified by constituting contexts over time. Thus, even if body language conveys meaning through the body's motor system, when it functions as social interaction conveying emotions and attitudes, its meaning should be considered in conjunction with context, like spoken language (Yin, 2014). [5: The word “context” originates from the Latin “contexere”, meaning "entangled together" and "composed together". In this sense, context is defined as an interrelated relationship that gives meaning to the parts that make up the whole (Cole & Cole, 1989). Context therefore performs an essential function in the signification process as an element that identifies and completes the overall meaning of communication (Suh, 2015).] [6: Desclés (1976) explained that the texts uttered by the speaker are primarily embodied in the 'spoken space' by the linguistic symbol and constitute contexts over time.]

Consequently, emotional perception based on body language is based on movement, but is also always influenced by contextual variables (Wieser & Brosch, 2012). In particular, contextual information in emotional perception was reported to improve facial recognition because it can reduce confusion (Susskind, Littlewort, Bartlett, Movellan, & Anderson, 2007) between emotional types with high physical similarities (Gnepp, 1983).

Relationship between Emotion and Movement

According to Laban, human movement is a symbol of the mind. In other words, human movement is an external representation of the underlying internal action that causes it (Redfern, 1965), and it is a clue for distinguishing emotional types because their characteristics appear as characteristics of movement.[endnoteRef:7] According to previous studies, emotional types are related to speed (Pollick, Paterson, Bruderlin, & Sanford, 2001), openness, and softness (Boone & Cunningham, 2001) among the characteristics of movement. In particular, Lee (2007)[endnoteRef:8] proposed a framework relating emotion and movement through the two axes of valence (pleasant/unpleasant) and arousal (activation/deactivation) in the emotional space of Russell's Circumplex Model. According to this framework, pleasant emotions are characterized by soft, open, fast movements, and unpleasant emotions by slow movements of a jerky and closed type. [7: According to Laban's Movement Analysis, when energy is expressed into external space according to internal effort, the characteristics and shape of the movement appear through the body. Thus, the characteristics of movement are distinguished according to emotional types and inner attitude.] [8: Lee (2007) analyzed related studies and conducted a field survey in order to systematize the relationship between emotions and movements, and reported a correlation between the axes of emotional space and three characteristics of physical movement (speed, openness, and softness).]
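The valence-to-movement mapping described above can be sketched as a simple lookup. This is an illustrative paraphrase of Lee's (2007) framework, not an implementation from the study; the function name and dictionary are hypothetical.

```python
# Illustrative lookup of movement characteristics by emotional valence,
# following the summary of Lee's (2007) framework in the text above.
# Pleasant emotions: soft, open, fast movements; unpleasant emotions:
# slow, jerky, closed movements.
MOVEMENT_BY_VALENCE = {
    "pleasant":   {"speed": "fast", "openness": "open",   "softness": "soft"},
    "unpleasant": {"speed": "slow", "openness": "closed", "softness": "jerky"},
}

def movement_profile(valence):
    """Return the movement characteristics associated with a valence."""
    return MOVEMENT_BY_VALENCE[valence]

print(movement_profile("pleasant"))
```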

These differing characteristics of movement form the basis of emotional recognition (Pollick et al., 2001; Wallbott, 1998). In particular, emotional arousal was reported to be perceived more effectively than valence (Clavel, Plessier, Martin, Ach, & Morel, 2009; Karg, Kühnlenz, & Buss, 2010).

Types of Nonverbal Expressions of Emojis in CMC

According to an analysis of emojis as emotional expressions, emojis express complicated emotions in the form of movement (motion) and contextual information[endnoteRef:9] (backgrounds, etc.) added to body language such as facial expressions (10%) and gestures (90%) (Shin & Eune, 2017). In reality, body language appears in a continuous and complicated form in which movement and contextual information are combined, but in CMC, emojis are shaped by media characteristics such as temporal and spatial constraints and the transmission and reception of information. Therefore, emojis are not only stereotyped according to movement and contextual information added to static body language, but also segmented according to emotional types (e.g., joy, anger, sadness, and happiness) and degrees of bodily expression (e.g., face, upper body, and whole body). [9: In this study, contextual information refers to information related to the emotional, physical, and temporal-spatial states of the sender expressed in an emoji representing the sender's emotions, based on the definition of context by Dey, Abowd, and Salber (2001).]

[insert Figure 1 near here]

In particular, as expressive elements of emojis, movement and contextual information are important factors in empathy and relation commitment because they enhance narrative and the sense of reality in the emotional as well as cognitive aspects of F2F.[endnoteRef:10] In addition, movement and contextual information are reported to act as mechanisms that create empathy in the process of mind-reading about the affective expressions of others.[endnoteRef:11] Body language in F2F differs in the recognition rate and accuracy of emotions because the range of exposure and the degree of accumulation of emotional and physiological activity change according to the range of bodily expression. Therefore, it is necessary to analyze whether the influence of each nonverbal expression type changes according to emotional type and degree of bodily expression in empathy and relation commitment through emojis in CMC as well. [10: Movement and contextual information not only increase the activation of the senses but also make the description of specific situations possible, enhancing empathy and interaction by enriching the sense of reality and narrative. Relatedly, Mattila (2000) argued that narrative in advertisements forms a favorable advertising attitude by helping consumers' empathy formation and immersion.] [11: Walter (2012) examined previous studies on empathy and suggested Theory of Mind (ToM) and brain circuits related to empathy. According to Walter's brain diagram, cognitive ToM occurs in the high road of the brain, and affective empathy occurs in the low road. Here, the mechanism influencing top-down information processing is semantic information (e.g., contextual information) about the affective state of others, and the mechanism influencing bottom-up information processing is identified as suggestive infectious signals (e.g., movement) (Frith & Frith, 2008).]

Research Questions

Research question 1 (RQ1): Is there difference in empathy and relation commitment according to nonverbal expression type?

Research question 2 (RQ2): Is there difference in empathy and relation commitment according to nonverbal expression type and emotional type?

Research question 3 (RQ3): Is there difference in empathy and relation commitment according to nonverbal expression type and degree of bodily expressions?

Measurement Tools

Nonverbal Expressions of Emojis

In this study, we set the core expressive elements of emojis for empathy and relation commitment in CMC as body language, movement, and contextual information, based on theoretical research and previous studies. In order to identify differences in empathy and relation commitment when movement and contextual information act individually and when they are combined, we set four types[endnoteRef:12] according to the presence or absence of movement and contextual information added to static body language, referring to them as the 'nonverbal expression types' of emojis. In addition, we segmented these nonverbal expression types by emotional type (positive, negative) and degree of bodily expression (upper body, whole body), yielding a total of 16 types, which we refer to as the 'nonverbal expressions' of emojis.[endnoteRef:13] [12: Movement (x) + Contextual information (x) = Static body language type (Type A); Movement (x) + Contextual information (o) = Contextual information type (Type B); Movement (o) + Contextual information (x) = Movement type (Type C); Movement (o) + Contextual information (o) = Movement + contextual information type (Type D)] [13: In CMC, emojis are surrogates for nonverbal expressions, but CMC's media characteristics of temporal and spatial constraints and information transmission and reception should be considered. The emojis selected and transmitted by a sender therefore combine not only the sender's emotional state but also movement and contextual information, as well as a degree of bodily expression. Accordingly, this study operationally defined the complex expressions of emojis, which combine nonverbal expression type (movement, contextual information), emotional type, and degree of bodily expression, as "nonverbal expressions".]
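The stimulus grid described above can be sketched by crossing the four nonverbal expression types with the two emotional types and two degrees of bodily expression, giving the 16 "nonverbal expressions". The labels below are paraphrased from the text; the variable names are illustrative.

```python
# Minimal sketch of the study's 4 x 2 x 2 design: four nonverbal
# expression types (presence/absence of movement and contextual
# information) crossed with emotional type and degree of bodily
# expression, yielding the 16 nonverbal expressions of emojis.
from itertools import product

TYPES = {
    "A": ("static body language",              False, False),
    "B": ("contextual information",            False, True),
    "C": ("movement",                          True,  False),
    "D": ("movement + contextual information", True,  True),
}
EMOTIONS = ("positive", "negative")
BODY = ("upper body", "whole body")

conditions = list(product(TYPES, EMOTIONS, BODY))
print(len(conditions))  # 16 stimulus conditions
```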

Among the emotional types of emojis, positive and negative emotions are the most opposed of the basic human emotions (Russell, 1983). Based on Russell's Circumplex Model, the positive (happy) emotion was defined as pleasant-high activity and the negative (sad) emotion as unpleasant-low activity. For the degree of bodily expression, the upper body was defined as the body from the navel up, including facial expression, and the whole body as both the upper body, including facial expression, and the lower body. This reflects the fact that, owing to the spatial constraints of mobile messengers, most emojis combine facial expressions and gestures rather than showing facial expression alone (Shin & Eune, 2017). The classification is based on Delsarte (Shawn, 1963) and Birdwhistell (1970),[endnoteRef:14] who classified body language semiologically and functionally. [14: Birdwhistell (1970) classified body language into gaze/glance, facial expression, gesture (motion/behavior), and posture based on body units performing their functions through similarity to spoken language. Delsarte, who began studying kinesics in the first half of the 19th century, classified body language into head (mental, intellectual zone), trunk (emotional, moral, spiritual zone), and legs (vital, physical zone) (Shawn, 1963). This shows that body language can be divided into functional and semantic units, and that bodily expressions can be complex when the units of body language are connected.]

Empathy

Empathy was measured using two emotional-empathy questionnaire items from the empathy scales of Mehrabian and Epstein (1972) and Davis (1980), as adapted in the empathy scale of Escalas and Stern (2003); Cronbach's α was 0.885, indicating high reliability.

Relation Commitment

Relation commitment was measured using two questionnaire items concerning the meaning of, and attachment to, the relationship, drawn from the affective commitment items of Allen and Meyer (1990); Cronbach's α was 0.848, indicating high reliability.
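The reliability coefficients reported for both scales follow the standard Cronbach's alpha formula, α = k/(k−1) · (1 − Σ item variances / variance of total score). A minimal sketch for a two-item scale is below; the respondent scores are invented for illustration, and only the formula reflects the standard definition.

```python
# Hedged sketch of Cronbach's alpha for a k-item scale, as reported
# for the empathy (0.885) and relation commitment (0.848) measures.
from statistics import pvariance

def cronbach_alpha(items):
    """items: one list of scores per questionnaire item.

    Note: the variance ratio is the same whether population or sample
    variance is used, as long as the choice is consistent.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent total
    item_var = sum(pvariance(scores) for scores in items)
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Two hypothetical items rated by five respondents on a 5-point scale.
item1 = [4, 5, 3, 4, 2]
item2 = [4, 4, 3, 5, 2]
print(round(cronbach_alpha([item1, item2]), 3))  # 0.894 for this toy data
```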

Test Design and Research Method

Test Subjects and Stimuli

The subjects of this study were university students in Korea who were using the mobile messenger application Kakao Talk.[endnoteRef:15] The stimuli were limited to the 'MUZI & CON' character emojis, which express both positive and negative emotions and are distributed free of charge, from Kakao Friends,[endnoteRef:16] the most popular emoji characters in Kakao Talk. The manipulated stimuli were presented within the UI of the Kakao Talk chat window so that subjects could assume they had received the emojis from friends in Kakao Talk. The questionnaire[endnoteRef:17] was designed to be administered on PC and mobile in order to present a media environment similar to actual emoji motion playback and reception. [15: A summary report on Internet use in 2017 showed that 99.4% of Koreans use Kakao Talk.] [16: Kakao Friends showed the highest purchase rate in a survey on preferences for domestic smartphone messenger emoji characters (Kim, 2015).] [17: https://platform.post-survey.com/preview/previewPageAll.php?key=Pzw0Krbl]

Data Collection and Analysis Method

Data were collected over seven days, from February 28 to March 5, 2019, from 520 respondents who passed the insincere-response screening among those who completed the questionnaire on PC or mobile; the data of 514 participants were analyzed after excluding six who had no experience using emojis. Table 1 shows the demographic characteristics of the subjects. Analysis of variance and t-tests were performed to verify differences in the variables according to the nonverbal expressions of emojis, and STATA 14 was used for data analysis and processing.
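The pairwise group comparisons in the analysis rely on t-tests (run in STATA 14 in the study). A minimal stdlib sketch of Welch's t statistic for two independent groups is below; the sample ratings are invented for illustration and are not the study's data.

```python
# Hedged sketch of Welch's t statistic (unequal-variance t-test) for
# comparing mean ratings of two nonverbal expression types.
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples a and b."""
    va, vb = variance(a), variance(b)        # sample variances
    se = (va / len(a) + vb / len(b)) ** 0.5  # SE of the mean difference
    return (mean(a) - mean(b)) / se

# Hypothetical empathy ratings for Type A (static) and Type C (movement).
type_a = [3.1, 2.8, 3.4, 3.0, 2.9]
type_c = [3.9, 4.1, 3.7, 4.2, 4.0]
print(welch_t(type_c, type_a))  # positive: Type C rated higher
```

In practice the p-value would come from the t distribution with Welch-Satterthwaite degrees of freedom; the sketch stops at the statistic to stay within the standard library.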

[insert Table 1 near here]

Analysis Result

RQ1

The results of t-tests showed that the differences between Types A-C, A-D, and C-D were significant in both empathy and relation commitment, but the difference between Types A-B was not (see Tables 2 and 3). In addition, two-way ANOVA showed that both empathy and relation commitment exhibited Type C > Type A and Type D > Type C for positive and negative emotional types, upper body, and whole body. In other words, movement and movement + contextual information had a positive (+) influence on empathy and relation commitment regardless of emotional type and degree of bodily expression, and movement + contextual information had a more positive (+) effect than either factor acting separately. However, contextual information without movement showed Type B > Type A only for the upper body in empathy, and for all conditions except negative emotion in relation commitment.

[insert Table 2 near here]

[insert Table 3 near here]

This indicates that contextual information plays a positive (+) role in the process of recognizing and understanding emotions, but in the earlier process of perceiving information[endnoteRef:18] it produces excessive information and, in terms of cognitively available information, has a positive (+) effect on empathy only for the upper body (Volkova, Mohler, Dodds, Tesch, & Bülthoff, 2011)[endnoteRef:19]. [18: Comesaña et al. (2013), Jolij and Lamme (2005), and White (1995) explained that the affective evaluation of emojis is an automatic process. In addition, O'Neill (2013) reported finding a mirroring phenomenon as evidence of mimicry of facial expressions. If we regard the cognition process of emojis as mind-reading about nonverbal expression, as in F2F, then the cognitive process of emojis correlates more highly with the mirror neuron-based simulation process than with the mentalizing process. In particular, unlike mentalizing, mirror neuron-based mirroring is a simple and automatic process that occurs subconsciously at the level of perception. Therefore, although the mirroring of emotions may be contextualized and may lead to rich emotional knowledge, contextual information can disturb or interfere with mirroring itself in the process of perception, as verified by the Stroop effect (de Vignemont, 2009).] [19: According to previous studies, it is possible to recognize emotions from movement of the upper body alone, not only the whole body (Volkova, Mohler, Dodds, Tesch, & Bülthoff, 2011, 2014), and the movement of the upper body may be more efficient for emotional recognition in terms of cognitively available information (Glowinski et al., 2011).]

These results show that, even though empathy is the result of an integrated cognitive and affective process (Davis, 1996), empathy through emojis constitutes a cognitive-affective process strongly influenced by cognitive availability. On the other hand, contextual information had a positive (+) effect on relation commitment in all conditions except negative emotions, which have unpleasant valence. Relation commitment is an emotional-behavioral response, and contextual information enhances narrative and the sense of reality in emotional responses such as emotional immersion.

RQ2

The results of two-way ANOVA showed that both the difference in empathy (p < .05) and the difference in relation commitment (p < .1) according to nonverbal expression type were significant only for negative emotion (see Tables 4 and 5). This is because the influence of the nonverbal expression type of the emojis, which directly affects emotional perception and affective processes, was greater for negative emotions: in empathy because of their low emotional activity, and in relation commitment because they are anti-social emotions.

[insert Table 4 near here]

[insert Table 5 near here]

In addition, empathy and relation commitment according to nonverbal expression type and emotional type showed “positive emotions > negative emotions” in all types from Type A to Type D (see Figure 2). In particular, empathy is strongly influenced by cognitive processes; thus the gap between positive and negative emotional types according to emotional activity was maintained across all nonverbal expression types in emotional perception, while in relation commitment the gap was most prominent in Type B.

[insert Figure 2 near here]

In other words, relation commitment, as psychological attachment to a relationship, is influenced by emotional valence (pleasant, unpleasant)[endnoteRef:20] because it is an emotional-behavioral response. As contextual information increases pro-social attitudes for positive emotions and anti-social attitudes for negative emotions, the gap between emotional types was most prominent in Type B. However, in relation commitment, the sense of reality and narrative are more influential than emotional valence, because the gap between emotional types shrinks when they are enriched by nonverbal expressions such as Type C and Type D. [20: Imitation of the behavior of the other party, as a friendly attitude toward the relationship, is bidirectional; in particular, a higher imitation effect was produced by pro-social priming than by anti-social priming (Leighton, Bird, Orsini, & Heyes, 2010).]

RQ3

The results of two-way ANOVA showed that the difference in empathy by degree of bodily expression according to nonverbal expression type was significant (p < .05) for both the upper body and the whole body, but the difference in relation commitment was significant (p < .1) only for the upper body (see Tables 6 and 7). This indicates that the degree of bodily expression of the emojis influences the range of physiological and emotional activity. Thus, in empathy, which is strongly influenced by cognitive availability and emotion recognition rate, the difference was significant for both the upper body and the whole body, while in relation commitment, an emotional-behavioral process, the difference according to nonverbal expression type was significant only for the upper body (lower degree of bodily expression), because the whole body (higher degree of bodily expression) already produced high identification and emotional immersion. In addition, empathy and relation commitment according to nonverbal expression type and degree of bodily expression showed “whole body > upper body” in all nonverbal expression types, except Type B in empathy (an interaction effect[endnoteRef:21]) (see Figure 3). The difference in empathy and relation commitment between the upper body and the whole body was largest in Type A. However, the difference in empathy was insignificant in all nonverbal expression types except Type A, while the difference in relation commitment between the upper body and the whole body gradually decreased moving from Type A to Type D, in other words, as the sense of reality and narrative were enriched by the nonverbal expression types; the difference was smallest in Type D.
[21: When contextual information was added to static body language, empathy showed “upper body > whole body” in Type B, because contextual information had a positive (+) influence for the upper body owing to cognitive availability and a negative (-) influence for the whole body.]

[insert Table 6 near here]

[insert Table 7 near here]

[insert Figure 3 near here]

Conclusion

The analysis results of this study show that the nonverbal expression type producing the highest empathy and relation commitment through emojis is the combined type of movement and contextual information. In other words, the most influential factor in empathy and relation commitment is movement, but when movement is combined with contextual information, its influence is further strengthened. Contextual information without movement mostly shows a negative (-) effect on empathy and a positive (+) effect on relation commitment, but these effects are not significant.

Looking at the detailed characteristics of the variables according to nonverbal expressions, in empathy the difference between emotional types is large in all nonverbal expression types, but the difference between degrees of bodily expression is insignificant in all types except Type A. In relation commitment, on the other hand, the differences between emotional types and between degrees of bodily expression become insignificant when the sense of reality and narrative are enriched, as in Type C and Type D. These results reconfirm, for emojis in CMC, that both empathy and relation commitment are based on emotional processes influenced by movement and contextual information, but that empathy is the combined result of cognitive and emotional processes (Davis, 1996), with large differences between emotional types according to emotional activity, particularly in emotional cognition. Relation commitment, in contrast, was verified to be an emotional-behavioral response in which the influence of movement and contextual information, directly related to the sense of reality and narrative, is more prominent than differences between emotional types or degrees of bodily expression. In particular, in relation commitment, expression of the upper body alone can produce the effect of the whole body if movement and contextual information are combined and the sense of reality and narrative are enriched. The results of this study are expected to serve as a practical guide for the design of emojis and as a basis for future research on relationship building using media such as emojis.


References

Allen, N. J., & Meyer, J. P. (1990). The measurement and antecedents of affective, continuance and normative commitment to the organization. Journal of Occupational Psychology, 63(1), 1–18.

Birdwhistell, R. L. (1970). Kinesics and context. Philadelphia, PA: University of Pennsylvania Press.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14, 698–718.

Boone, R. T., & Cunningham, J. G. (2001). Children’s expression of emotional meaning in music through expressive body movement. Journal of Nonverbal Behavior, 25(1), 21–41.

Clavel, C., Plessier, J., Martin, J.-C., Ach, L., & Morel, B. (2009). Combining facial and postural expressions of emotions in a virtual character. In Z. Ruttkay, M. Kipp, A. Nijholt, & H. H. Vilhjálmsson (Eds.), Intelligent virtual agents (pp. 287–300). Berlin, Heidelberg: Springer.

Cole, M., & Cole, S. R. (1989). The development of children. New York, NY: Scientific American Books.

Comesaña, M., Soares, A. P., Perea, M., Piñeiro, A. P., Fraga, I., & Pinheiro, A. (2013). ERP correlates of masked affective priming with emoticons. Computers in Human Behavior, 29, 588–595.

Davis, M. H. (1980). A multidimensional approach to individual differences in empathy. JSAS Catalog of Selected Documents in Psychology, 10, 85.

Davis, M. H. (1996). Empathy: A social psychological approach. New York, NY: Routledge.

de Gelder, B., Meeren, H. K., Righart, R., van den Stock, J., van de Riet, W. A., & Tamietto, M. (2006). Beyond the face: Exploring rapid influences of context on face processing. Progress in Brain Research, 155, 37–48.

de Vignemont, F. (2009). Drawing the boundary between low-level and high-level mindreading. Philosophical Studies, 144(3), 457–466.

Desclés, J.-P. (1976). Quelques opérations énonciatives. In J. David (Ed.), Logique et niveaux d’analyse linguistique (pp. 213–242). Paris: Klincksieck.

Dey, A. K., Abowd, G. D., & Salber, D. (2001). A conceptual framework and a toolkit for supporting the rapid prototyping of context-aware applications. Human–Computer Interaction, 16(2-4), 97–166.

Escalas, J. E., & Stern, B. B. (2003). Sympathy and empathy: Emotional responses to advertising dramas. Journal of Consumer Research, 29, 566–578.

Frith, C. D., & Frith, U. (2006). The neural basis of mentalizing. Neuron, 50, 531–534.

Glowinski, D., Dael, N., Camurri, A., Volpe, G., Mortillaro, M., & Scherer, K. (2011). Toward a minimal representation of affective gestures. IEEE Transactions on Affective Computing, 2(2), 106–118.

Gnepp, J. (1983). Children's social sensitivity: Inferring emotions from conflicting cues. Developmental Psychology, 19, 805–814.

Jaros, S. J., Jermier, J. M., Koehler, J. W., & Sincich, T. (1993). Effects of continuance, affective, and moral commitment on the withdrawal process: An evaluation of eight structural equation models. Academy of Management Journal, 36, 951–995.

Jolij, J., & Lamme, V. A. F. (2005). Repression of unconscious information by conscious processing: Evidence from affective blindsight induced by transcranial magnetic stimulation. Proceedings of the National Academy of Sciences, 102, 10747–10751.

Karg, M., Kühnlenz, K., & Buss, M. (2010). Recognition of affect based on gait patterns. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 40, 1050–1061.

Kim, S. H. (2015). A study of character design for emoji character of the smart-phone messenger: Focus on formative characteristic of the paid emoji character. (Master’s thesis). Hongik University, Seoul, Korea.

Knapp, M. L. (1978). Nonverbal communication in human interaction. New York, NY: Holt Rinehart and Winston.

Lavidge, R. J., & Steiner, G. A. (1961). A model for predictive measurements of advertising effectiveness. Journal of Marketing, 25(6), 59–62.

Lee, J.-H. (2007). A study of physical movement design to enhance emotional value of a product: With emphasis on the development of emotion-movement framework and emotion palpus. (Master’s thesis). KAIST, Seoul, Korea.

Leighton, J., Bird, G., Orsini, C., & Heyes, C. (2010). Social attitudes modulate automatic imitation. Journal of Experimental Social Psychology, 46(6), 905–910.

Mattila, A. S. (2000). The role of narratives in the advertising of experiential services. Journal of Service Research, 3(1), 35–45.

Mehrabian, A., & Epstein, N. (1972). A measure of emotional empathy. Journal of Personality, 40(4), 525–543.

MSIT. (n.d.). 2017 summarized report on internet use. Retrieved from https://www.msit.go.kr/cms/www/m_con/news/report/__icsFiles/afieldfile/2018/01/31/2017%20%EC%9D%B8%ED%84%B0%EB%84%B7%EC%9D%B4%EC%9A%A9%EC%8B%A4%ED%83%9C%EC%A1%B0%EC%82%AC%20%EC%9A%94%EC%95%BD%EB%B3%B4%EA%B3%A0%EC%84%9C(%EC%B5%9C%EC%A2%85).pdf

O'Neill, B. (2013). Mirror, mirror on the screen, what does all this ASCII mean?: A pilot study of spontaneous facial mirroring of emotions. The Arbutus Review, 4(1), 19.

Quintly. (2017). New Instagram emoji study H1. Retrieved from https://www.quintly.com/blog/instagram-emoji-study

Pollick, F. E., Paterson, H. M., Bruderlin, A., & Sanford, A. J. (2001). Perceiving affect from arm movement. Cognition, 82(2), B51–B61.

Redfern, B. (1965). Introducing Laban art of movement. London: Dance Books, Ltd.

Russell, J. A. (1983). Pancultural aspects of the human conceptual organization of emotions. Journal of Personality and Social Psychology, 45, 1281–1288.

Shawn, T. (1963). Every little movement: A book about François Delsarte. New York, NY: Dance Horizons.

Shin, J. E., & Eune, J. H. (2017). Study on emoji design elements for emotional communication. Bulletin of Korean Society of Basic Design & Art, 18(6).

Spitzberg, B. H. (2006). Toward a theory of computer-mediated communication competence. Journal of Computer-Mediated Communication, 11, 629–666.

Suh, J. Y. (2015). Reconsider the notion of the context in temporal meaning in texte – French and Korean example studies. Association d'Études de la Culture Française et des Arts en France, 53.

Susskind, J. M., Littlewort, G., Bartlett, M. S., Movellan, J., & Anderson, A. K. (2007). Human and computer recognition of facial expressions of emotion. Neuropsychologia, 45(1), 152–162.

Volkova, E. P., Mohler, B. J., Dodds, T. J., Tesch, J., & Bülthoff, H. H. (2014). Emotion categorization of body expressions in narrative scenarios. Frontiers in Psychology, 5(623), 1–11.

Wallbott, H. G. (1998). Bodily expression of emotion. European Journal of Social Psychology, 28, 879–896.

Walter, H. (2012). Social cognitive neuroscience of empathy: Concepts, circuits, and genes. Emotion Review, 4(1), 9–17.

Walther, J. B. (2006). Nonverbal dynamics in computer-mediated communication, or :( and the Net :('s with you, :) and you :) alone. In V. Manusov & M. L. Patterson (Eds.), Handbook of nonverbal communication (pp. 461–479). Thousand Oaks, CA: Sage Publications.

Watzlawick, P., Bavelas, J. B., & Jackson, D. D. (1967). Pragmatics of human communication: A study of interactional patterns, pathologies, and paradoxes. New York, NY: Norton, W.W. & Company.

White, M. (1995). Preattentive analysis of facial expressions of emotion. Cognition and Emotion, 9(5), 439–460.

Wieser, M. J., & Brosch, T. (2012). Faces in context: A review and systematization of contextual influences on affective face processing. Frontiers in Psychology, 3, 471.

Xinhua, K. (2000). Comparative study on body language and linguistic language. Journal of Nanchang Institute of Aeronautical Technology (Social Science), 12, 76–77.

Yin, J. (2014). Body language classification and communicative context. In International conference on education, language, art and intercultural communication (ICELAIC-14). Atlantis Press.

Tables

Table 1

Demographic Characteristics

Classification        Category              N     %
Gender                M                     185   36.38
                      F                     328   63.62
Residential area      Seoul, Gyeonggi       264   51.36
                      Metropolitan area     158   30.74
                      Others                92    17.90

Classification             N     M     SD      min   max
Age                        514   23    1.811   19    27
Grade (university)         514   3     0.957   1     4
Number of daily uses       514   5~9   1.350   <1    ≥15

Table 2

Difference in Empathy According to Nonverbal Expression Type

Comparison           Type (nonverbal expression)                    Obs     M       SD     t value   p value
Type A vs. Type B    A: Movement(x) + Contextual information(x)     2,056   3.143   0.89    1.026    0.3050
                     B: Movement(x) + Contextual information(o)     2,056   3.129   0.89
Type A vs. Type C    A: Movement(x) + Contextual information(x)     2,056   3.143   0.89   -3.289    0.0010
                     C: Movement(o) + Contextual information(x)     2,056   3.190   0.89
Type A vs. Type D    A: Movement(x) + Contextual information(x)     2,056   3.143   0.89   -5.628    0.0000
                     D: Movement(o) + Contextual information(o)     2,056   3.227   0.90
Type C vs. Type D    C: Movement(o) + Contextual information(x)     2,056   3.190   0.89   -2.719    0.0066
                     D: Movement(o) + Contextual information(o)     2,056   3.227   0.90
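As a consistency check, the reported two-tailed p-values in Table 2 can be recovered from the t statistics and the number of observations. The sketch below uses only the Python standard library with a normal approximation to the t distribution, which is accurate at this sample size; the degrees of freedom (n − 1 = 2,055 for a paired comparison) are an assumption, since the table reports only the number of observations.

```python
from statistics import NormalDist

# Reported in Table 2 for the Type A vs. Type C comparison:
# t = -3.289 with 2,056 paired observations.
t_value = -3.289

# Two-tailed p-value; for df = 2,055 the t distribution is
# essentially indistinguishable from the standard normal.
p_value = 2 * (1 - NormalDist().cdf(abs(t_value)))
print(round(p_value, 4))  # close to the reported 0.0010
```

The same calculation reproduces the other rows, e.g. t = -2.719 for Type C vs. Type D gives a p-value near the reported 0.0066.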

Table 3

Difference in Relation Commitment According to Nonverbal Expression Type

Comparison           Type (nonverbal expression)                    Obs     M       SD     t value   p value
Type A vs. Type B    A: Movement(x) + Contextual information(x)     2,056   3.165   0.82   -0.608    0.5430
                     B: Movement(x) + Contextual information(o)     2,056   3.172   0.84
Type A vs. Type C    A: Movement(x) + Contextual information(x)     2,056   3.165   0.82   -3.593    0.0003
                     C: Movement(o) + Contextual information(x)     2,056   3.205   0.82
Type A vs. Type D    A: Movement(x) + Contextual information(x)     2,056   3.165   0.82   -5.909    0.0000
                     D: Movement(o) + Contextual information(o)     2,056   3.234   0.85
Type C vs. Type D    C: Movement(o) + Contextual information(x)     2,056   3.205   0.82   -2.615    0.0090
                     D: Movement(o) + Contextual information(o)     2,056   3.234   0.85

Table 4

Difference in Empathy According to Nonverbal Expression Types and Emotional Types

Type A = Movement(x) + Contextual information(x); Type B = Movement(x) + Contextual information(o); Type C = Movement(o) + Contextual information(x); Type D = Movement(o) + Contextual information(o).

Emotional type   Type A   Type B   Type C   Type D   Sample size   F value   p value
Negative         3.078    3.053    3.150    3.167    4,112         3.83      0.009
Positive         3.208    3.205    3.231    3.287    4,112         1.88      0.130
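The ANOVA p-values in Table 4 can likewise be checked against the reported F statistics. A minimal sketch with SciPy, assuming a one-way ANOVA across the four nonverbal expression types with 4,112 observations in total (so df1 = 4 − 1 and df2 = 4,112 − 4; the table reports only the sample size and F value, so these degrees of freedom are an assumption):

```python
from scipy import stats

# Table 4, negative-emotion row: F = 3.83 across four types,
# 4,112 observations in total.
f_value = 3.83
df1, df2 = 4 - 1, 4112 - 4  # assumed one-way ANOVA df

# Upper-tail p-value of the F distribution.
p_value = stats.f.sf(f_value, df1, df2)
print(round(p_value, 3))  # close to the reported 0.009
```

Under the same assumptions, the positive-emotion row (F = 1.88) yields a p-value near the reported 0.130, consistent with the conclusion that only negative emotions differ significantly across types in empathy.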

Table 5

Difference in Relation Commitment According to Nonverbal Expression Types and Emotional Types

Type A = Movement(x) + Contextual information(x); Type B = Movement(x) + Contextual information(o); Type C = Movement(o) + Contextual information(x); Type D = Movement(o) + Contextual information(o).

Emotional type   Type A   Type B   Type C   Type D   Sample size   F value   p value
Negative         3.156    3.142    3.201    3.233    4,112         2.54      0.055
Positive         3.174    3.202    3.210    3.235    4,112         0.96      0.412

Table 6

Difference in Empathy According to Nonverbal Expression Types and Degree of Bodily Expression

Type A = Movement(x) + Contextual information(x); Type B = Movement(x) + Contextual information(o); Type C = Movement(o) + Contextual information(x); Type D = Movement(o) + Contextual information(o).

Degree of bodily expression   Type A   Type B   Type C   Type D   Sample size   F value   p value
Upper body                    3.122    3.129    3.185    3.219    4,112         2.70      0.044
Whole body                    3.164    3.128    3.195    3.234    4,112         2.66      0.046

Table 7

Difference in Relation Commitment According to Nonverbal Expression Types and Degree of Bodily Expression

Type A = Movement(x) + Contextual information(x); Type B = Movement(x) + Contextual information(o); Type C = Movement(o) + Contextual information(x); Type D = Movement(o) + Contextual information(o).

Degree of bodily expression   Type A   Type B   Type C   Type D   Sample size   F value   p value
Upper body                    3.142    3.154    3.190    3.231    4,112         2.41      0.065
Whole body                    3.189    3.190    3.221    3.237    4,112         0.84      0.474

Figures

Figure 1. Types of degree of bodily expression of emojis in CMC.

Figure 2. Difference in empathy and relation commitment for each emotional type according to nonverbal expression types.

Figure 3. Difference in empathy and relation commitment for each degree of bodily expression according to nonverbal expression types.