Personality affected robotic emotional model with associative memory for human-robot interaction

Naoki Masuyama a, Chu Kiong Loo a,∗, Manjeevan Seera b

a Faculty of Computer Science and Information Technology, University of Malaya, Kuala Lumpur, Malaysia
b Faculty of Engineering, Computing and Science, Swinburne University of Technology (Sarawak Campus), Malaysia
∗ Corresponding author. E-mail address: [email protected] (C.K. Loo).

Article history: Received 19 April 2016; Revised 16 September 2016; Accepted 28 June 2017. Communicated by Bo Shen.

Keywords: Associative memory; Emotional model; Human-robot interaction; Personality

Abstract. The decision making process in communication is affected by internal and external factors from dynamic environments. Unlike robots, humans can perform a variety of behaviors in similar situations. This paper discusses human psychological phenomena during communication from the point of view of internal and external factors, such as perception, memory, and emotional information. Based on these, we introduce a personality affected robotic emotional model and an emotion affected associative memory model for a robot, and we organize an interactive robot system that provides suitable decisions for the robot. Results from interactive communication experiments indicate that the robot is able to perform different actions based on internal and external factors. © 2017 Elsevier B.V. All rights reserved.

1. Introduction

Communication is a fundamental action for humans. In past decades, researchers in the field of psychology have tried to reveal human psychological functions in areas such as neuropsychology, developmental psychology, and cognitive psychology [1,2]. Computer scientists, on the other hand, have attempted to establish human psychological functions on the computer, based on psychological knowledge. Specifically, in order to acquire human functions, research on intelligence, such as neural networks and fuzzy systems [3], and on cognition and perception, such as image processing and voice recognition, has been developed [4].

Facial and gesture expressions, as elements of multi-modal information, play a significant role in focusing attention on subjects while sharing the cognitive environment during human-human interaction. In general, the capability of sharing the cognitive environment with others is vital for smooth communication. The continuity and relevance of subjects are among the other factors expanding the cognitive environment [5]. In human-human interaction, associative memory is a functional and vital brain function for handling the continuity and relevance of subjects. To mimic this effective brain function, several types of artificial neural associative memories with improvements have been introduced and analyzed mathematically for memory capacity, noise tolerance, and network stability [6]. These models, however, do not consider the influence of other functions of the human brain. In human-human communication, the decision making process is affected by logical thinking factors and also by emotional factors [7]. This emotional effect is one of the key differences between a human and a robot. The significance of the interaction between emotion and memory, with its mutual relationships, is discussed in [8].

In general, in communication, the recalled information might change depending on emotional effects. This psychological phenomenon is called the mood-congruency effect [9]. It is presumed that emotional factors play significant roles in human-robot interaction. In other words, we assume that the robot can provide suitable reactions to situations if it performs multi-modal communication based on emotion affected associative memory. In discussing humanity from the psychological point of view, one of the significant elements is personality. The concept of personality is regarded as one of the essential factors of emotional response in the psychological field. In other words, personality gives individual differences among people in behavior patterns and cognitive processes [10]. Based on differences in age and gender, the impact of and reaction to external stimuli can be regarded as diverse [11]. The sensibility of a person is normally predicted from appearance; for example, an elderly person is expected to be calm and a young female to be more emotional. We assume that, depending on the communication partner, a person unconsciously changes his/her behaviors based on the partner's appearance to perform the appropriate responses as a kind of social interaction.


Due to the settings of personality factors, it is possible to arrange a unique robot suited to specific conditions. This results in more active communication between the robot and the human.

If the robot only takes the external stimulus into account, its reactions are always the same, corresponding to the stimulus. In contrast, considering the internal state (the emotional state in this manuscript) as another type of stimulus, the robot is able to perform different reactions based on the history of stimuli. Due to the personality factors in the emotional model, we can easily arrange an individual/unique robot. From the viewpoint of natural and smooth communication, several studies [12,13] have shown that emotional information during communication plays significant roles, such as in behavior selection, which makes the robot more attractive and changes the meaning of the cognitive information.

In this paper, we propose a human-robot interaction system with associative memory that is dominated by a personality affected robotic emotional model. The proposed system is able to handle several modalities in order to communicate with humans. In addition, based on multi-modal inputs, the robotic emotional information is generated, and the information recalled in associative memory changes based on the internal states of the robot. We regard this as a functional implementation of associative memory from the psychological point of view.

The contribution of this paper is the development of a robotic decision making system based on human psychological phenomena during communication. We take into account internal and external factors, such as perception, memory, and emotional information. The proposed system is able to assign a preferable personality to the robot, based on estimated human biometric information. These personality factors make significant differences in the decision making process of the robot, which is defined by the emotion affected associative memory characterized by the mood-congruency effect. As a result, the proposed system is able to provide human-like behaviors for the robot during interactions with humans.

This paper is organized as follows. A literature review on associative memory models, emotional models, and their applications is first presented in Section 2. Section 3 discusses the relationships between memory and emotion in the human brain from the psychology and brain science viewpoints. Section 4 presents the computational models of memory and emotion. In Section 5, the configuration of the proposed interactive robot system and its cognitive intelligence are detailed. Experimental results under several conditions are discussed in Section 6. Concluding remarks are finally given in Section 7.

2. Literature review

In this section, we present representative models of artificial neural associative memory and emotional models for humanoids. In addition, we introduce several interactive robot systems which integrate memory and emotion functions.

2.1. Artificial neural associative memories

Fundamental structures of associative memory, such as the Hopfield Associative Memory (HAM) and the Bi-/Multi-directional Associative Memory (BAM, MAM), were introduced by Hopfield [14], Kosko [15], and Hagiwara [16], respectively. Based on the complex-valued artificial neuron model [17], the Complex-Valued HAM (CHAM), BAM (CBAM), and MAM (CMAM) were introduced by Jankowski [18], Donq-Liang [19], and Kobayashi [20], respectively. These fundamental models, however, suffered from low memory capacity and recall reliability. To improve the abilities of the models, several studies have been introduced. Lee et al. [21] showed that the projection matrix proposed by Personnaz can be generalized to a complex domain. Kobayashi [22] applied a pseudo-relaxation learning algorithm. Complex-valued chaotic behavior models have also been considered, based on real-valued models [23,24]. Though these models show superior abilities, the complexity of their structures is greatly increased.

In another approach, the concept of quantum mechanics is applied to artificial neural networks [25]. Based on the features of quantum mechanics, namely parallelism and unitarity, Rigatos and Tzafestas [26] proposed the Quantum-Inspired Hopfield Associative Memory model (QHAM). This model demonstrates that quantum information processing in neural structures results in an exponential increase in storage capacity, and can explain the extensive memory and inferencing capabilities of humans. It is, however, limited to auto-association and binary state processing. Masuyama and Loo solved these problems of QHAM by introducing a multidirectional architecture and a complex-valued neuron model, called the Quantum-Inspired Complex-Valued Multidirectional Associative Memory (QCMAM) [27].

2.2. Emotional models

Emotion is a complex human function which can be discussed from physiological, cognitive, and motivational processes. Several studies of emotional phenomena in the psychological field operate assuming a fixed number of emotions [28], or regions and prototypical trajectories based on the results of human studies [29]. As a simplified method for the emotional system, Ortony, Clore, and Collins [30] introduced the OCC model to identify the emotional attribution of events, object desirability, and praiseworthiness.

In terms of emotion, personality is an important factor. In the past, several psychologists have discussed the relationships between human emotional factors and personality factors [31,32]. From the viewpoint of behaviors [33], various rule-based models [34] and probabilistic models [35] have been introduced. Costa and McCrae [36] introduced the OCEAN model based on five factors, i.e. openness, conscientiousness, extraversion, agreeableness, and neuroticism. Mehrabian utilized the five factors of personality to represent the Pleasure-Arousal-Dominance (PAD) temperament model [37]. The relationship between the five factors of personality and the PAD model is derived through linear regression analysis [38].

From studies in psychological fields, several computational emotion models have been introduced [39]. Han et al. [40] applied the five factors of personality to a 2D (pleasure-arousal) scaling model, introduced by Russell and Bullock [41], to represent a robotic emotional model. This model generates a robot mood state from human facial expression information. Smith and Petty [42] described that human beings have a tendency to recall emotions from information based on their knowledge and experience. As mentioned in Section 1, human emotional information and behaviors are affected by stimuli from the environment and the communication partner. Naveh et al. [43] studied the effects on the association process of not only the emotional factor, but also the gender and age factors. Rosenthal et al. [44] studied the differences in emotional reactions towards a robot based on specific situations. Chen et al. [45] developed a model that is able to understand intention in human-robot interaction, which is mainly obtained by emotion, with identification information such as age, gender, and nationality.

2.3. Interaction robot systems

It is considered that the associative memory function in the human brain plays a significant role in human-human communication. The association process is affected by internal and external factors,


such as emotion [8]. Conventionally, several types of robot systems, in which the emotion affected associative memory selects the suitable robot behavior, have been introduced to improve human-robot interaction. Hiolle et al. [46] developed a system to elicit care-giving behavior in human-robot interaction with young and adult people using an associative memory model. The robot collects multi-modal information to associate the suitable behavior for the situation. Rumbell et al. [47] discussed and reviewed emotional mechanisms that are often used in artificial agents as a method of improving action selection.

Itoh et al. [48] developed an emotion expression humanoid robot and its interactive system. The system is able to associate the emotional expression for the robot from human behavior using a chaotic complex-valued associative memory whose output is controlled by the mental model and the robot personality. Yi et al. [49] regarded associative memory as essential to realizing man-machine cooperation in natural interaction between human and robot. They developed an emotional robot platform with an associative memory model controlled by dynamic emotional states. Valverde et al. [50] studied and proposed a computational model for associative memory and emotional information, using ideas inspired by neuroscience research into neural-endocrine systems interaction. The model is able to create an emotional memory for the robot. The emotional memory is utilized to predict future emotional states based on past experiences, and it affects the selection of emotional response actions by the robot.

In previous work, we developed an interactive robot system based on QCMAM and a discrete emotion model [51]. In [51], a number of sensitive parameters need to be predefined in order to define the individuality of the robot. In this study, however, we introduce a universal emotion model with personality factors to the robot system, which is detailed in Section 4.2. Furthermore, we integrate biometric information to improve the ability to understand the environment of the interaction space. The details of the proposed interactive robot system are presented in Section 5.

3. Memory and emotion in brain

In this section, the effects of emotion on the memory function in the human brain are introduced from the psychological view. In addition, the functions of the emotion affected associative memory model are discussed.

3.1. Associative memory with emotional effect

Research on emotion and memory in the psychological field considers not only the emotional valence of the stimulus, but also, simultaneously, the emotional state of the subject itself. The pioneering research by Bower [52] demonstrates that the emotional state influences human memory. In the experiment, subjects were artificially set to a Happy or Sad state. Subjects then tried to memorize specific words, associate events from the past, and remember sentences they wrote in a diary. From the series of experiments, it was revealed that when we are feeling a positive emotion, we are likely to recall positive information, depending on individual knowledge and experience. Likewise, we have the tendency to recall negative information during negative feelings.

The relationships between memory and emotion presented in [8] provide a noteworthy perspective on the associative memory function. In short, memory information can be recalled using emotional information. Likewise, emotional information can be recalled from memory information. This kind of psychological phenomenon is known as the mood-congruency effect [9].


In the past, several studies have shown the influence and properties of the mood-congruency effect in human activities, based on the analysis of brain signal processing and several human studies. Lewis et al. [53] assumed that mood-congruent facilitation is due to the mood-related reactivation, at retrieval, of emotional responses which were linked to valenced information at encoding in the associative memory. In the experiment, they presented subjects with positive and negative words and manipulated their mood while monitoring brain activity by fMRI. Pierce and Kensinger [54] studied the effects of emotional valence and arousal on associative connections in a human study using negative, positive, and neutral word pairs. Murray and Kensinger [55] considered the relationships of associative memory and emotion, taking into account the factors of age and gender, by using emotional and non-emotional word pairs, respectively. Egidi and Nusbaum [56] utilized visual and sound information to study the mood-congruency effect in discourse comprehension, based on the analysis of EEG signals during the experiment. Ravaja and Kätsyri [57] revealed the relationships between facial expression and the mood-congruency effect based on facial electromyography analysis. From these studies, the impact and effectiveness of the mood-congruency effect on the association and recall processes with multi-modal information in the human brain can be noted.

3.2. Associative memory based on mood-congruency effect

With regard to memory, Collins and Loftus [58] have insisted that memory can be represented by a topological network based on nodes (memories) and connections between nodes (relationships). From the psychological perspective, the work on emotional context and memory predicts that mood-related memory facilitation can be explained as an associative memory effect [52]. Furthermore, Lewis et al. [53] have reported its validity from the brain science perspective.

Based on the studies in psychology and brain science, memory can be mapped by memory nodes and their connective relationships, as depicted in Fig. 1(a). In Fig. 1(a), each piece of information is represented by a node, and the relationships between nodes are represented by edges. Here, the memory nodes relating to Red Ball are taken as an example. Taking the emotion information into account, the memory can be illustrated as in Fig. 1(b). In general, the emotional attribution of each node is defined by individual knowledge and experience.

From the associative memory viewpoint, the key information (Red Ball) is able to associate the information in connected nodes. In the case of Fig. 1(a), all nodes connected with the Red Ball have the same possibility of being associated. Furthermore, considering the emotion information, each node can be labeled with a positive, negative, or neutral attribution based on the experience, memory, and feelings of individuals, as in Fig. 1(b). In this condition, a different association possibility can be assigned to each node corresponding to the emotion information, based on the function of the mood-congruency effect. For instance, for a human who has the memory of Fig. 1(b) and holds positive emotional information, the nodes Apple and Fruits are more likely to be associated from the key information Red Ball than the nodes Blood and Knife.
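To make this mechanism concrete, the following minimal sketch (hypothetical node labels and attribution values; it is not the implementation used in this paper) weights each node connected to a key by how well its emotional attribution agrees with the current mood, so that congruent nodes dominate recall:

```python
# Hypothetical sketch of mood-congruent association over a semantic network.
# Node labels and attribution weights are illustrative, not from the paper.

MEMORY = {  # key node -> connected nodes with emotional attribution in [-1, 1]
    "Red Ball": {"Apple": +0.8, "Fruits": +0.6, "Blood": -0.9, "Knife": -0.7},
}

def associate(key: str, mood: float) -> list[tuple[str, float]]:
    """Rank neighbors of `key`; `mood` in [-1, 1] biases recall (mood congruency)."""
    neighbors = MEMORY[key]
    scored = {node: max(0.0, 1.0 + mood * attribution)  # congruent nodes score higher
              for node, attribution in neighbors.items()}
    total = sum(scored.values()) or 1.0
    ranked = [(node, s / total) for node, s in scored.items()]
    return sorted(ranked, key=lambda p: -p[1])

print(associate("Red Ball", mood=+0.8))  # Apple/Fruits most likely
print(associate("Red Ball", mood=-0.8))  # Blood/Knife most likely
```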

Fig. 1. Region for Red Ball in association map; (a) without emotional factors, (b) with emotional factors. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)

3.3. Affinity of emotional information and complex-valued associative memory model

In psychology, emotional information can be considered as waveform information [59]; e.g., the emotion state is regarded as a short-term emotional state, and the mood state is regarded as a long-term emotional state, respectively. On the other hand, a number of studies of the human brain indicate that the neurons in the brain

transmit electrical activity, which is the basis of neural oscillation [60]. Several studies have shown that brain activity and emotional information are closely related to each other [61].

As shown in Section 3.1, the association process with a mood-congruency effect can be seen from the signals of brain activities. Thus, it is acceptable to represent the emotional information by the state of the neurons. In terms of artificial neuron models, the complex-valued model is regarded as one of the oscillator models, due to its phase information. In general, the complex-valued model has richer expressive power than the real-valued model. Thus, from the point of view of the functions of neurons in the brain, the complex-valued model, which is able to act as an oscillator model, is a more appropriate model, with greater affinity to the neurons in the brain, than the real-valued model.

Based on these discussions, it is clear that the interactions between memory and emotion functions provide interesting and significant functions for humans. In this paper, we apply the complex-valued associative memory model and the emotional model to simulate a mood-congruency effect for an interactive robot system, presented in Section 5, to improve the humanity and communication ability of the robot. In the system, the emotional information is handled by the phase information of the neurons in the complex-valued associative memory model. Depending on the emotional information, the association process is controlled similarly to a mood-congruency effect.

4. Computational models of memory and emotion

This section presents the computational models of memory and emotion for the proposed interactive robot system. In terms of associative memory, we utilize the QCMAM [27] due to its superior memory capacity and noise tolerance. As regards the emotion model, we introduce the personality affected universal emotion model based on the PAD architecture, which is detailed in Section 4.2.

4.1. Fundamentals of quantum-inspired complex-valued multidirectional associative memory

We utilize QCMAM as the association module for its superior abilities in terms of memory capacity and noise tolerance. The model shows that quantum information processing in neural structures results in an exponential growth in storage capacity. It can also describe the inferencing capabilities and extensive memory of humans. The model applies fuzzy inference to the weight matrix, determined by one-shot, Hebb-like learning, to satisfy parallelism and unitarity [27].

In this section, the details of the Quantum-Inspired Complex-Valued Multidirectional Associative Memory (QCMAM) are presented. In general, a multidirectional model is regarded as a multiple combination of bidirectional models. Thus, the following presentation


focuses on the network between the $\alpha$th layer and the $\beta$th layer. Here, a model with $L$ layers is considered; the associated layer is defined as the $\alpha$th layer, and the other layers are referred to as the $\beta$th layers ($\beta = 1, 2, \ldots, L;\ \alpha \neq \beta$). The neurons in each layer continue to be subject to cyclic updates until the layer reaches an equilibrium. Let $k$ original complex-valued memory pairs $\{X^{(k)}_{(1)}, X^{(k)}_{(2)}, \ldots, X^{(k)}_{(L)}\}$ be stored in QCMAM, where $X = [p_1 + jq_1, p_2 + jq_2, \ldots, p_{N(l)} + jq_{N(l)}]$ $(p, q \in \mathbb{R},\ l = 1, 2, \ldots, L)$, $L$ represents the number of layers, $N(l)$ denotes the number of neurons in the $l$th layer, and $j$ denotes the imaginary unit. Based on the above conditions, QCMAM is formalized as follows:

• $\alpha$th layer to $\beta$th layers:

$$S^{(k)}_{(\beta)} = \sum_{\substack{\alpha=1 \\ \alpha \neq \beta}}^{L} \sum_{j=1}^{N(\beta)} \sum_{i=1}^{M(\alpha)} W^{*}_{ij(\alpha\beta)}\, x^{(k)}_{i(\alpha)}, \quad (\beta = 1, 2, \ldots, L;\ \beta \neq \alpha) \tag{1a}$$

$$X^{(k)}_{(\beta)} = \phi\!\left(S^{(k)}_{(\beta)}\right) \tag{1b}$$

• $\beta$th layers to $\alpha$th layer:

$$U^{(k)}_{(\alpha)} = \sum_{\substack{\beta=1 \\ \beta \neq \alpha}}^{L} \sum_{i=1}^{M(\alpha)} \sum_{j=1}^{N(\beta)} W_{ij(\alpha\beta)}\, x^{(k)}_{j(\beta)}, \quad (\alpha = 1, 2, \ldots, L;\ \alpha \neq \beta) \tag{2a}$$

$$X^{(k)}_{(\alpha)} = \phi\!\left(U^{(k)}_{(\alpha)}\right) \tag{2b}$$

where $S$ and $U$ denote the temporal states of the associated patterns in the $\alpha$th layer and the $\beta$th layers, respectively, and the superscript asterisk denotes the conjugate transpose operation. $\phi(\cdot)$ denotes an activation function based on the complex unit circle depicted in Fig. 2. The activation function for the complex-valued model is formalized as follows:

$$\phi(Z) = \begin{cases} \exp(j 2\pi n / r), & \text{if } \left|\operatorname{Arg}\{Z \exp(j 2\pi n / r)\}\right| < \pi / r \text{ and } Z \neq 0,\\[2pt] \text{previous state}, & \text{if } Z = 0, \end{cases} \tag{3}$$

where $\operatorname{Arg}(\cdot)$ denotes the phase angle, taken to range over $(-\pi, \pi)$, $r$ denotes the quantization value on the complex unit circle, and $n$ is an integer. Assume that $z_0, z_1, \ldots, z_{r-1}$ are the $r$ quantized values; the complex number $Z$ is then mapped to the quantized value closest to it.

Here, the weight connections $W^{*}$ and $W$ are given as follows:

• $\alpha$th layer to $\beta$th layers:

$$W^{*}_{(\alpha\beta)} = \frac{1}{k} \sum_{p=1}^{k} \sum_{j=1}^{N(\beta)} \sum_{i=1}^{M(\alpha)} s^{(p)*}_{j(\beta)}\, s^{(p)}_{i(\alpha)}. \tag{4}$$


Fig. 2. Discrete complex unit circle.


Table 1. Mapping of prototype emotions based on pleasure-arousal-dominance space [37].

Emotion | (Pleasure, Arousal, Dominance)
Happy | (0.81, 0.67, 0.46)
Sad | (−0.63, −0.27, −0.13)
Anger | (−0.51, 0.59, 0.25)
Disgust | (−0.60, 0.35, 0.11)
Surprise | (0.40, 0.67, −0.13)

Table 2. Five factors of personality [36].

Factor | Description
Openness | Open mindedness, interest in culture.
Conscientiousness | Organized, persistent in achieving goals.
Extraversion | Preference for and behavior in social situations.
Agreeableness | Interactions with others.
Neuroticism | Tendency to experience negative thoughts.


• $\beta$th layers to $\alpha$th layer:

$$W_{(\alpha\beta)} = \frac{1}{k} \sum_{p=1}^{k} \sum_{i=1}^{M(\alpha)} \sum_{j=1}^{N(\beta)} s^{(p)*}_{i(\alpha)}\, s^{(p)}_{j(\beta)}. \tag{5}$$

Here, the superscript asterisk denotes the transpose operation, $k$ denotes the number of complex-valued memory pairs, and $s_{(\alpha)}$ and $s_{(\beta)}$ represent the orthonormalized complex-valued memory vectors $X$ in the $\alpha$th layer and $\beta$th layer, respectively, calculated by complex-valued Gram-Schmidt orthogonalization as follows: $a_1 = A_1 / \|A_1\|$ $(p = 1)$, $b_p = A_p - \sum_{i=1}^{p-1} (a_i, A_p)\, a_i$ and $a_p = b_p / \|b_p\|$ $(2 \leq p \leq k)$, where $A$ denotes a complex-valued memory vector, and $a$ and $b$ denote the orthonormalized and the orthogonalized complex-valued vectors, respectively.
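For illustration, the following sketch (a deliberately simplified assumption-laden example, not the authors' implementation) stores complex-valued pattern pairs for two layers using Gram-Schmidt orthonormalization and a Hebb-like weight matrix, then performs one $\alpha$-to-$\beta$ recall step with the phase-quantizing activation of Eq. (3); the quantum formalism of QCMAM is omitted entirely:

```python
import numpy as np

def quantize(z: np.ndarray, r: int) -> np.ndarray:
    """Eq. (3), simplified: snap each value to the nearest of r unit-circle
    states (the Z = 0 'previous state' branch is not modeled here)."""
    n = np.round(np.angle(z) * r / (2 * np.pi))
    return np.exp(1j * 2 * np.pi * n / r)

def gram_schmidt(patterns: np.ndarray) -> np.ndarray:
    """Complex Gram-Schmidt over pattern rows, as used for s^(p)."""
    basis = []
    for a in patterns.astype(complex):
        b = a - sum(np.vdot(u, a) * u for u in basis)
        basis.append(b / np.linalg.norm(b))
    return np.array(basis)

rng = np.random.default_rng(0)
r, k = 4, 3                                 # quantization states, stored pairs
X_alpha = quantize(np.exp(1j * rng.uniform(0, 2 * np.pi, (k, 8))), r)
X_beta = quantize(np.exp(1j * rng.uniform(0, 2 * np.pi, (k, 6))), r)

s_alpha, s_beta = gram_schmidt(X_alpha), gram_schmidt(X_beta)
W = s_beta.T @ s_alpha.conj() / k           # Eq. (4)-like; 1/k leaves phases intact

S_beta = W @ X_alpha[0]                     # Eq. (1a): alpha layer -> beta layer
recalled = quantize(S_beta, r)              # Eq. (1b): quantized activation
assert np.allclose(recalled, X_beta[0])     # the first stored pair is recovered
```

In this toy setup the first stored pair is recalled exactly (later pairs only approximately), because Gram-Schmidt makes the stored directions orthonormal while quantization discards the residual magnitude.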

4.2. Robotic emotional model with personality factors

The psychological background of the relationships between emotion, memory, and personality factors is presented in this section. A three-stage (i.e. core affect, emotion, and mood) robotic emotional model, based on a 3D (pleasure-arousal-dominance) scaling model with the OCEAN model as personality factors, is introduced. In this model, we consider the OCC model to appraise the emotional information of objects, events, praiseworthiness, and desirability, to handle not only human facial expression information but also several other modalities.

4.2.1. Pleasure-arousal-dominance model

Mehrabian [37] proposed the Pleasure-Arousal-Dominance (PAD) model to describe a large variety of emotional states. In this model, pleasure is defined as a positive versus negative affective state, arousal is defined in terms of the level of mental alertness and physical activity, and dominance is defined as the feeling of controlling and influencing others. Based on the results of human studies revealing the relationship between emotional states and the PAD space, the basic emotions are positioned as in Table 1. We utilize the states in Table 1 as the weighting factors for updating the emotion states.

4.2.2. Five factors of personality

Personality is one of the key factors constructing individual differences, such as in perception, motivation, and cognition [62]. Differences in personality influence and intervene in individual psychological phenomena, for instance, perceived emotions and emotional behaviors. In the past, several models of personality have been introduced. One of the widely accepted personality models is the five factors (i.e. openness, conscientiousness, extraversion, agreeableness, and neuroticism) model, proposed by McCrae and Costa [63]. The five factors of personality were created through a statistical procedure, which is used to analyze how ratings of various personality traits are correlated for general humans. Table 2 shows the five factors of personality and their descriptions [36]. The five factors model is used in this paper to represent the robotic personality.

4.2.3. Personality affected emotional factors

The relationships between the five factors of personality and the PAD model are derived through linear regression analysis [38]. This result is summarized as three equations of temperament, covering pleasure, arousal, and dominance, as follows:

$$P_{\alpha} = 0.21E + 0.59A + 0.19N \tag{6}$$

$$P_{\beta} = 0.15O + 0.30A - 0.57N \tag{7}$$

$$P_{\gamma} = 0.25O + 0.17C + 0.60E - 0.32A \tag{8}$$

where $P_{\alpha}$, $P_{\beta}$, and $P_{\gamma}$ represent the values for the pleasure axis ($\alpha$-axis), arousal axis ($\beta$-axis), and dominance axis ($\gamma$-axis), respectively. $O, C, E, A$, and $N$ (where $O, C, E, A, N \in [-1, 1]$) represent the five factors of personality, namely openness, conscientiousness, extraversion, agreeableness, and neuroticism, respectively. We utilize the above factors as the parameters that create individual differences in emotional reactions.
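Eqs. (6)–(8) translate directly into code. The sketch below is a plain transcription; the example values come from Table 6, and since Conscientiousness is not listed there, C = 0.0 is an assumed placeholder:

```python
# Map OCEAN personality factors (each in [-1, 1]) to the PAD temperament
# values P_alpha, P_beta, P_gamma, per Eqs. (6)-(8).

def personality_to_pad(O: float, C: float, E: float, A: float, N: float):
    P_alpha = 0.21 * E + 0.59 * A + 0.19 * N              # pleasure,  Eq. (6)
    P_beta = 0.15 * O + 0.30 * A - 0.57 * N               # arousal,   Eq. (7)
    P_gamma = 0.25 * O + 0.17 * C + 0.60 * E - 0.32 * A   # dominance, Eq. (8)
    return P_alpha, P_beta, P_gamma

# Example with the Type 1 personality of Table 6 (C = 0.0 is assumed).
print(personality_to_pad(O=0.50, C=0.0, E=0.90, A=0.50, N=0.20))
```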

4.2.4. Mathematical descriptions of robotic emotional model

In general, human emotional states are generated not only from facial expressions, but also from several stimuli in the environment. In addition, research in the human psychology field suggests that the human emotional function is the result of the composition of core affect, emotion, and mood states [64]. In this paper, we propose a three-stage (core affect, emotion, and mood) robotic emotional model in which the emotion states are represented in PAD space. In this model, the OCC model [30] is considered for appraising the emotional information of events, objects, desirability, and praiseworthiness, to handle not only human facial expression information but also several other modalities, instead of raw signal analysis, such as the tone and volume of speech, to extract the emotional information [65]. From a different standpoint, the concept of Kansei engineering can be applied for translating feelings and impressions of information into arbitrary parameters [66].


Fig. 3. Configuration of interactive robot system.


The appraisal model based on the OCC model can be expressed as the following vector:

$$\Omega = \begin{bmatrix} \omega_1 \\ \vdots \\ \omega_m \end{bmatrix}, \quad \forall i \in [1, m] : \omega_i \in [0, 1] \tag{9}$$

where $m$ is the number of basic emotions, and $\omega$ is defined based on the OCC model for any information.

First, the system observes multi-modal inputs $I_{MI(t)}$, such as visual, sound, and contextual information:

$$I_{MI(t)} = \begin{bmatrix} u_1 \\ \vdots \\ u_m \end{bmatrix}, \quad \forall i \in [1, m] : u_i \in [0, 1] \tag{10}$$

where $m$ is the number of basic emotions, with each modality having the same vector form. As the next step, the OCC model extracts the emotional intensities of each piece of information as

$$U_{ij(t)} = I_{MI\,i(t)}\, \Omega^{T}_{j(t)} \tag{11}$$

where $\Omega$ denotes the emotional intensities determined by the OCC model, and the superscript $T$ denotes the transpose. The diagonal elements of $U_{ij(t)}$ are utilized as the state of core affect $I_{CA(t)}$:

$$I_{CA(t)} = \begin{bmatrix} U_{11(t)} \\ \vdots \\ U_{mm(t)} \end{bmatrix}, \quad \forall i \in [1, m] : U_{ii} \in [0, 1] \tag{12}$$

In this paper, six basic emotions (i.e. Happy, Sad, Anger, Fear, Disgust, and Surprise) are used. Therefore, the core affect $I_{CA(t)}$ is written as follows:

$$I_{CA(t)} = \begin{bmatrix} ca_H \\ ca_S \\ ca_A \\ ca_F \\ ca_D \\ ca_{Sur} \end{bmatrix} = \begin{bmatrix} \text{intensity of Happy} \\ \text{intensity of Sad} \\ \text{intensity of Anger} \\ \text{intensity of Fear} \\ \text{intensity of Disgust} \\ \text{intensity of Surprise} \end{bmatrix}, \quad ca \in [0, 1] \tag{13}$$

Han et al. [40] proposed interactive robotic emotional variables $(\alpha, \beta)$, which represent the reaction from the current emotional intensities on the pleasure-arousal plane. These variables are based on the neutral, happiness, anger, and sadness intensities. Based on the above concept, we extend the emotional variables to $(\alpha, \beta, \gamma)$ for the PAD model, such that

$$\alpha_{CA_t} = 0.81\,ca_H - 0.63\,ca_S - 0.51\,ca_A - 0.64\,ca_F - 0.60\,ca_D + 0.40\,ca_{Sur} \tag{14}$$

$$\beta_{CA_t} = 0.67\,ca_H - 0.27\,ca_S + 0.59\,ca_A + 0.70\,ca_F - 0.35\,ca_D + 0.67\,ca_{Sur} \tag{15}$$

$$\gamma_{CA_t} = 0.46\,ca_H - 0.33\,ca_S + 0.25\,ca_A - 0.43\,ca_F + 0.11\,ca_D - 0.13\,ca_{Sur} \tag{16}$$

where the variables $\alpha$, $\beta$, and $\gamma$ represent the values for the pleasure axis ($\alpha$-axis), arousal axis ($\beta$-axis), and dominance axis ($\gamma$-axis), respectively. In addition, the coefficient of each emotional intensity is determined by Table 1.
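As a worked illustration of Eqs. (13)–(16), the sketch below projects a core-affect vector onto the PAD axes; the example intensities are the Blue Rectangle row of Table 4:

```python
import numpy as np

# Rows follow the printed coefficients of Eqs. (14)-(16):
# [Happy, Sad, Anger, Fear, Disgust, Surprise] -> (alpha, beta, gamma).
PAD_COEFF = np.array([
    [0.81, -0.63, -0.51, -0.64, -0.60,  0.40],  # alpha: pleasure
    [0.67, -0.27,  0.59,  0.70, -0.35,  0.67],  # beta:  arousal
    [0.46, -0.33,  0.25, -0.43,  0.11, -0.13],  # gamma: dominance
])

def core_affect_to_pad(ca: np.ndarray) -> np.ndarray:
    """ca: intensities in [0, 1] for (H, S, A, F, D, Sur)."""
    return PAD_COEFF @ ca

# Example: a 'Blue Rectangle' stimulus with the intensities of Table 4.
ca = np.array([0.0, 0.6, 0.0, 0.1, 0.1, 0.0])
print(core_affect_to_pad(ca))
```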


The state of emotion $I_{E(t)}$ is calculated for each axis as follows:

$$I_{E\alpha(t)} = \tanh\!\left[\gamma_M \left(I_{E\alpha(t-a)} + P_{\alpha} \cdot \alpha_{CA(t-a)}\right)\right] \tag{17}$$

$$I_{E\beta(t)} = \tanh\!\left[\gamma_M \left(I_{E\beta(t-a)} + P_{\beta} \cdot \beta_{CA(t-a)}\right)\right] \tag{18}$$

$$I_{E\gamma(t)} = \tanh\!\left[\gamma_M \left(I_{E\gamma(t-a)} + P_{\gamma} \cdot \gamma_{CA(t-a)}\right)\right] \tag{19}$$

Here, $E_{\alpha}$, $E_{\beta}$, and $E_{\gamma}$ denote the pleasure, arousal, and dominance axes, respectively. $\gamma_M$ ($0 < \gamma_M \leq 1.0$) is the suppression rate from the mood state. Depending on the combination of the current mood and core affect, the value of $\gamma_M$ changes (e.g., if the mood and the emotional attribution are both positive, $\gamma_M$ takes a high value; if the mood is positive but the emotional attribution is negative, $\gamma_M$ takes a low value). $a$ takes an arbitrary value as a time delay. $P_{\alpha}$, $P_{\beta}$, and $P_{\gamma}$ are defined by Eqs. (6), (7), and (8), respectively. The state $I_{E(t)}$ is mapped onto the PAD space, and it moves to a specific position based on the effect of the core affect. For instance, if the state of $I_{E(t)}$ is close to the position $(\alpha, \beta, \gamma) = (0.81, 0.67, 0.46)$, the intensity of Happy is higher than that of the other emotions.

In general, the mood state takes a positive or negative state [67]. We assume that it can be represented on the pleasure axis in Table 1. Therefore, the emotion-mood transfer coefficient $\alpha_{E_t}$ is defined as follows:

$$\alpha_{E_t} = I_{E\alpha(t)} \tag{20}$$

Finally, the mood state is determined as follows:

$$I_{M(t)} = \tanh\!\left[\gamma_M \left(I_{M(t-a)} + P_{\alpha} \cdot \alpha_{E(t-a)}\right)\right] \tag{21}$$

where $\gamma_M$ ($0 < \gamma_M \leq 1.0$) is the suppression rate from the mood state, the variable $a$ takes an arbitrary value as a time delay, and $P_{\alpha}$ is defined as in Eq. (6).
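A minimal sketch of the three-stage update, assuming scalar time steps and the suppression values of Table 5 (function names are illustrative, not from the paper):

```python
import numpy as np

# Sketch of Eqs. (17)-(21): update the PAD emotion state, then the mood scalar.
# `pad_personality` is (P_alpha, P_beta, P_gamma) from Eqs. (6)-(8);
# `pad_core_affect` is (alpha, beta, gamma)_CA from Eqs. (14)-(16).

def update_emotion(emotion: np.ndarray, pad_core_affect: np.ndarray,
                   pad_personality: np.ndarray, gamma_m: float) -> np.ndarray:
    """Eqs. (17)-(19), element-wise over the pleasure/arousal/dominance axes."""
    return np.tanh(gamma_m * (emotion + pad_personality * pad_core_affect))

def update_mood(mood: float, emotion: np.ndarray,
                p_alpha: float, gamma_m: float) -> float:
    """Eqs. (20)-(21): the pleasure component of the emotion drives the mood."""
    alpha_e = emotion[0]                      # Eq. (20)
    return float(np.tanh(gamma_m * (mood + p_alpha * alpha_e)))

def suppression_rate(mood: float, alpha_ca: float) -> float:
    """Table 5: congruent mood/input -> 0.95, incongruent -> 0.60."""
    return 0.95 if (mood >= 0) == (alpha_ca >= 0) else 0.60
```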

5. Interactive robot system

This section presents the details of the proposed interactive robot system. The main objective of the proposed system is the development of a robotic decision making system based on the mood-congruency effect, combining associative memory and emotional information. We discussed the importance and usefulness of the mood-congruency effect, and the affinity between emotional information and the complex-valued model, in Section 3. Based on the above discussion, we developed the robot system depicted in Fig. 3.


Table 3. Comparison of conventional human-robot interaction systems.

System | Emotion model | Mood state | Personality model | Input modality | Biometric information | Association model | Functions of robot
Proposed system | PAD model (universal emotion) | Positive and negative | OCEAN model | Gesture, object, voice, facial exp. | Age, gender | Multidirectional (one-to-many) | Facial display and arms
Previous work [51] | Ekman model (6 basic emotions) | Positive and negative | None | Gesture, object, voice, facial exp. | None | Bidirectional (one-to-one) | Facial display and arms
Yorita and Kubota [69] | None | None | None | Gesture, object, voice | None | Multidirectional (many-to-one) | Facial display and arms
Hiolle et al. [46] | Arousal model (self-defined) | None | None | Touch, gesture | None | Multidirectional (many-to-one) | Animal robot
Itoh et al. [48] | Pleasant-arousal-certainty model (self-defined) | Positive and negative | None | Voice | None | Bidirectional (one-to-one) | Upper body of humanoid (WE-4RII)
Yi et al. [49] | Emotional energy model (self-defined) | Positive and negative | None | Object, voice | None | Bidirectional | Facial robot

Fig. 4. iPhonoid and example of face templates.


5.1. System configuration

The system is composed of the robot, a Microsoft Kinect, a microphone, and a server Personal Computer (PC). We utilize an iPhonoid as the robot, developed by Kubota and Toda [68]. The robot is made up of an iPhone with four servo motors. It simultaneously outputs audio and a display, together with arm motion (Fig. 4(a)).

From the multi-modal information, several cognitive intelligences extract the meaningful information, which is then utilized in the emotion model and the associative memory model. Specifically, facial expression, object, gesture, voice, and biometric information are extracted as the cognitive information from the raw data captured by the Kinect and the microphone. Based on the cognitive information, the emotion states of the robot are generated depending on the personality factors. Finally, utilizing the cognitive information and emotion states, the associative memory model recalls the robot behaviors based on the predefined relationships. Brief descriptions of the cognitive intelligences are presented in Section 5.2.

The majority of conventional human-robot interaction systems focus on a specific modality, such as gesture-based, voice-based, facial expression based, or combined voice and face-based systems. Therefore, these systems are only able to handle limited situations in communication. Most systems focus on detecting human emotional information to select the actions of the robot, not on generating emotional information for the robot. The proposed system, on the other hand, is able to generate universal emotion information for the robot from the multi-modal and biometric information to select the actions of the robot. The proposed system also accepts other, more sophisticated cognitive intelligences, due to the definitions of the emotional model and the associative memory model. The functional differences between conventional systems and the proposed system are summarized in Table 3.

The ability to handle a large amount of multi-modal information is a significant factor in evaluating the usefulness and adaptability of an interactive robot system. According to the comparison in Table 3, the proposed system has better functionality in both information cognition and processing. Furthermore, in order to emphasize the justification of the proposed system, we provided a discussion of the significance of reproducing the mutual relationship of memory and emotion from the psychology and brain science perspective in Section 3. Therefore, we regard the proposed system as having wider applicability than other systems.

5.2. Cognitive intelligence for robot

We utilize a number of existing studies to collect the external information for the proposed system. In this section, the cognitive intelligences for the multi-modal information (object, gesture, facial expression, and voice) and the biometric information (age and gender) are briefly introduced.

5.2.1. Object recognition

With regard to object recognition, various types of algorithms have been proposed, such as template matching methods based on dynamic programming (DP) [70], cellular neural networks [71], and genetic algorithms (GA) [72]. While a cellular neural network requires exact templates for the target, DP and GA can detect targets depending on similarity or distance based cost functions, as an optimization problem. Basically, GA can be divided into the generational model (standard GA) and the steady-state model (SSGA). SSGA partially replaces a few individuals with offspring in a generation, rather than all individuals as in standard GA. In particular, SSGA is suitable for solving optimization problems in dynamic or changing environments [73].

In general, the interactions are performed under dynamic environments. Furthermore, we utilize only color information, not topological information, for object recognition. Thus, we applied SSGA based object recognition [68] in the proposed system.

5.2.2. Gesture recognition

In general, the context of a hand gesture is quite flexible, depending on the hand speed and its movement. Thus, the significance of gesture recognition can be regarded as extracting the spatio-temporal information from the hand movements. It is well known that a Spiking Neural Network (SNN) is able to handle spatial and temporal context [74]. In addition, due to the spike response model, it is able to reduce the computational cost compared with the integrate-and-fire model. We utilize the SNN based hand gesture recognition method introduced by Kubota and Toda [68] in the proposed system.

Table 4. Emotional intensity Ω of multi-modal information.

Attribution | Happy | Sad | Anger | Fear | Disgust | Surprise
Object: Red Circle | 0.7 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1
Object: Blue Rectangle | 0.0 | 0.6 | 0.0 | 0.1 | 0.1 | 0.0
Gesture: Circle | 0.6 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1
Gesture: Bye-Bye | 0.0 | 0.1 | 0.0 | 0.7 | 0.1 | 0.0
Voice: Positive | 0.7 | 0.0 | 0.0 | 0.0 | 0.1 | 0.0
Voice: Negative | 0.0 | 0.1 | 0.0 | 0.1 | 0.7 | 0.0
Facial exp.: Happy Face | 0.7 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1
Facial exp.: Sad Face | 0.0 | 0.7 | 0.0 | 0.1 | 0.1 | 0.0

Table 5. Parameter settings of the suppression ratio γM.

γM | Positive mood | Negative mood
Positive input | 0.95 | 0.60
Negative input | 0.60 | 0.95

Table 6. Four types of personality.

Factor | Type 1 | Type 2 | Type 3 | Type 4
Openness | 0.50 | 0.90 | 0.50 | 0.20
Extraversion | 0.90 | 0.70 | 0.20 | 0.10
Agreeableness | 0.50 | 0.50 | 0.20 | 0.30
Neuroticism | 0.20 | 0.30 | −0.40 | −0.40

Table 7. Determination of personality type based on biometric information.

Age \ Gender | Male | Female
Young | Type 1 | Type 2
Adult | Type 3 | Type 4

5.2.3. Facial expression recognition

Facial expression plays an important role in communication, expressing one's emotional state directly. The robot system in this paper utilizes a Constrained Local Model (CLM) [75] based facial feature tracking framework [76]. The framework applies two clustering algorithms, namely LeaderP [77] and the Topological Gaussian Adaptive Resonance Theory algorithm (TGART) [78], for patch clustering and shape clustering, respectively. The CLM with the above clustering algorithms provides dynamic human facial features with superior accuracy, reducing recognition errors throughout tracking.

5.2.4. Voice recognition

Verbal communication is essential for humans, and various studies on it have been introduced in the past. One established open source software package is Julius [79], which runs in real time. In a 20,000-word reading test, its recognition accuracy rate was over 90%. In this paper, Julius is applied with a Japanese language model for voice recognition.

5.2.5. Biometric recognition

Biometric information is a significant factor in human emotional information and behaviors. Specifically, we have a tendency to change our own attitudes/reactions based on the facial appearance of the communication partner. Thus, we regard this ability as supporting the acquisition of unique characteristics for the robot. Eidinger et al. [80] developed an age and gender estimation algorithm. The algorithm applies Local Binary Patterns (LBP) to extract features, and classification is performed using a standard linear Support Vector Machine (SVM) trained on the LBP feature vectors. Eidinger et al. [80] report extensive tests analyzing both the difficulty levels of contemporary benchmarks and the capabilities of their algorithm, showing it to outperform the state of the art by a wide margin. In this paper, the algorithm is customized to detect only male/female as gender and young/adult as age, for a minimum configuration.

6. Experiment of interactive robot system

This section presents the experimental results of the proposed interactive robot system. The proposed system performs a number of behaviors according to the robot personality and the biometric information of the communication partner. The following subsections describe the experimental conditions related to the emotional model and associative memory. Due to the limitations of the cognitive abilities of the system, the practical relationships between each piece of information or movement are ignored. Thus, simple symbolic information is utilized for association. In practical terms, relationships would be defined from actual information, facts, common sense, or personal intensities. Here, we consider that the main focus of this experiment is how the association result from associative memory changes depending on the emotional factors and the biometric information of the communication partner, which is characterized as a mood-congruency effect.

The experiment is divided into two parts: first, the processing of multi-modal information into emotional information in the emotion model is simulated. Next, based on the multi-modal information and the emotional information that come from the first part, the association process is performed to determine the robot behaviors.

6.1. Experimental conditions

6.1.1. Conditions of personality affected emotional model

The behavior of the robot is controlled by the modules of the robotic emotional model and associative memory, based on multi-modal information and biometric information, as shown in Fig. 3.

In each module, there are several pieces of predefined information. Each multi-modal input is assumed to be assigned a specific emotional intensity $\Omega$ by the OCC model, as in Table 4. The suppression ratio $\gamma_M$ is defined as in Table 5. Four types of personalities are defined as in Table 6. These personality factors are used to calculate Eqs. (6) and (7). In the proposed system, the biometric information determines the type of personality factors, as in Table 7. Here, we show four types of results based on personality.

Note that the parameters in Tables 4–7 are all involved in the emotion model. It is worth noting that in order to define a unique robot, we need only change the four personality parameters in Table 6.
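The selection in Table 7 reduces to a simple lookup over the Table 6 types. A minimal sketch (hypothetical helper names; Conscientiousness is absent from Table 6, so 0.0 is an assumed placeholder):

```python
# Table 7 as a lookup from estimated biometrics to a Table 6 personality type.
PERSONALITY = {  # type -> (O, C, E, A, N); C = 0.0 is an assumed placeholder
    1: (0.50, 0.0, 0.90, 0.50, 0.20),
    2: (0.90, 0.0, 0.70, 0.50, 0.30),
    3: (0.50, 0.0, 0.20, 0.20, -0.40),
    4: (0.20, 0.0, 0.10, 0.30, -0.40),
}
TYPE_BY_BIOMETRICS = {  # (age, gender) -> personality type, per Table 7
    ("young", "male"): 1, ("young", "female"): 2,
    ("adult", "male"): 3, ("adult", "female"): 4,
}

def personality_for(age: str, gender: str):
    return PERSONALITY[TYPE_BY_BIOMETRICS[(age, gender)]]
```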

6.1.2. Conditions of association process for robot behaviors

In this section, all inputs of the associative memory module are provided from the emotional model of the previous section, namely Figs. 6(a) and 8. We utilize only male/female and young/adult as the biometric information, for simplicity of conditions. In the associative memory module, the relationships between multi-modal information and robot actions are defined as shown in Table 8. Each relationship is labeled by an ID for associative memory (A.M._ID). Depending on the association result and the mood state, the robot performs several behaviors identified by an Act._ID, as shown in Table 9.


Table 8. Information relationships for QCMAM (A.M._ID).

A.M._ID | Mood | Input attribution | Input | Assoc. object | Assoc. gesture | Assoc. voice | Act._ID
0 | P/N | – | No input | – | – | – | 0
1 | Positive | Object | Red Circle | – | Circle | Happy | 1
2 | Negative | Object | Red Circle | – | Circle | Sad | 3
3 | Positive | Object | Blue Rectangle | – | Bye-Bye | Happy | 2
4 | Negative | Object | Blue Rectangle | – | Bye-Bye | Sad | 4
5 | Positive | Gesture | Circle | Red Circle | – | Happy | 1
6 | Negative | Gesture | Circle | Red Circle | – | Sad | 3
7 | Positive | Gesture | Bye-Bye | Blue Rectangle | – | Happy | 2
8 | Negative | Gesture | Bye-Bye | Blue Rectangle | – | Sad | 4
9 | Positive | Voice | Happy | Red Circle | Circle | – | 1
10 | Negative | Voice | Happy | Red Circle | Circle | – | 3
11 | Positive | Voice | Sad | Blue Rectangle | Bye-Bye | – | 2
12 | Negative | Voice | Sad | Blue Rectangle | Bye-Bye | – | 4

Fig. 5. Example of robot actions with a neutral face.

Table 9
Definitions of robot action ID (Act._ID).

Act._ID | Face    | Gesture | Voice
0       | Neutral | –       | –
1       | Happy   | Circle  | Happy
2       | Sad     | Up&Down | –
3       | Happy   | Up&Down | –
4       | Sad     | Bye-Bye | Sad


Table 10
Definitions of input information ID (IN_ID).

IN_ID | Face    | Object         | Gesture  | Voice
0     | Neutral | No info.       | No info. | No info.
1     | Happy   | Red Circle     | Circle   | Happy
2     | Sad     | Blue Rectangle | Bye-Bye  | Sad


For instance, when the robot recognizes a "Red Circle" as object information under a positive mood state, it recalls "Circle" and "Happy" as the gestural and voice information, respectively. The robot then performs a circle-wise gesture with its arm and utters "Happy" while displaying a happy face, i.e., Act._ID 1 in Table 9.

Fig. 5 shows examples of gestural actions by the robot. In this experiment, we assigned a specific ID (IN_ID) to each multi-modal input, as in Table 10, for visualization. Information belonging to IN_ID 0 has a neutral attribution, IN_ID 1 a positive emotional attribution, and IN_ID 2 a negative emotional attribution.
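The recall itself is performed by the QCMAM network; purely for illustration, the predefined relationships of Tables 8 and 9 can be mirrored by a simple lookup, as in the following sketch (the function and variable names are ours, not from the paper's implementation):

```python
# Sketch of the associative lookup defined by Tables 8 and 9.
# Key: (mood, input attribution, input value) -> (A.M._ID, Act._ID).
# Entries are transcribed from Table 8; action bodies from Table 9.
ASSOCIATIONS = {
    ("Positive", "Object",  "Red Circle"):     (1, 1),
    ("Negative", "Object",  "Red Circle"):     (2, 3),
    ("Positive", "Object",  "Blue Rectangle"): (3, 2),
    ("Negative", "Object",  "Blue Rectangle"): (4, 4),
    ("Positive", "Gesture", "Circle"):         (5, 1),
    ("Negative", "Gesture", "Circle"):         (6, 3),
    ("Positive", "Gesture", "Bye-Bye"):        (7, 2),
    ("Negative", "Gesture", "Bye-Bye"):        (8, 4),
    ("Positive", "Voice",   "Happy"):          (9, 1),
    ("Negative", "Voice",   "Happy"):          (10, 3),
    ("Positive", "Voice",   "Sad"):            (11, 2),
    ("Negative", "Voice",   "Sad"):            (12, 4),
}

# Robot action bodies from Table 9: Act._ID -> (face, gesture, voice).
ACTIONS = {
    0: ("Neutral", None, None),
    1: ("Happy", "Circle", "Happy"),
    2: ("Sad", "Up&Down", None),
    3: ("Happy", "Up&Down", None),
    4: ("Sad", "Bye-Bye", "Sad"),
}

def associate(mood: str, attribution: str, value: str):
    """Look up the association result and the corresponding robot action."""
    am_id, act_id = ASSOCIATIONS.get((mood, attribution, value), (0, 0))
    return am_id, ACTIONS[act_id]
```

For example, `associate("Positive", "Object", "Red Circle")` returns A.M._ID 1 and the Act._ID 1 action, matching the behavior described above, while the same input under a negative mood returns A.M._ID 2 and the Act._ID 3 action.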

Fig. 6. Internal states of emotional model. (a) Multi-modal inputs and (b) Core affect.


6.2. Experimental results

6.2.1. Results of emotion information processing in personality affected emotional model

The history of multi-modal information is shown in Fig. 6(a). In Fig. 6(a), IN_ID denotes the type of multi-modal information, as defined in Table 10.

Fig. 6(b) shows the intensity of core affect, calculated from the pre-defined parameters in Table 4. As an example, the first stimulus in Fig. 6(b) has a core affect intensity of {Sad: 0.6, Fear: 0.1, Disgust: 0.1}, calculated from the parameters in Table 4 corresponding to the Blue Rectangle (the first stimulus in Fig. 6(a)).

Based on the intensity of core affect, the emotion states are generated as in Fig. 7. Due to the influence of the personality factors, four types of results arise from the same intensity of core affect. Furthermore, from the emotion information, the mood states are generated as in Fig. 8, corresponding to the four types of personality.

As defined in Tables 6 and 7, we assume that the young woman has rich emotional sensitivity and high adaptability to the environment, while the adult man has a calm character and tolerance to influence from the environment. From the definitions of the personality factors in Table 6, we assume that the Type 1 and 2 personalities have similar properties, as do Type 3 and 4. Comparing the Type 2 and 3 personalities in Fig. 7(b) and (c), because Type 2 has higher values of Openness and Extraversion than



Fig. 7. Trajectory of emotion states based on four types of personality.

Fig. 8. Trajectory of mood states based on four types of personality.


Type 3, the Type 2 personality shows a wider range of trajectory than Type 3 for the same input information.

The mood state is calculated from the emotion state by Eq. (21). Fig. 8 shows the trajectory of the mood state. Due to the differences in emotion states and personality factors, the mood state of each type also produces a different output. As mentioned, the Type 2 personality has a sensitive tendency toward stimuli; thus, the mood state of Type 2 changes quickly and sharply. In contrast, the Type 3 personality shows resistance to stimuli.


Therefore, the mood state of the Type 3 personality changes over a longer period than those of the other types. Furthermore, the personality factors produce a noteworthy feature of the emotional model, visible at ranges (i), (ii) and (iii) in Fig. 8. In these ranges, the mood states of Types 1 and 2 are positive, while those of Types 3 and 4 are negative. Due to this difference in mood state, the association process is affected, as shown in Section 6.2.2. The positions (a) to (f) are also discussed in Section 6.2.2.
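Since Eq. (21) is not reproduced in this section, the following leaky-integrator sketch is only an assumed form, chosen to reproduce the qualitative contrast described above between the sensitive Type 2 and the resistant Type 3 personalities; the gain and decay values are illustrative, not taken from the paper.

```python
# Hedged sketch of personality-dependent mood dynamics (assumed form,
# standing in for Eq. (21)): a "sensitive" personality (high gain, fast
# decay) reacts quickly and strongly, a "resistant" one (low gain, slow
# decay) changes over a longer period. Numeric values are illustrative.

def update_mood(mood: float, emotion: float, gain: float, decay: float) -> float:
    """One step of a leaky-integrator mood update (assumed form)."""
    return (1.0 - decay) * mood + gain * emotion

# Illustrative parameter pairs for a sensitive vs. a resistant personality.
PERSONALITY_PARAMS = {"Type 2": (0.8, 0.5), "Type 3": (0.2, 0.05)}

mood = {"Type 2": 0.0, "Type 3": 0.0}
for emotion in [1.0, 1.0, -1.0, -1.0]:  # toy emotion-state sequence
    for ptype, (gain, decay) in PERSONALITY_PARAMS.items():
        mood[ptype] = update_mood(mood[ptype], emotion, gain, decay)
print(mood)  # the Type 2 mood swings far more than the Type 3 mood
```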


Fig. 9. Association results of information relationships and corresponding robot actions.


From the results, it is clear that the personality factors play an important role in producing different emotional properties, and that the functions of personality have been successfully integrated into the proposed emotional model. We consider that the robot is able to change its emotional reactions based on the appearance of the communication partner, in order to provide appropriate responses.

6.2.2. Results of robot interaction based on mood-congruency effect

This section presents the association process for the robot action depending on the emotional factor. Fig. 9(a) shows the association results (A.M._ID) of the associative memory module, and Fig. 9(b) shows the corresponding robot actions (Act._ID), respectively. The associations are performed using the multi-modal inputs and emotional information, based on the predefined relationships in Table 8. In Fig. 9(a) and (b), the positions (a) to (f) are plotted at the same step points as in Fig. 8, and these positions fall within ranges (i), (ii) and (iii) of Fig. 8, respectively.

Due to the differences in mood attribution, the association results are affected. From the definitions of the personality factors in Table 6, it is assumed that the Type 1 and 2 personalities have similar properties, as do Type 3 and 4. Thus, at positions (a), (c), (d) and (e), the Type 1 and 2 personalities show the same association results, which differ from those of Type 3 and 4. In particular, at positions (b) and (f), the Type 3 personality shows unique results. At the same time, the corresponding robot actions are also affected, as shown in Fig. 9(b).

From these results, it can be considered that differences in personality yield different robot behaviors. It is assumed that, if suitable and realistic information relationships can be prepared, the robot will be able to provide preferable reactions in a variety of situations.
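In terms of the lookup sketch of Section 6.1.2, the mood-congruency effect at these positions amounts to the same stimulus mapping to different associations once the mood attribution flips:

```python
# Same multi-modal input, opposite mood attributions (using the
# `associate` sketch above): the association result and action change.
print(associate("Positive", "Object", "Red Circle"))  # A.M._ID 1, Act._ID 1 action
print(associate("Negative", "Object", "Red Circle"))  # A.M._ID 2, Act._ID 3 action
```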


7. Conclusions

In this paper, the relationships between associative memory and emotional factors, and their effect on communication, are discussed from the point of view of psychology. Cognitive intelligences, an associative memory model, and a personality affected emotional model are introduced for the interactive robot system. The experimental results indicate that the robot is able to generate different emotional responses from multi-modal information depending on the personality factors. In addition, the results also show that the output of the associative memory is affected by emotional factors; thus, an individual robot can perform various types of responses. Even though the experimental conditions are not practical ones (symbolic information and simple information relationships were applied), the system shows that it can generate different emotional information and responses from multi-modal information based on the predefined conditions. We assume that, if the cognitive intelligences are well developed, the system could be applied to complex practical information, facts, and common sense. As a result, the interactions between human and robot would become more natural and active. For future work, an interaction system with multiple robots will be investigated; in this situation, we will be able to assign specific roles and characters to individual robots.

Acknowledgments

This research is supported by the Fellowship Scheme under High Impact Research UM.C/625/1/HIR/MOHE/FCSIT/10 and UM Grand Challenge Grant GC003A-14HTM from the University of Malaya.



References

[1] M. Minsky, The Society of Mind, Simon and Schuster, New York, 1986.
[2] R. Pfeifer, C. Scheier, Understanding Intelligence, The MIT Press, 1999.
[3] A.J. Maren, C.T. Harston, R.M. Pap, Handbook of Neural Computing Applications, Academic Press, 2014.
[4] L. Besacier, E. Barnard, A. Karpov, T. Schultz, Automatic speech recognition for under-resourced languages: a survey, Speech Commun. 56 (2014) 85–100.
[5] D. Sperber, D. Wilson, Relevance – Communication and Cognition, Oxford University Press, 1995.
[6] H. Zhang, Z. Wang, D. Liu, A comprehensive review of stability analysis of continuous-time recurrent neural networks, IEEE Trans. Neural Netw. Learn. Syst. 25 (7) (2014) 1229–1262.
[7] A. Bechara, H. Damasio, A.R. Damasio, Emotion, decision making and the orbitofrontal cortex, Cereb. Cortex 10 (3) (2000) 295–307.
[8] D.E. Reisberg, P.E. Hertel, Memory and Emotion, Oxford University Press, 2004.
[9] P. Ekkekakis, The Measurement of Affect, Mood, and Emotion: A Guide for Health-behavioral Research, Cambridge University Press, 2013.
[10] W. Mischel, Introduction to Personality, vol. 250, London, 1993.
[11] K.J. Michalska, K.D. Kinzler, J. Decety, Age-related sex differences in explicit measures of empathy do not predict brain responses across childhood and adolescence, Dev. Cognit. Neurosci. 3 (2013) 22–32.
[12] S.C. Gadanho, Learning behavior-selection by emotions and cognition in a multi-goal robot task, J. Mach. Learn. Res. 4 (Jul) (2003) 385–412.
[13] J.-M. Fellous, M.A. Arbib, Who Needs Emotions?: The Brain Meets the Robot, Oxford University Press, 2005.
[14] J.J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences 79 (8) (1982) 2554–2558.
[15] B. Kosko, Constructing an associative memory, Byte 12 (10) (1987) 137–144.
[16] M. Hagiwara, Multidirectional associative memory, in: Proceedings of the International Joint Conference on Neural Networks, vol. 1, 1990, pp. 3–6.
[17] N. Aizenberg, Y.L. Ivaskiv, D. Pospelov, G. Hudiakov, Multiple-valued threshold functions. II. Synthesis of the multi-valued threshold elements, Kibernetika (Cybern.) 1 (1973) 53–66.
[18] S. Jankowski, A. Lozowski, J. Zurada, Complex-valued multistate neural associative memory, IEEE Trans. Neural Netw. 7 (6) (1996) 1491–1496.
[19] L. Donq-Liang, W. Wen-June, A multivalued bidirectional associative memory operating on a complex domain, Neural Netw. 11 (9) (1998) 1623–1635.
[20] M. Kobayashi, H. Yamazaki, Complex-valued multidirectional associative memory, Electr. Eng. Jpn. 159 (1) (2007) 39–45.
[21] D.L. Lee, Improvements of complex-valued Hopfield associative memory by using generalized projection rules, IEEE Trans. Neural Netw. 17 (5) (2006) 1341–1347.
[22] M. Kobayashi, Pseudo-relaxation learning algorithm for complex-valued associative memory, Int. J. Neural Syst. 18 (02) (2008) 147–156.
[23] R.S. Lee, A transient-chaotic autoassociative network (TCAN) based on Lee oscillators, IEEE Trans. Neural Netw. 15 (5) (2004) 1228–1243.
[24] A. Yoshida, Y. Osana, Chaotic complex-valued multidirectional associative memory with variable scaling factor, in: Proceedings of the International Conference on Artificial Neural Networks and Machine Learning – ICANN 2011, Springer, 2011, pp. 266–274.
[25] V. Gandhi, G. Prasad, D. Coyle, L. Behera, T. McGinnity, Quantum neural network-based EEG filtering for a brain–computer interface, IEEE Trans. Neural Netw. Learn. Syst. 25 (2) (2014) 278–288.
[26] G.G. Rigatos, S.G. Tzafestas, Quantum learning for neural associative memories, Fuzzy Sets Syst. 157 (13) (2006) 1797–1813.
[27] N. Masuyama, C.K. Loo, Quantum-inspired complex-valued multidirectional associative memory, in: Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN), IEEE, 2015, pp. 1–8.
[28] P. Ekman, An argument for basic emotions, Cognition & Emotion 6 (3–4) (1992) 169–200.
[29] J.A. Russell, A circumplex model of affect, J. Pers. Soc. Psychol. 39 (6) (1980) 1161.
[30] A. Ortony, The Cognitive Structure of Emotions, Cambridge University Press, 1990.
[31] J.A. Gray, The neuropsychology of emotion and personality, Zh. Vyssh. Nerv. Deiat. Im. I.P. Pavlova 37 (6) (1987) 1011–1024.
[32] J.A. Russell, L.F. Barrett, Core affect, prototypical emotional episodes, and other things called emotion: dissecting the elephant, J. Pers. Soc. Psychol. 76 (5) (1999) 805.
[33] M. Johns, B.G. Silverman, How emotions and personality effect the utility of alternative decisions: a terrorist target selection case study, Center Hum. Model. Simul. (2001) 10.
[34] E. André, M. Klesen, P. Gebhard, S. Allen, T. Rist, Integrating models of personality and emotions into lifelike characters, in: Affective Interactions, Springer, 2000, pp. 150–165.
[35] G. Ball, J. Breese, Emotion and personality in a conversational agent, in: Embodied Conversational Agents, MIT Press, Cambridge, MA, USA, 2000, pp. 189–219.
[36] P.T. Costa, R.R. McCrae, Normal personality assessment in clinical practice: the NEO personality inventory, Psychol. Assess. 4 (1) (1992) 5.
[37] A. Mehrabian, Analysis of the big-five personality factors in terms of the PAD temperament model, Aust. J. Psychol. 48 (2) (1996) 86–92.
[38] L.R. Goldberg, The development of markers for the big-five factor structure, Psychol. Assess. 4 (1) (1992) 26.


[39] M. Schneider, J. Adamy, Towards modelling affect and emotions in autonomous agents with recurrent fuzzy systems, in: Proceedings of the 2014 IEEE International Conference on Systems, Man and Cybernetics (SMC), IEEE, 2014, pp. 31–38.
[40] M.J. Han, C.H. Lin, K.T. Song, Robotic emotional expression generation based on mood transition and personality model, IEEE Trans. Cybern. 43 (4) (2013) 1290–1303.
[41] J.A. Russell, M. Bullock, Multidimensional scaling of emotional facial expressions: similarity from preschoolers to adults, J. Pers. Soc. Psychol. 48 (5) (1985) 1290.
[42] S.M. Smith, R.E. Petty, Personality moderators of mood congruency effects on cognition: the role of self-esteem and negative mood regulation, J. Pers. Soc. Psychol. 68 (6) (1995) 1092.
[43] M. Naveh-Benjamin, G.B. Maddox, P. Jones, S. Old, A. Kilb, The effects of emotional arousal and gender on the associative memory deficit of older adults, Mem. Cognit. 40 (4) (2012) 551–566.
[44] A.M. Rosenthal-von der Pütten, N.C. Krämer, L. Hoffmann, S. Sobieraj, S.C. Eimler, An experimental study on emotional reactions towards a robot, Int. J. Soc. Robotics 5 (1) (2013) 17–34.
[45] L.-F. Chen, Z.-T. Liu, M. Wu, M. Ding, F.-Y. Dong, K. Hirota, Emotion-age-gender-nationality based intention understanding in human–robot interaction using two-layer fuzzy support vector regression, Int. J. Soc. Robotics (2015) 1–21.
[46] A. Hiolle, L. Canamero, M. Davila Ross, K.A. Bard, Eliciting caregiving behavior in dyadic human-robot attachment-like interactions, ACM Trans. Interact. Intell. Syst. (TiiS) 2 (1) (2012) 3.
[47] T. Rumbell, J. Barnden, S. Denham, T. Wennekers, Emotions in autonomous agents: comparative analysis of mechanisms and functions, Autonomous Agents Multi-Agent Syst. 25 (1) (2012) 1–45.
[48] K. Itoh, H. Miwa, H. Takanobu, A. Takanishi, Application of neural network to humanoid robots—development of co-associative memory model, Neural Netw. 18 (5) (2005) 666–673.
[49] W. Yi, W. Zhiliang, W. Wei, Research on associative memory models of emotional robots, Adv. Mech. Eng. 6 (2014) 208153.
[50] R. Valverde Ibanez, M.U. Keysermann, P. Vargas, Emotional memories in autonomous robots, in: Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, IEEE, 2014, pp. 405–410.
[51] N. Masuyama, M. Islam, M. Seera, C. Loo, Application of emotion affected associative memory based on mood congruency effects for a humanoid, Neural Comput. Appl. (2015) 1–16.
[52] G.H. Bower, Mood and memory, Am. Psychol. 36 (2) (1981) 129.
[53] P. Lewis, H. Critchley, A. Smith, R. Dolan, Brain mechanisms for mood congruent memory facilitation, Neuroimage 25 (4) (2005) 1214–1223.
[54] B.H. Pierce, E.A. Kensinger, Effects of emotion on associative recognition: valence and retention interval matter, Emotion 11 (1) (2011) 139.
[55] B.D. Murray, E.A. Kensinger, Age-related changes in associative memory for emotional and nonemotional integrative representations, Psychol. Aging 28 (4) (2013) 969.
[56] G. Egidi, H.C. Nusbaum, Emotional language processing: how mood affects integration processes during discourse comprehension, Brain Lang. 122 (3) (2012) 199–210.
[57] N. Ravaja, J. Kätsyri, Suboptimal facial expression primes in textual media messages: evidence for the affective congruency effect, Comput. Human Behav. 40 (2014) 64–77.
[58] A.M. Collins, E.F. Loftus, A spreading-activation theory of semantic processing, Psychol. Rev. 82 (6) (1975) 407.
[59] J. Rottenberg, J.J. Gross, When emotion goes wrong: realizing the promise of affective science, Clinical Psychol.: Sci. Pract. 10 (2) (2003) 227–232.
[60] E. Başar, Brain Function and Oscillations: Volume II: Integrative Brain Function. Neurophysiology and Cognitive Processes, Springer Science & Business Media, 2012.
[61] C.I. Hooker, L. Bruce, M. Fisher, S.C. Verosky, A. Miyakawa, S. Vinogradov, Neural activity during emotion recognition after combined cognitive plus social cognitive training in schizophrenia, Schizophrenia Res. 139 (1) (2012) 53–59.
[62] A. Mehrabian, Basic dimensions for a general psychological theory: Implications for personality, social, environmental, and developmental studies, Oelgeschlager, Gunn & Hain, Cambridge, MA, 1980.
[63] R.R. McCrae, P.T. Costa, Validation of the five-factor model of personality across instruments and observers, J. Pers. Soc. Psychol. 52 (1) (1987) 81.
[64] J.A. Russell, Core affect and the psychological construction of emotion, Psychol. Rev. 110 (1) (2003) 145.
[65] Z. Esmaileyan, H. Marvi, Recognition of emotion in speech using variogram based features, Malays. J. Comput. Sci. 27 (3) (2014).
[66] M. Nagamachi, A.M. Lokman, Innovations of Kansei Engineering, CRC Press, 2016.
[67] P. Ekman, Basic emotions, in: Handbook of Cognition and Emotion, 1999, pp. 45–60.
[68] N. Kubota, Y. Toda, Multimodal communication for human-friendly robot partners in informationally structured space, IEEE Trans. Syst. Man Cybern. Part C: Appl. Rev. 42 (6) (2012) 1142–1151.
[69] A. Yorita, N. Kubota, Cognitive development in partner robots for information support to elderly people, IEEE Trans. Autonomous Ment. Dev. 3 (1) (2011) 64–73.
[70] D. Bertsekas, Dynamic Programming: Deterministic and Stochastic Models, Prentice-Hall, 1987.
[71] J.A. Anderson, Neurocomputing, vol. 2, MIT Press, 1993.



[72] T. Back, D.B. Fogel, Z. Michalewicz, Handbook of Evolutionary Computation, IOP Publishing Ltd., 1997.
[73] N. Kubota, T. Fukuda, Ecological model of virus-evolutionary genetic algorithm, Fundamenta Informaticae 37 (1, 2) (1999) 51–70.
[74] W. Maass, Networks of spiking neurons: the third generation of neural network models, Neural Netw. 10 (9) (1997) 1659–1671.
[75] D. Cristinacce, T. Cootes, Automatic feature localisation with constrained local models, Pattern Recognit. 41 (10) (2008) 3054–3067.
[76] M.N. Islam, C.K. Loo, Geometric feature-based facial emotion recognition using two-stage fuzzy reasoning model, in: Neural Information Processing, Springer, 2014, pp. 344–351.


[77] J. Nuevo, L.M. Bergasa, P. Jiménez, RSMAT: robust simultaneous modeling and tracking, Pattern Recognit. Lett. 31 (16) (2010) 2455–2463.
[78] F. Dawood, C.K. Loo, W.H. Chin, Incremental on-line learning of human motion using Gaussian adaptive resonance hidden Markov model, in: Proceedings of the 2013 International Joint Conference on Neural Networks (IJCNN), IEEE, 2013, pp. 1–7.
[79] A. Lee, T. Kawahara, Recent development of open-source speech recognition engine Julius, in: Proceedings: APSIPA ASC, 2009, pp. 131–137.
[80] E. Eidinger, R. Enbar, T. Hassner, Age and gender estimation of unfiltered faces, IEEE Trans. Inf. Forensics Secur. 9 (12) (2014) 2170–2179.
