
Page 1: Affective Computing and Intelligent Interaction (ACII 2011)

Expressive Gesture Model for a Humanoid Robot

Doctoral Consortium, ACII 2011, Memphis, USA

Le Quoc Anh, Catherine Pelachaud
CNRS, LTCI, Telecom-ParisTech, France

Page 2: Affective Computing and Intelligent Interaction (ACII 2011)


Objectives

Generate communicative gestures for the Nao robot
• Integrated within an existing platform (GRETA)
• Scripts with a symbolic language
• Synchronization (gestures and speech)
• Expressivity of gestures

GVLEX project (Gesture & Voice for Expressive Reading)
• The robot tells a story expressively.
• Partners: LIMSI (linguistic aspects), Aldebaran (robotics), Acapela (speech synthesis), Telecom ParisTech (expressive gestures)


Page 3: Affective Computing and Intelligent Interaction (ACII 2011)


State of the art

Several recent initiatives: Salem et al. (2010), Holroyd et al. (2011), Ng-Thow-Hing et al. (2010), Shi et al. (2010), Nozawa et al. (2006).
• Motion scripts with MURML, BML, MPML-HR, etc.
• Adapt gestures to speech (for synchronization)
• Mechanism for receiving and processing feedback from the robot
• Gesture animation: no expressivity

Our system: focus on the expressivity of gestures and on their synchronization with speech


Page 4: Affective Computing and Intelligent Interaction (ACII 2011)


Our methodology

Gestures are described with a symbolic language (BML)

Gestural expressivity (amplitude, fluidity, power, repetition, speed, stiffness,…)

Elaboration of gestures from a storytelling video corpus (Martin et al., 2009)

Execution of the animation by translating into joint values


Page 5: Affective Computing and Intelligent Interaction (ACII 2011)


Problem and Solution

Using a virtual agent framework to control a physical robot raises several problems:
• Different degrees of freedom
• Limited movement space and speed

Solution:
• Use the same representation language
  - same algorithm for selecting and planning gestures
  - different algorithm for creating the animation

• Elaborate one gesture repository for the robot and another one for the Greta agent

• Gesture movement space and velocity specification
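
A minimal sketch of the last point, under invented bounds: a symbolic wrist target is clamped into the robot's reachable space, and a speed limit gives the shortest admissible phase duration. None of the numbers or names below come from the system; they are placeholders for illustration.

    from dataclasses import dataclass

    @dataclass
    class WristTarget:
        x: float  # lateral offset from torso (m)
        y: float  # height (m)
        z: float  # distance in front of torso (m)

    # Assumed reachable-space bounds and speed limit for the robot's wrist.
    BOUNDS = {"x": (-0.25, 0.25), "y": (0.0, 0.35), "z": (0.05, 0.30)}
    MAX_SPEED = 0.6  # m/s (hypothetical)

    def clamp(v: float, lo: float, hi: float) -> float:
        return max(lo, min(v, hi))

    def adapt(target: WristTarget) -> WristTarget:
        """Project a gesture target into the robot's reachable space."""
        return WristTarget(
            clamp(target.x, *BOUNDS["x"]),
            clamp(target.y, *BOUNDS["y"]),
            clamp(target.z, *BOUNDS["z"]),
        )

    def min_phase_duration(a: WristTarget, b: WristTarget) -> float:
        """Shortest phase duration the robot can achieve between two targets."""
        dist = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2 + (b.z - a.z) ** 2) ** 0.5
        return dist / MAX_SPEED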


Page 6: Affective Computing and Intelligent Interaction (ACII 2011)


System Overview

Behavior Planning: selects and plans gestures.
Behavior Realizer: schedules and creates gesture animations.

[Architecture diagram: FML/BML input → Behavior Planning (Lexicons: one for Greta, one for Nao) → Behavior Realizer (symbolic description of gesture phases, temporal information) → Animation Computation (joint values instantiation, interpolation module) → Animation Production (WAV file for speech).]

Page 7: Affective Computing and Intelligent Interaction (ACII 2011)


Base of gesture elaboration in lexicons

• Annotation of gestures from a storytelling video corpus (Martin et al., 2009)
• From gesture annotation to entries in the Nao lexicon
• BML description of each gesture:


Gesture Elaboration
Gesture → Phases → Hands (wrist position, palm orientation, shape, ...)
Only stroke phases are specified. Other phases will be generated automatically by the system.

Page 8: Affective Computing and Intelligent Interaction (ACII 2011)


Synchronization of gestures with speech

The stroke phase coincides with or precedes the emphasized words of speech (McNeill, 1992)

Gesture stroke phase timing specified by synch points


Page 9: Affective Computing and Intelligent Interaction (ACII 2011)


Synchronization of gestures with speech

Algorithm
• Compute the preparation phase
• Delete the gesture if there is not enough time (strokeEnd(i-1) > strokeStart(i) + duration)
• Add a hold phase to fit the gesture's planned duration
• Co-articulation between consecutive gestures:
  - If there is enough time, insert a retraction phase (i.e., go back to the rest position)
  - Otherwise, go from the end of the stroke to the preparation phase of the next gesture

[Timeline diagram: two consecutive gestures, each with start/end times and stroke-start (S-start) / stroke-end (S-end) markers.]
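
These rules can be expressed compactly, as in the sketch below. It is a minimal illustration assuming a simple gesture record with absolute times in seconds; the field names are invented, and this is not the GRETA/GVLEX code.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Gesture:
        stroke_start: float         # fixed by speech sync points (s)
        stroke_end: float
        prep_duration: float        # time to reach the stroke-start pose
        retraction_duration: float  # time to return to the rest pose
        phases: List[tuple] = field(default_factory=list)

    def schedule(gestures: List[Gesture]) -> List[Gesture]:
        planned: List[Gesture] = []
        for g in gestures:
            prep_start = g.stroke_start - g.prep_duration
            prev = planned[-1] if planned else None
            # Delete the gesture if the previous stroke leaves no time to prepare.
            if prev is not None and prev.stroke_end > prep_start:
                continue
            if prev is not None:
                gap = prep_start - prev.stroke_end
                if gap >= prev.retraction_duration:
                    # Enough time: retract to rest before the next preparation.
                    prev.phases.append(("retraction", prev.stroke_end))
                elif gap > 0:
                    # Co-articulation: hold the stroke-end pose, then move
                    # directly into the next gesture's preparation.
                    prev.phases.append(("hold", prev.stroke_end))
            g.phases = [("preparation", prep_start), ("stroke", g.stroke_start)]
            planned.append(g)
        return planned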

Page 10: Affective Computing and Intelligent Interaction (ACII 2011)


Gesture expressivity

Spatial Extent (SPC): Amplitude of movement

Temporal Extent (TMP): Speed of movement

Power (PWR): Acceleration of movement

Fluidity (FLD): Smoothness and Continuity

Repetition (REP): Number of stroke repetitions

Stiffness (STF): Tension/Flexibility
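
To make the parameters concrete, here is a hedged sketch of how such values in [-1, 1] could modulate a planned stroke before joint values are computed; the gains and field names are invented for illustration, not taken from the model.

    def apply_expressivity(stroke: dict, spc=0.0, tmp=0.0, pwr=0.0,
                           rep=0, fld=0.0) -> dict:
        """stroke: {'amplitude': m, 'duration': s, 'peak_accel': m/s^2}."""
        out = dict(stroke)
        out["amplitude"]  = stroke["amplitude"]  * (1.0 + 0.5 * spc)  # SPC
        out["duration"]   = stroke["duration"]   * (1.0 - 0.3 * tmp)  # TMP
        out["peak_accel"] = stroke["peak_accel"] * (1.0 + 0.5 * pwr)  # PWR
        out["repetitions"] = 1 + max(0, int(rep))                     # REP
        # FLD/STF would instead shape the interpolation (smoother or
        # stiffer velocity profiles) rather than scale a single number.
        out["smoothness"] = fld                                       # FLD
        return out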


Page 12: Affective Computing and Intelligent Interaction (ACII 2011)


Animation Computation & Execution

Schedule and plan gesture phases
Compute expressivity parameters
Translate symbolic descriptions into joint values
Execute the animation:
• Send timed key-positions to the robot using available APIs
• Animation is obtained by interpolating between joint values with the robot's built-in proprietary procedures.
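
A minimal sketch of this execution step with the NAOqi Python SDK: ALMotion.angleInterpolation takes per-joint key angles and the absolute times at which each key must be reached, and interpolates in between. The joint values, timings, and robot address below are placeholders.

    from naoqi import ALProxy

    motion = ALProxy("ALMotion", "nao.local", 9559)  # address is an assumption

    names = ["RShoulderPitch", "RElbowRoll"]
    # One list of key angles (radians) per joint, e.g. preparation ->
    # stroke -> retraction poses derived from the symbolic description...
    angles = [[1.2, 0.6, 1.4], [0.7, 1.3, 0.4]]
    # ...and the time (seconds) at which each key pose must be reached,
    # matching the phase boundaries scheduled by the Behavior Realizer.
    times = [[0.5, 1.0, 1.8], [0.5, 1.0, 1.8]]

    # The robot's built-in controller interpolates between successive keys.
    motion.angleInterpolation(names, angles, times, True)

Because interpolation runs in the robot's proprietary controller, the model only guarantees the key poses and their timing, not the trajectory between them.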


Page 13: Affective Computing and Intelligent Interaction (ACII 2011)


Example


animation[1] <phase="preparation", start-time="Start", end-time="Ready", description of stroke-start position>
animation[2] <phase="stroke", start-time="Stroke-start", end-time="Stroke-end", description of stroke-end position>
animation[3] <phase="retraction", start-time="Relax", end-time="End", description of rest position>

<bml>
  <speech id="s1" start="0.0">
    \vce=speaker=Antoine\ \spd=180\ Et le troisième dit tristement:
    \vce=speaker=AntoineSad\ \spd=90\ \pau=200\ <tm id="tm1"/>J'ai très faim!
  </speech>
  <!-- "And the third one said sadly: I am very hungry!" -->
  <gesture id="beat_hungry" start="s1:tm1" end="start+1.5" stroke="0.5">
    <FLD.value>0</FLD.value>
    <OAC.value>0</OAC.value>
    <PWR.value>-1.0</PWR.value>
    <REP.value>0</REP.value>
    <SPC.value>-0.3</SPC.value>
    <TMP.value>-0.2</TMP.value>
  </gesture>
</bml>

<gesture id="beat_hungry" min_time="1.0">
  <phase type="STROKE-START">
    <hand side="BOTH">
      <verticalLocation>YCC</verticalLocation>
      <horizontalLocation>XCenter</horizontalLocation>
      <distanceLocation>Zmiddle</distanceLocation>
      <handShape>OPENHAND</handShape>
      <palmOrientation>INWARD</palmOrientation>
    </hand>
  </phase>
  <phase type="STROKE-END">
    <hand side="BOTH">
      <verticalLocation>YLowerEP</verticalLocation>
      <horizontalLocation>XCenter</horizontalLocation>
      <distanceLocation>ZNear</distanceLocation>
      <handShape>OPEN</handShape>
      <palmOrientation>INWARD</palmOrientation>
    </hand>
  </phase>
</gesture>

Page 14: Affective Computing and Intelligent Interaction (ACII 2011)


Example


Page 15: Affective Computing and Intelligent Interaction (ACII 2011)


Conclusion and future work

Conclusion
• A gesture model has been designed and implemented for Nao, taking into account the physical constraints of the robot.
• Common platform for both the virtual agent and the robot
• Expressivity model
• Allows us to create gestures with different affective states and personal styles

Future work
• Build two repositories of gestures, one for Greta and another one for NAO
• Improve expressivity and synchronization of gestures with speech
• Receive and process feedback from the robot
• Validate the model through perceptual evaluations


Page 16: Affective Computing and Intelligent Interaction (ACII 2011)


Acknowledgment


This work has been funded by the ANR GVLEX project. It is supported by members of the TSI laboratory at Telecom-ParisTech.

Page 17: Affective Computing and Intelligent Interaction (ACII 2011)


Gesture Specification
Gesture → Phases → Hands (wrist position, palm orientation, shape, ...)
Only stroke phases are specified. Other phases will be generated automatically by the system.


<gesture id="greeting" category="ICONIC" min_time="1.0" hand="RIGHT">
  <phase type="STROKE-START" twohand="ASSYMMETRIC">
    <hand side="RIGHT">
      <vertical_location>YUpperPeriphery</vertical_location>
      <horizontal_location>XPeriphery</horizontal_location>
      <location_distance>ZNear</location_distance>
      <hand_shape>OPEN</hand_shape>
      <palm_orientation>AWAY</palm_orientation>
    </hand>
  </phase>
  <phase type="STROKE-END" twohand="ASSYMMETRIC">
    <hand side="RIGHT">
      <vertical_location>YUpperPeriphery</vertical_location>
      <horizontal_location>XExtremePeriphery</horizontal_location>
      <location_distance>ZNear</location_distance>
      <hand_shape>OPEN</hand_shape>
      <palm_orientation>AWAY</palm_orientation>
    </hand>
  </phase>
</gesture>

An example of a greeting gesture.
