Expressive Gestures for NAO
NAO TechDay, 13/06/2012, Paris
Le Quoc Anh - Catherine Pelachaud
CNRS, LTCI, Telecom-ParisTech, France
Objectives
Generate communicative gestures for the Nao robot
• Integrated within an existing platform for virtual agents
• Nonverbal behaviors described symbolically
• Synchronization of gestures and speech
• Expressivity of gestures
GVLEX project (Gesture & Voice for Expressive Reading)
• The robot tells a story expressively.
• Partners: LIMSI (linguistic aspects), Aldebaran (robotics), Acapela (speech synthesis), Telecom ParisTech (expressive gestures)
State of the art
Several recent initiatives:
• Salem and Kopp (2012): ASIMO robot, the MAX virtual agent framework, gesture description with MURML
• Holroyd and Rich (2011): Melvin robot, motion scripts with BML, simple gestures, feedback to synchronize gestures and speech
• Ng-Thow-Hing et al. (2010): ASIMO robot, gesture selection, synchronization between gestures and speech
• Nozawa et al. (2006): motion scripts with MPML-HP, HOAP-1 robot
Our system: focus on the expressivity of gestures and their synchronization with speech, using a common platform for Greta and Nao.
Steps
1. Build a library of gestures from a corpus of storytelling videos: the gesture shapes need not be identical across human, virtual agent, and robot, but they must convey the same meaning.
2. Use the GRETA system to generate gestures for Nao, following the SAIBA framework (sketched below):
- Two representation languages: FML (Function Markup Language) and BML (Behavior Markup Language)
- Three separate modules: plan communicative intents, select and plan gestures, and realize gestures
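As a rough illustration, the three SAIBA modules chain together as follows (a minimal Python sketch; the function names are ours, not the actual GRETA API):

    # Minimal sketch of the SAIBA pipeline; function names are hypothetical.
    def plan_intents(text):
        """Intent Planner: annotate the text with communicative intentions (FML)."""
        return "<fml>...</fml>"  # placeholder FML document

    def plan_behaviors(fml):
        """Behavior Planner: select gestures for each intention (BML with sync points)."""
        return "<bml>...</bml>"  # placeholder BML document

    def realize_behaviors(bml):
        """Behavior Realizer: turn BML into timed keyframes for Greta or Nao."""
        return [("keyframe", 0.0), ("keyframe", 0.5)]  # placeholder keyframes

    keyframes = realize_behaviors(plan_behaviors(plan_intents("Il était une fois...")))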
[Diagram: the GRETA system as a SAIBA pipeline. Text → Intent Planning → FML → Behavior Planning → BML → Behavior Realizer, with a BML Behavior Realizer per embodiment.]
Global diagram
[Diagram: from FML and BML to keyframes. Gesture selection from a lexicon, planning of gesture durations, synchronization with the speech, and modification of gesture expressivity.]
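The keyframe stage of this diagram can be pictured as follows (an illustrative Python sketch under our own naming; the real lexicon and planner belong to GRETA):

    # Illustrative sketch of the BML-to-keyframes stage; names are assumptions.
    LEXICON = {
        "beat_hungry": {"stroke_start": "YCC/XCenter/Zmiddle",
                        "stroke_end":   "YLowerEP/XCenter/ZNear"},
    }

    def plan_gesture(gesture_id, stroke_time):
        """Select the gesture prototype from the lexicon and plan phase
        durations around the speech time marker."""
        proto = LEXICON[gesture_id]
        prep_duration = 0.4  # assumed preparation length in seconds
        return [
            ("preparation", stroke_time - prep_duration, proto["stroke_start"]),
            ("stroke",      stroke_time,                 proto["stroke_end"]),
            ("retraction",  stroke_time + 0.6,           "rest position"),
        ]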
Gesture Animation Planning
Synchronization with speech
• The stroke phase coincides with or precedes the emphasized words of speech (McNeill, 1992)
• Gesture stroke phase timing is specified by sync points

Expressivity of gestures
• The same gesture prototype, but different animations
• Parameters (sketched below):
  - Spatial Extent (SPC): amplitude of movement
  - Temporal Extent (TMP): speed of movement
  - Power (PWR): acceleration of movement
  - Repetition (REP): number of stroke repetitions
  - Fluidity (FLD): smoothness and continuity
  - Stiffness (STF): tension/flexibility
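For instance, SPC, TMP, and REP could modulate a planned stroke roughly as follows (values in [-1, 1]; this mapping is our illustration, not the exact GRETA formulation):

    # Sketch of expressivity modulation; the scaling factors are illustrative.
    def apply_expressivity(positions, duration, spc=0.0, tmp=0.0, rep=0):
        """Scale spatial amplitude (SPC) and speed (TMP); repeat the stroke (REP)."""
        scaled_pos = [p * (1.0 + 0.5 * spc) for p in positions]  # wider or narrower
        scaled_dur = duration * (1.0 - 0.5 * tmp)                # faster or slower
        return [(scaled_pos, scaled_dur)] * (1 + max(0, rep))    # stroke repetitions

    # A small, slightly slow beat, matching the BML example on the next slide
    print(apply_expressivity([0.2, 0.5, 0.1], 0.5, spc=-0.3, tmp=-0.2))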
Example
keyframe[1] <phase="preparation", start-time="Start", end-time="Ready", description of stroke-start position>
keyframe[2] <phase="stroke", start-time="Stroke-start", end-time="Stroke-end", description of stroke-end position>
keyframe[3] <phase="retraction", start-time="Relax", end-time="End", description of rest position>
    <bml>
      <speech id="s1" start="0.0">
        \vce=speaker=Antoine\ \spd=180\
        Et le troisième dit tristement:
        \vce=speaker=AntoineSad\ \spd=90\ \pau=200\
        <tm id="tm1"/> J'ai très faim!
        <!-- "And the third one said sadly: I am very hungry!" -->
      </speech>
      <gesture id="beat_hungry" start="s1:tm1" end="start+1.5" stroke="0.5">
        <FLD.value>0</FLD.value>
        <OAC.value>0</OAC.value>
        <PWR.value>-1.0</PWR.value>
        <REP.value>0</REP.value>
        <SPC.value>-0.3</SPC.value>
        <TMP.value>-0.2</TMP.value>
      </gesture>
    </bml>
    <gesture id="beat_hungry" min_time="1.0">
      <phase type="STROKE-START">
        <hand side="BOTH">
          <verticalLocation>YCC</verticalLocation>
          <horizontalLocation>XCenter</horizontalLocation>
          <distanceLocation>Zmiddle</distanceLocation>
          <handShape>OPENHAND</handShape>
          <palmOrientation>INWARD</palmOrientation>
        </hand>
      </phase>
      <phase type="STROKE-END">
        <hand side="BOTH">
          <verticalLocation>YLowerEP</verticalLocation>
          <horizontalLocation>XCenter</horizontalLocation>
          <distanceLocation>ZNear</distanceLocation>
          <handShape>OPEN</handShape>
          <palmOrientation>INWARD</palmOrientation>
        </hand>
      </phase>
    </gesture>
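Putting the two documents together, the realizer resolves the BML sync attributes into absolute keyframe times, roughly as follows (a hand-worked sketch; tm1 = 2.0 s is an assumed output of the speech synthesizer, and the reading of the stroke attribute as an offset from start is our interpretation):

    # Resolving the sync points of "beat_hungry"; tm1 is an assumed value.
    tm1    = 2.0          # time of <tm id="tm1"/> in the synthesized speech
    start  = tm1          # start="s1:tm1"
    stroke = start + 0.5  # stroke="0.5", taken relative to start
    end    = start + 1.5  # end="start+1.5"

    keyframes = [
        ("preparation", start,  stroke, "stroke-start position"),
        ("stroke",      stroke, stroke, "stroke-end position"),
        ("retraction",  stroke, end,    "rest position"),
    ]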
Compilation
BML Realizer
API.AngleInterpolation(joints, values, times)
Send timed key-positions to the robot using the available APIs.
The animation is obtained by interpolating between joint values with the robot's built-in proprietary procedures.
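A minimal sketch of such a call through NAOqi's Python SDK (the robot address, joints, and values here are placeholders):

    # Sending timed key-positions to Nao via ALMotion.angleInterpolation.
    from naoqi import ALProxy

    motion = ALProxy("ALMotion", "nao.local", 9559)  # placeholder address

    names      = ["RShoulderPitch", "RElbowRoll"]    # joints used by the gesture
    angleLists = [[1.0, 0.3, 1.4], [0.3, 1.2, 0.4]]  # target angles in radians
    timeLists  = [[0.5, 1.0, 1.8], [0.5, 1.0, 1.8]]  # keyframe times in seconds

    # The robot's built-in controller interpolates between these key-positions.
    motion.angleInterpolation(names, angleLists, timeLists, True)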
Demo « Trois petits morceaux de nuit »
Conclusion
• A gesture model has been designed and implemented for Nao, taking the physical constraints of the robot into account
• Common platform for both the virtual agent and the robot
• Expressivity model
Future work
• Create gestures with different emotional colours and personal styles
• Validate the model through perceptual evaluations
Acknowledgment
This work has been funded by the ANR GVLEX project and supported by members of the TSI laboratory, Telecom-ParisTech.