Learning with Conversational Agents
Art Graesser, Psychology, Computer Science, & the Institute for Intelligent Systems

Transcript of Slide 1

Page 1: Slide 1

Art Graesser, Psychology, Computer Science, & the Institute for Intelligent Systems

Learning with Conversational Agents

Page 2: Slide 1

Learning, Language, and Communication Technologies

• Tutoring

• Language & discourse

• Robotic androids

Page 3: Slide 1

Overview

• Why conversational agents?

• AutoTutor
  – Learning
  – Motivation
  – Emotions

• Other learning environments with agents

Page 4: Slide 1

Exploring a Sea of Animated Conversational Agents

[Collage of animated conversational agents: SI Agent, BEAT, Leonardo, PKD Android, Casey, iSTART, TLTS, Spark, MRE, AutoTutor, Adele, STEVE, Carmen, iMAP, Laura]

Page 5: Slide 1

Why conversational agents?

• Social presence

• The universal interface

• Simulate and model social and conversational interactions

• Precise control over the language and discourse (unlike humans)

• Understand psychological and pedagogical mechanisms by building them

Page 6: Slide 1

Agent Configurations

• Dialogs
  – Tutor agent
  – Teachable agent

• Trialogs
  – Vicarious learning from an agent dyad
  – Tutor agent, student agent, and human

• Quadralogs
  – Human interacting with a student agent while observing an agent dyad (sketched below)
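To make these configurations concrete, here is a minimal, hypothetical data sketch; the participant names, roles, and class layout are illustrative assumptions, not how any of these systems is actually implemented.

```python
# Hypothetical sketch of dialog, trialog, and quadralog participant structures.
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    kind: str   # "human" or "agent"
    role: str   # "tutor" or "student"

# Dialog: one human learner with a tutor agent (or a teachable agent).
dialog = [Participant("learner", "human", "student"),
          Participant("tutor", "agent", "tutor")]

# Trialog: tutor agent + student agent + human; the human can learn
# vicariously from the agent dyad or join the conversation.
trialog = dialog + [Participant("peer", "agent", "student")]

# Quadralog: the human interacts with a student agent while also
# observing a separate tutor-agent and student-agent dyad.
quadralog = [Participant("learner", "human", "student"),
             Participant("peer", "agent", "student"),
             Participant("observed tutor", "agent", "tutor"),
             Participant("observed student", "agent", "student")]

for label, group in [("dialog", dialog), ("trialog", trialog), ("quadralog", quadralog)]:
    print(label, [p.name for p in group])
```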

Page 7: Slide 1

What is AutoTutor?

• Learn by conversation in natural language

Team: Art Graesser (PI), Zhiqiang Cai, Don Franceschetti, Stan Franklin, Barry Gholson, Xiangen Hu, David Lin, Max Louwerse, Andrew Olney, Natalie Person, Vasile Rus

D'Mello, S.K., Picard, R., & Graesser, A.C. (2007). Toward an affect-sensitive AutoTutor. IEEE Intelligent Systems, 22, 53-61.

Graesser, A.C., Jackson, G.T., & McDaniel, B. (2007). AutoTutor holds conversations with learners that are responsive to their cognitive and emotional states. Educational Technology, 47, 19-22.

VanLehn, K., Graesser, A.C., Jackson, G.T., Jordan, P., Olney, A., & Rose, C.P. (2007). When are tutorial dialogues more effective than reading? Cognitive Science, 31, 3-62.

Page 8: Slide 1

AutoTutor

[Interface screenshot labels: talking head with gestures and synthesized speech; presentation of the question/problem; dialog history with tutor turns and student turns; student input (answers, comments, questions)]

Page 9: Slide 1

Expectations and misconceptions in Sun & Earth problem

EXPECTATIONS
• The sun exerts a gravitational force on the earth.
• The earth exerts a gravitational force on the sun.
• The two forces are a third-law pair.
• The magnitudes of the two forces are the same.

MISCONCEPTIONS
• Only the larger object exerts a force.
• The force of earth on sun is less than that of the sun on earth.

Page 10: Slide 1

When a car without headrests on the seats is struck from behind, the passengers often suffer neck injuries. Why do passengers get neck injuries in this situation?

[Screenshot: question, talking head, interactive simulation, parameter controls, and a "describe what happens" prompt]

Page 11: Slide 1

What does AutoTutor do?
• Asks questions and presents problems
• Evaluates the meaning of the learner's answers
• Gives feedback on answers
• Displays facial expressions with emotions
• Nods and gestures
• Gives hints
• Prompts for specific information
• Adds information that is missed
• Corrects bugs and misconceptions
• Answers student questions
• Guides the learner in interactive 3-D simulations

Page 12: Slide 1
Page 13: Slide 1

Expectation- and Misconception-Tailored Dialog
(pervasive in AutoTutor and in human tutors)

• Tutor asks a question that requires explanatory reasoning
• Student answers with fragments of information, distributed over multiple turns
• Tutor analyzes the fragments of the explanation (see the matching sketch below)
  – Compares them to a list of expected good idea units
  – Compares them to a list of expected errors and misconceptions
• Tutor posts goals and performs dialog acts to improve the explanation
  – Fills in missing expected good idea units (one at a time)
  – Corrects expected errors and misconceptions (immediately)
• Tutor handles periodic sub-dialogues
  – Student questions
  – Student meta-communicative acts (e.g., "What did you say?")
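The analysis step can be pictured with a small sketch. Published descriptions of AutoTutor compare the learner's accumulated answer to each expectation and misconception with latent semantic analysis; the sketch below substitutes a plain bag-of-words cosine, and the thresholds, example sentences, and function names are illustrative assumptions rather than the system's actual code.

```python
# Minimal sketch of expectation/misconception matching, with a bag-of-words
# cosine standing in for the semantic match; thresholds and strings are
# illustrative assumptions.
import math
from collections import Counter

def cosine(a: str, b: str) -> float:
    """Cosine similarity between two texts using word-count vectors."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = math.sqrt(sum(v * v for v in va.values())) * math.sqrt(sum(v * v for v in vb.values()))
    return dot / norm if norm else 0.0

EXPECTATIONS = [
    "the sun exerts a gravitational force on the earth",
    "the earth exerts a gravitational force on the sun",
    "the two forces are a third law pair",
    "the magnitudes of the two forces are the same",
]
MISCONCEPTIONS = ["only the larger object exerts a force"]

def analyze(student_turns, threshold=0.6):
    """Compare the accumulated student answer to expectations and misconceptions."""
    answer_so_far = " ".join(student_turns)
    covered = [e for e in EXPECTATIONS if cosine(answer_so_far, e) >= threshold]
    missing = [e for e in EXPECTATIONS if e not in covered]
    flagged = [m for m in MISCONCEPTIONS if cosine(answer_so_far, m) >= threshold]
    return covered, missing, flagged

covered, missing, flagged = analyze(["the sun pulls on the earth with gravity"])
print(len(covered), "covered,", len(missing), "missing,", len(flagged), "misconceptions flagged")
```

In this picture, missing expectations drive the next dialog moves, while flagged misconceptions trigger immediate corrections.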

Page 14: Slide 1

5-Step Tutoring Frame (Graesser & Person, 1994)

1. Tutor asks a difficult question
2. Student gives an initial answer
3. Tutor gives short feedback (positive, negative, or neutral)
4. Tutor and student collaboratively improve the answer via a multi-turn dialogue
5. Tutor confirms the student's understanding (often by asking students if they understand)

Page 15: Slide 1

Dialog Moves During Steps 2-4

• Positive immediate feedback: "Yeah" "Right!"
• Neutral immediate feedback: "Okay" "Uh huh"
• Negative immediate feedback: "No" "Not quite"
• Pump for more information: "What else?"
• Hint: "What about the earth's gravity?"
• Prompt for specific information: "The earth exerts a gravitational force on what?"
• Assert: "The earth exerts a gravitational force on the sun."
• Correct: "The smaller object also exerts a force."
• Repeat: "So, once again, ..."
• Summarize: "So to recap, ..."
• Answer student question
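The AutoTutor papers describe getting a missing expectation covered by escalating through pump, hint, prompt, and assertion moves, with short feedback after each student turn. Below is a minimal sketch under that assumption; the move text, thresholds, and data layout are illustrative, not the system's production rules.

```python
# Minimal sketch of short feedback plus pump -> hint -> prompt -> assertion
# escalation for one missing expectation; strings and thresholds are illustrative.
MOVES = {
    "the earth exerts a gravitational force on the sun": {
        "hint": "What about the earth's gravity?",
        "prompt": "The earth exerts a gravitational force on what?",
        "assertion": "The earth exerts a gravitational force on the sun.",
    }
}

def short_feedback(match_score: float) -> str:
    """Map the quality of the learner's last contribution to short feedback."""
    if match_score >= 0.7:
        return "Right!"
    if match_score >= 0.4:
        return "Okay."
    return "Not quite."

def next_move(expectation: str, attempts: int) -> str:
    """Escalate across attempts until the expectation is finally asserted."""
    if attempts == 0:
        return "What else?"                      # pump for more information
    if attempts == 1:
        return MOVES[expectation]["hint"]
    if attempts == 2:
        return MOVES[expectation]["prompt"]
    return MOVES[expectation]["assertion"]       # give the missing idea unit

expectation = "the earth exerts a gravitational force on the sun"
print(short_feedback(0.3), next_move(expectation, attempts=1))
```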

Page 16: Slide 1

Managing One AutoTutor Turn

• Short feedback on the student’s previous turn

• Advance the dialog by one or more dialog moves that are connected by discourse markers

• End turn with a signal that transfers the floor to the student
  – Question

– Prompting hand gesture

– Head/gaze signal
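Packaging one turn from these pieces can be sketched as follows; the discourse markers, gesture tag, and function are illustrative assumptions about how such a turn could be assembled, not AutoTutor's dialog manager.

```python
# Minimal sketch of assembling one tutor turn: short feedback, dialog moves
# joined by discourse markers, and a closing floor-transfer signal. All strings
# and the gesture tag are illustrative.
import random
from typing import List

DISCOURSE_MARKERS = ["Also,", "Moreover,", "And"]

def compose_turn(feedback: str, moves: List[str],
                 floor_signal: str = "What do you think?") -> dict:
    """Join feedback and dialog moves, ending with a cue that hands over the floor."""
    parts = [feedback]
    for i, move in enumerate(moves):
        parts.append(move if i == 0 else f"{random.choice(DISCOURSE_MARKERS)} {move}")
    # The floor-transfer signal can be a question, a prompting hand gesture,
    # or a head/gaze cue; here it is text plus a gesture tag.
    return {"speech": " ".join(parts + [floor_signal]), "gesture": "prompting_hand"}

turn = compose_turn("Okay.", ["What about the earth's gravity?"])
print(turn["speech"])
```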

Page 17: Slide 1

What about Learning Gains?

Page 18: Slide 1

LEARNING GAINS OF TUTORS (effect sizes)

• .42 Unskilled human tutors (Cohen, Kulik, & Kulik, 1982)
• .80 AutoTutor, 14 experiments (Graesser and colleagues)
• 1.00 Intelligent tutoring systems: PACT (Anderson, Corbett, Aleven, Koedinger); Andes and Atlas (VanLehn); Diagnoser (Hunt, Minstrell); Sherlock (Lesgold)
• (2.00??) Skilled human tutors (Bloom, 1987)

(Effect size here is the standardized mean difference; a computation sketch follows.)
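For readers unfamiliar with the metric, the effect sizes above are standardized mean differences (Cohen's d). A minimal sketch of the computation; the score lists are invented for illustration and are not data from these studies.

```python
# Minimal sketch of Cohen's d, the standardized mean difference used as the
# effect-size metric above; the score lists are invented for illustration.
import statistics

def cohens_d(treatment, control):
    """(mean_T - mean_C) divided by the pooled standard deviation."""
    n_t, n_c = len(treatment), len(control)
    var_t, var_c = statistics.variance(treatment), statistics.variance(control)
    pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

tutored = [0.72, 0.68, 0.80, 0.75, 0.69]   # hypothetical proportion-correct scores
control = [0.60, 0.58, 0.66, 0.62, 0.55]
print(round(cohens_d(tutored, control), 2))
```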

Page 19: Slide 1

Learning Conceptual Physics(VanLehn, Graesser et al., 2007)

Four conditions:

• Read Nothing

• Read Textbook

• AutoTutor

• Human Tutor

[Bar chart: adjusted post-test scores, roughly 0.5 to 0.8, for the four conditions]

Page 20: Slide 1

How does AutoTutor compare to comparison conditions on tests of deep comprehension?

• 0.80 sigma compared to pretest, doing nothing, and reading the textbook
• 0.22 compared to reading relevant textbook segments
• 0.07 compared to reading a succinct script
• 0.13 compared to AutoTutor delivering its speech acts in print
• 0.08 compared to humans in computer-mediated conversation
• -0.20 compared to AutoTutor enhanced with interactive 3-D simulation
• ZONE OF PROXIMAL DEVELOPMENT

Page 21: Slide 1

Force equals mass times acceleration

[Table residue: response patterns (coded 0, 1, X) across Pretest Essay, Pretest MC, Training, Posttest Essay, and Posttest MC, illustrating four profiles: all-or-none learning, variable learning, no learning, and refresher learning]

Page 22: Slide 1

Increasing motivation during learning

• Rude tutor
• Feedback

Page 23: Slide 1

Example Dialog

Tutor: Okay, let's begin. Suppose a runner is running in a straight line at constant speed, and the runner throws a pumpkin straight up. Where will the pumpkin land? Explain why.

Student: The pumpkin will land behind the runner

Tutor: Well, what can you say about the pumpkin's vertical velocity in relation to gravity?

Student: They are equal

Tutor: Alright, I am sure you can get this. The direction in which a force is being applied to the pumpkin is what?

Student: Down

Tutor: Excellent! It's vertical. Let's move on. After release, the only force on the pumpkin is gravity, so the horizontal velocity of the pumpkin is not affected. What can you say about the pumpkin's motion after its release?

Student: It is constantly moving forward

Page 24: Slide 1

Example Dialog with Rude Tutor

Tutor: Okay, let's begin. Suppose a runner is running in a straight line at constant speed, and the runner throws a pumpkin straight up. Where will the pumpkin land? Explain why.

Student: The pumpkin will land behind the runner

Tutor: What a pathetically incorrect answer! So what can you say about the pumpkin's vertical velocity in relation to gravity?

Student: They are equal

Tutor: Ouch, you must have been daydreaming in your high school football physics course. I doubt you can get this next question but let me give it a try. The direction in which a force is being applied to the pumpkin is what?

Student: Down

Tutor: Let me hold my nose and grant you this one. It’s vertical. Let's move on…

Page 25: Slide 1

Content versus Progress Feedback

• Content feedback
  – Qualitative information about the subject matter
  – Local: highlighting important words
  – Global: summary of the ideal answer

• Progress feedback
  – Quantitative assessment of the student's advancement through the material being covered
  – How far the student has come and how much farther they have to go
  – Local: progress on individual expectations
  – Global: points scored in the session

Page 26: Slide 1

Screen shot with additions

Page 27: Slide 1

Feedback Experiment (Tanner Jackson dissertation)

• Pretest
  – 26 conceptual physics questions (counterbalanced)

• Interact with a version of AutoTutor
  – All Feedback
  – Progress Only
  – Content Only
  – No Feedback

• Posttest
  – 26 conceptual physics questions (counterbalanced)

Page 28: Slide 1

Adjusted Posttest Performance

[Bar chart: adjusted posttest scores, plotted on a 0.3 to 0.6 scale, for the All Feedback, Content Only, Progress Only, and No Feedback conditions]

Page 29: Slide 1

The Tutor Held My Interest

[Bar chart: mean ratings for the All Feedback, Content Only, Progress Only, and No Feedback conditions, rated from 1 = Strongly Disagree to 6 = Strongly Agree]

Page 30: Slide 1

Conclusions about Feedback Study

• Content Matters: Content feedback improved learning gains.

• Progress feedback had little to no effect on either learning or motivation.

• Negative relationship between liking and learning
  – Content feedback provides an increase in learning gains, but a decrease in user satisfaction.

Page 31: Slide 1

Emotions during Learning

Page 32: Slide 1

Emotions and Complex Learning

• Emotions (affective states) are inextricably bound to deep learning.

• Negative emotions: confusion, irritation, frustration, anger, rage
• Positive emotions: engagement/flow, delight, excitement, eureka

Page 33: Slide 1

Theories of Emotions relevant to Complex Learning

• Goal theory, with achievements and interruptions (Dweck, 2002; Mandler, 1976; Stein & Hernandez, 2007)
• Cognitive disequilibrium model and confusion (Graesser & Olde, 2003; Piaget, 1952)
• Yerkes-Dodson law on performance, arousal, and task complexity
• Flow theory and consciousness (Csikszentmihalyi, 1990)
• Appraisal theory (Frijda, 1986; Laird, 2007; Scherer, 2006)
• Valence-intensity model, core affect framework, and folklore (Barrett, 2006; Russell, 2003)
• Academic risk theory (Meyer & Turner, 2006)
• Lepper's analysis that promotes challenge, control, confidence, and curiosity
• Kort, Reilly, and Picard model (2001)
• Multiple zones of affect and cognition

Page 34: Slide 1

Emotions During Learning

[Pie chart of observed emotions: flow 28%, confusion 25%, boredom 23%, frustration 16%, surprise 4%, delight 4%]

Page 35: Slide 1
Page 36: Slide 1

Methods of Identifying Emotions

• Expert observation
• Emote-aloud protocols
• After-action judgments by:
  – Self (learner)
  – Peer
  – Expert judges
  – Teachers
• Sensing channels:
  – Dialogue history
  – Speech parameters
  – Facial actions
  – Posture

Page 37: Slide 1

Inter-judge Reliability of Trained Judges

• Our research: kappa .49 (confusion, frustration, ...)
• Litman & Forbes-Riley (2004): kappa .40 (positive, neutral, negative)
• Ang, Dhillon, Krupski, Shriberg, & Stolcke (2002): kappa .47 (frustration, annoyance)
• Shafran, Riley, & Mohri (2003): kappa .32-.42
• Narayanan: kappa .45-.48 (negative emotions)
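The kappa values above are Cohen's kappa, chance-corrected agreement between two judges. A minimal sketch of the computation; the label lists are invented for illustration.

```python
# Minimal sketch of Cohen's kappa (chance-corrected inter-judge agreement);
# the emotion labels below are invented for illustration.
from collections import Counter

def cohens_kappa(judge_a, judge_b):
    """kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(judge_a)
    observed = sum(a == b for a, b in zip(judge_a, judge_b)) / n
    freq_a, freq_b = Counter(judge_a), Counter(judge_b)
    chance = sum(freq_a[label] * freq_b[label] for label in freq_a) / (n * n)
    return (observed - chance) / (1 - chance)

a = ["confusion", "flow", "boredom", "confusion", "frustration", "flow"]
b = ["confusion", "flow", "confusion", "confusion", "frustration", "boredom"]
print(round(cohens_kappa(a, b), 2))
```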

Page 38: Slide 1

Emotion Sensors and Channels

Dialogue

Face

Posture

Speech

Page 39: Slide 1

Classification Accuracy [Overall]

Page 40: Slide 1

Bayesian Diagnosticity

Page 41: Slide 1

Conclusions about Automated Emotion Classification

• Dialogue, face, and posture can classify emotions about as well as trained judges.

• Fusion of channels typically involves redundancy.
  – But there is superadditivity in the case of boredom.

• Different emotions are expressed in different channels.
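One way to picture the fusion point is decision-level fusion: each channel produces its own probability distribution over affective states and the distributions are combined. This is a generic sketch under that assumption, not the classifiers or fusion rule used in the studies above.

```python
# Generic sketch of decision-level fusion: average per-channel probability
# distributions over affective states and pick the most likely one.
EMOTIONS = ["boredom", "confusion", "flow", "frustration"]

def fuse(channel_predictions):
    """Average the channels' distributions and return the top emotion."""
    fused = {e: sum(p[e] for p in channel_predictions) / len(channel_predictions)
             for e in EMOTIONS}
    return max(fused, key=fused.get), fused

# Hypothetical per-channel outputs for one observation window.
dialogue = {"boredom": 0.10, "confusion": 0.55, "flow": 0.25, "frustration": 0.10}
face     = {"boredom": 0.05, "confusion": 0.60, "flow": 0.20, "frustration": 0.15}
posture  = {"boredom": 0.30, "confusion": 0.35, "flow": 0.25, "frustration": 0.10}

label, scores = fuse([dialogue, face, posture])
print(label, round(scores[label], 2))
```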

Page 42: Slide 1

How Might AutoTutor Be Responsive to the Learner's Affective States?

1) If learner is frustrated, then AutoTutor gives a hint.

2) If bored, then some engaging razzle dazzle

3) If flow/absorbed, then lay low

4) If confused, then intelligently manage optimal confusion
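Written out as simple production rules, the policy above might look like the sketch below; the action strings are paraphrases of the slide, and the function is illustrative rather than AutoTutor's actual affect-sensitive module.

```python
# Illustrative if-then rules for responding to a detected affective state,
# paraphrasing the policy listed above.
def respond_to_affect(state: str) -> str:
    """Map a detected affective state to a tutoring response."""
    if state == "frustration":
        return "give a hint"                               # ease the impasse
    if state == "boredom":
        return "inject engaging material"                  # the slide's razzle dazzle
    if state == "flow":
        return "lay low and do not interrupt"
    if state == "confusion":
        return "manage the confusion so it stays productive"
    return "continue the normal dialog"

for s in ["frustration", "boredom", "flow", "confusion"]:
    print(s, "->", respond_to_affect(s))
```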

Page 43: Slide 1

Emotion Sequencing(D’Mello, Taylor, & Graesser)

[Transition diagram among the affective states boredom, flow, confusion, and frustration]
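The sequencing idea can be made concrete by tabulating how often each affective state follows each other state in an observed sequence. The sequence below is invented, and the published D'Mello, Taylor, and Graesser analysis uses its own likelihood metric rather than these raw proportions.

```python
# Minimal sketch of tabulating affect-state transitions from an observed
# sequence; the sequence is invented for illustration.
from collections import defaultdict

def transition_proportions(sequence):
    """Proportion of times each state is immediately followed by each other state."""
    counts = defaultdict(lambda: defaultdict(int))
    for current, nxt in zip(sequence, sequence[1:]):
        counts[current][nxt] += 1
    return {state: {t: c / sum(following.values()) for t, c in following.items()}
            for state, following in counts.items()}

observed = ["flow", "confusion", "frustration", "boredom", "flow",
            "confusion", "flow", "confusion", "frustration", "confusion"]
for state, dist in transition_proportions(observed).items():
    print(state, dist)
```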

Page 44: Slide 1

Beyond

AutoTutor

Page 45: Slide 1

Memphis Agent Environments

[Screenshots of Memphis agent environments: AutoTutor, iSTART, MetaTutor, HURA Advisor, ARIES, iDRIVE, with tutor-agent and student-agent roles labeled]

Page 46: Slide 1

ARIES Agent Trialogs

Page 47: Slide 1
Page 48: Slide 1

ARIES: Acquiring Research Investigative and Evaluative Skills

• Scientific reasoning

• Electronic textbook with periodic multiple choice questions

• Diagnostic questions

• Critique experiments and newspaper clips

• Game context with aliens

• AutoTutor trialogs with tutor and student agents

Page 49: Slide 1
Page 50: Slide 1
Page 51: Slide 1
Page 52: Slide 1

Our Next Challenge

How do we design dialogs, trialogs, and quadralogs of agents in a science of learning that is sensitive to social, motivational, affective, and cognitive (SMAC) mechanisms?

Page 53: Slide 1

?