
ILMDA: An Intelligent Agent that Learns to Deliver

Learning Materials

Leen-Kiat Soh

Department of Computer Science and Engineering

University of Nebraska

[email protected]

What is an Agent?

• An agent is an entity that takes sensory input from its environment, makes autonomous decisions, and carries out actions that affect the environment
  – A thermostat is an agent
  – A calculator is not an agent

[Diagram: the Agent (“think!”) receives sensory input from the Environment and sends output actions back to it.]
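The sense-decide-act loop behind the thermostat example can be sketched as follows. This is a minimal illustration in Python (ILMDA itself is Java-based); the class name, setpoint, and tolerance values are invented for this sketch:

```python
class Thermostat:
    """Minimal agent: senses its environment, decides autonomously, acts on it."""

    def __init__(self, setpoint, tolerance=0.5):
        self.setpoint = setpoint      # desired temperature
        self.tolerance = tolerance    # dead band to avoid rapid switching
        self.heater_on = False        # the agent's current action state

    def step(self, sensed_temp):
        """One sense-decide-act cycle; returns the chosen action."""
        if sensed_temp < self.setpoint - self.tolerance:
            self.heater_on = True     # too cold: turn the heater on
        elif sensed_temp > self.setpoint + self.tolerance:
            self.heater_on = False    # too warm: turn the heater off
        # within tolerance: keep the current action unchanged
        return self.heater_on

agent = Thermostat(setpoint=20.0)
print(agent.step(18.0))  # True (heater on)
print(agent.step(22.0))  # False (heater off)
```

A calculator, by contrast, only maps input to output on demand and never monitors its environment or initiates action, which is why the slide counts it as a non-agent.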

What is an Intelligent Agent?

• An intelligent agent is one that is capable of flexible autonomous actions in order to meet its design objectives, where flexibility means:

– Reactivity: agents are able to perceive their environment, and respond in a timely fashion to changes that occur in order to satisfy their design objectives

– Pro-activeness: agents are able to exhibit goal-directed behavior by taking the initiative in order to satisfy their design objectives

– Social ability: agents are capable of interacting with other agents (and possibly humans) in order to satisfy their design objectives

(Wooldridge and Jennings 1995)

What is an Intelligent Agent? Cont’d

• Machine learning in AI says that agents that learn are intelligent
  – Individual learning
  – Multiagent system learning

• Not all agents are intelligent!

• Learning: the acquisition of new knowledge and motor and cognitive skills, and the incorporation of the acquired knowledge and skills in future system activities, provided that this acquisition and incorporation is conducted by the system itself and leads to an improvement in its performance

Environment?

• Inaccessible vs. accessible
  – Incomplete vs. complete data

• Deterministic vs. non-deterministic
  – Certainty vs. uncertainty

• Episodic vs. non-episodic
  – Each episode is independent or not

• Static vs. dynamic
  – Does the environment remain unchanged except by the agent’s own actions?

• Discrete vs. continuous
  – “Chess game” vs. “taxi driving”

(Russell and Norvig 1995)


Why Agents?

• If the system-to-be-built has, during its execution
  – Incomplete data
  – Uncertainty in the assessment of, and interaction with, its environment
  – Inter-dependent episodes of events
  – No full control over the events in the environment
  – An “open world”, instead of a “closed world”

• In other words, agents are used when you need to build a system that is adaptive to an uncertain, dynamic, and at times unexpected environment
  – So you can make full use of the autonomous property of an agent

Why does a person hire an agent?

Education Systems

• Not all computer-aided learning and teaching systems are agent-based, and not all are intelligent

• Systems related to agents focus on two areas:
  – Intelligent User Interfaces
  – Tutors

Educational Applications

• Intelligent User Interfaces deliver information to users in a variety of presentation schemes and interaction schemes

– SAM (Cassell et al. 2000) encourages young children to engage in storytelling by taking turns playing with a figurine and a toy castle in SAM’s virtual reality

– GrandChair (Smith 2000) appears as a young child in a rocking chair that listens to a grandparent’s family stories

– AIA, PIP, IMP, and Magic Monitor (André and Rist 2001): examples of controlling agent behavior in intelligent interfaces

– Herman the Bug (in Design-A-Plant), Cosmo the Internet Advisor, and WhizLow (in a CPU city) (Lester et al. 2002)

Educational Applications

• Intelligent Tutoring Systems provide feedback and support for student learning

– PACT (Koedinger et al. 1997): algebra, geometry, and computer languages

– ANDES (Gertner and VanLehn 2000; VanLehn 1996): physics

– SHERLOCK (Lesgold et al. 1992): electronics

– ATLAS (VanLehn et al. 2000)

– AutoTutor (Graesser et al. 2001): introductory course in computer literacy

– EVELYN Reading Coach and EMILY Reading Coach (Mostow and Aist 2002): Project LISTEN

  – Helps students learn to read by listening to children read aloud

ILMDA: Intelligent Learning Materials Delivery Agent

OVERVIEW

[Architecture diagram: the student interacts with the ILMDA Reasoning module through a computer and GUI, which passes along the student’s historical profile and real-time behavior as a parametric profile of the student and environment. ILMDA Reasoning sends retrieval instructions, profile updates, and statistics updates to a database of lectures, examples, exercise problems, and statistics, and delivers examples and exercise problems to the student in a timely fashion. Together these components form the ILMDA Agent.]

OVERVIEW 2

[Screen-flow diagram: a failed login or quit ends the session, while a new user is sent to the new-user panel and then back to login. A successful login leads to the pick-tutorial panel, and selecting a tutorial opens the tutorial panel. When an example is wanted, the example panel opens; when a problem is wanted, the problem panel opens. Review moves the student back from the problem panel to the example panel, and from the example panel to the tutorial panel. The student may quit from any panel.]
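The screen flow above can be sketched as a table-driven state machine. This is a Python sketch; the state and event names paraphrase the overview diagram and are not ILMDA’s actual identifiers:

```python
# Table-driven sketch of the ILMDA screen flow: (state, event) -> next state.
# State and event names paraphrase the diagram; they are not ILMDA's real identifiers.
TRANSITIONS = {
    ("login", "successful_login"): "pick_tutorial",
    ("login", "failed_login"): "login",
    ("login", "new_user"): "new_user_panel",
    ("new_user_panel", "registered"): "login",
    ("pick_tutorial", "selected_tutorial"): "tutorial_panel",
    ("tutorial_panel", "example_wanted"): "example_panel",
    ("example_panel", "review"): "tutorial_panel",
    ("example_panel", "problem_wanted"): "problem_panel",
    ("problem_panel", "review"): "example_panel",
}
QUIT_STATES = {"login", "pick_tutorial", "tutorial_panel",
               "example_panel", "problem_panel"}

def step(state, event):
    """Advance the session; quitting ends it, unknown events keep the state."""
    if event == "quit" and state in QUIT_STATES:
        return "exit"
    return TRANSITIONS.get((state, event), state)

# A session: log in, pick a tutorial, request an example, then attempt a problem.
state = "login"
for event in ("successful_login", "selected_tutorial",
              "example_wanted", "problem_wanted"):
    state = step(state, event)
print(state)  # problem_panel
```

The table makes each legal screen transition explicit, which is useful when logging student navigation for the dynamic-activity features discussed later.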

METHODOLOGY

• An intelligent agent that
  – interacts with its environment (through its GUI),
  – makes decisions autonomously using case-based reasoning, and
  – learns to improve its performance

• A portable design with a Java-based GUI frontend

• A flexible design with a MySQL database backend

Case-Based Reasoning

• Case-based reasoning (CBR)
  – There is a casebase with numerous cases, each case dealing with a particular problem with a particular solution
    • A case has a problem, a solution, and an outcome
  – The useful part occurs when we encounter a new problem – how do we find a solution?
    • From scratch?
    • Or perhaps, use a previous solution?

CBR 2

• How to use a previous solution?

[CBR cycle diagram: a new problem is matched against the casebase (Retrieve), the retrieved case yields a proposed solution (Reuse), the proposed solution is corrected into a confirmed solution (Revise), and the confirmed solution is stored back into the casebase (Retain).]

CBR 3

• How to use a previous solution?

[Adaptation diagram: given a new problem, find an old but very similar problem in the casebase, determine the difference between the two, and adapt the old, working solution accordingly, producing a changed solution that hopefully works.]
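The retrieve-and-reuse step can be sketched with nearest-neighbor retrieval over numeric problem features. This is a Python sketch: the feature names, case values, and unweighted Euclidean distance are illustrative assumptions, not ILMDA’s actual case schema or similarity measure:

```python
import math

# A case pairs a problem (numeric features) with a solution and an outcome.
casebase = [
    {"problem": {"gpa": 3.8, "time_on_tutorial": 600.0},
     "solution": {"difficulty": 0.7}, "outcome": "correct"},
    {"problem": {"gpa": 2.4, "time_on_tutorial": 150.0},
     "solution": {"difficulty": 0.3}, "outcome": "correct"},
    {"problem": {"gpa": 3.0, "time_on_tutorial": 300.0},
     "solution": {"difficulty": 0.5}, "outcome": "quit"},
]

def distance(p, q):
    """Euclidean distance over the shared numeric features.
    A real system would normalize and weight the features first."""
    return math.sqrt(sum((p[k] - q[k]) ** 2 for k in p))

def retrieve(new_problem):
    """Retrieve the stored case most similar to the new problem."""
    return min(casebase, key=lambda c: distance(new_problem, c["problem"]))

def reuse(case):
    """Reuse the retrieved solution; a real system would adapt it
    based on the difference between the old and new problems."""
    return dict(case["solution"])

new_problem = {"gpa": 2.6, "time_on_tutorial": 180.0}
print(reuse(retrieve(new_problem)))  # {'difficulty': 0.3}
```

Revise and Retain would then check the outcome of the delivered material and store the (problem, solution, outcome) triple back into the casebase.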

CBR in ILMDA

• Problem
  – Student background, student dynamic activity, profile of learning materials

• Solution
  – Parameters specifying a particular type of examples or problems to deliver to the students

• Outcome
  – Whether the student quits, finishes the entire learning material, or answers the problem correctly

CBR in ILMDA 2

• Problem Description
  – Student Background
    • GPA, majors, interests, motivation, self-efficacy, etc.
  – Student Dynamic Activity
    • Time spent on a tutorial, number of mouse clicks, number of times going back and forth from the examples to the tutorial, etc.
  – Learning Material Profile
    • Tutorials, examples, problems
    • Difficulty level, length, number of times viewed, etc.

CBR in ILMDA 3

• Solution Parameters
  – Number of times viewed
  – Difficulty level
  – Average use time in seconds
  – Average mouse clicks
  – Length in characters
  – Content
  – Bloom’s category
  – Scaffolding level (HIGHLIGHT, HINT, REFERENCE, ELABORATION)

Content

• Example
  – An example for the topic of File I/O

Let us say that you are keeping track of your baseball team’s player stats using a Java program, and you want to be able to save these stats to a file. Instead of worrying about using high-level File I/O and (HIGHLIGHT)keeping track of what order things are read in(/HIGHLIGHT), we can use Object I/O and store things much easier! (HINT)Why would using Object I/O make storing things much easier?(/HINT) (HINT)What are the differences between high-level File I/O and Object I/O?(/HINT) (REFERENCE)If you wonder what the differences between high-level File I/O and Object I/O are, please check out the Tutorial page, under the headings of High-Level File I/O and Object I/O!(/REFERENCE)

Content 2

• Example
  – A problem for the topic of File I/O

Difficulty: 0.3
Bloom’s: Knowledge
Content: Generic

Which of the following must be done in order to use (HIGHLIGHT)File Input/Output(/HIGHLIGHT) in Java: (HINT)Think about what you need to use File I/O objects, what reading from or writing to a file may entail, and what you need to do to conclude the process.(/HINT) (REFERENCE)To refresh, check out the Tutorial page, under the File Object heading.(/REFERENCE) (REFERENCE)(Check out the Example page.)(/REFERENCE) (ELABORATION)Think about this: first you need to be able to access IO-related classes and packages. Then when you set up your program, it is possible that a file is not there, or the program does not have permission to read/write, and thus the program must be able to handle those scenarios. Finally, to prevent data loss, you need to make sure that the files are closed.(/ELABORATION)

a. Import the java.io libraries
b. Catch or throw Input/Output Exceptions
c. Close the files when we are done using them
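Scaffolded content like the samples above could be rendered by keeping or stripping the tagged spans according to the chosen scaffolding level. This is a Python sketch; the level ordering and the parenthesized tag syntax are assumptions based on the samples shown, not ILMDA’s actual renderer:

```python
import re

# Ordering assumption for this sketch: each level includes all lower ones.
LEVELS = ["HIGHLIGHT", "HINT", "REFERENCE", "ELABORATION"]

def render(content, scaffolding_level):
    """Keep tagged spans up to the given level; drop higher-level spans."""
    keep = set(LEVELS[:LEVELS.index(scaffolding_level) + 1])
    for tag in LEVELS:
        # Matches spans like (HINT)...(/HINT), as in the sample content.
        pattern = re.compile(r"\(%s\)(.*?)\(/%s\)" % (tag, tag), re.DOTALL)
        if tag in keep:
            content = pattern.sub(r"\1", content)  # keep the span, strip markers
        else:
            content = pattern.sub("", content)     # drop the span entirely
    return content.strip()

text = ("Use Object I/O. (HINT)Why is Object I/O easier?(/HINT) "
        "(ELABORATION)Because it serializes whole objects.(/ELABORATION)")
print(render(text, "HINT"))  # Use Object I/O. Why is Object I/O easier?
```

Varying the level lets the same stored example be delivered with more or less support, which is exactly the kind of solution parameter the CBR module selects.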

CBR in ILMDA 4

• Outcomes
  – Level 0: Did the student quit midway through an example/problem? Did the student fail to answer the problem correctly?
  – Level 1: Do students with exposure to ILMDA perform better, based on a control-treatment study on post-tests?
  – Level 2: Are agents with machine learning capabilities more efficient/effective?
  – Level 3: Are agents able to provide efficient “customized” paths for students of different profiles?

Preliminary Results

• Deployment Fall 2004
  – CSCE155 (CS1)
  – Diverse students (majors, aptitude)
  – 3-hour lectures, 2-hour labs, weekly
  – Four lab sections
    • 20-25 students each
  – Used ILMDA in five labs
    • File I/O, Event-Driven Programming, Exceptions, Inheritance/Polymorphism, and Recursion
    • Each topic has a tutorial, a set of 3-4 examples, and a set of 20-25 problems
    • Students were required to review these learning materials

Preliminary Results 2

• In terms of lab post-tests
  – No significant results

  Section    File I/O  Event-Driven  Exceptions  Inheritance  Recursion
  151        6.89      8.67          8.56        6.44         9.78
  152        6.73      8.09          8.90        6.67         8.00
  151 + 152  6.80      8.35          8.74        6.56         8.84
  153        7.36      8.55          8.73        7.63         8.84
  154        7.96      8.18          8.33        7.78         9.42
  153 + 154  7.66      8.36          8.55        7.70         9.13

  * Averages, maximum 10 points for each post-test

Preliminary Results 3

• Learning vs. non-learning ILMDA
  – What does it all mean?

  Measure                                    Learning ILMDA  Non-Learning ILMDA
  Average number of examples read            2.22            3.05
  Average number of problems read            10.95           15.12
  Percentage of problems answered correctly  66.2%           60.2%
  Average time spent on tutorial (s)         457             276
  Average time spent on examples (s)         64              61
  Average time spent on problems (s)         74              30

Continued Experiments

• Spring 2005
  – 3-hour lectures, 2-hour labs, weekly; five topics
  – Three lab sections
    • 20-25 students each
  – Lab 1: Uses statically sequenced examples and problems
  – Lab 2: Uses non-learning ILMDA
  – Lab 3: Uses learning ILMDA
  – Dependent variables: post-tests, exams, homework assignments
    • Objective measure: how a student reaches one of the most difficult problems and answers it correctly

Summary: Ultimate Goal

• Creating an intelligent tutoring system that is able to learn
  – how to model students correctly
  – how to evaluate the course materials that it has been given
  – how to apply appropriate strategies in its delivery
  – how to integrate all three kinds of learned expertise or experience to tutor students better

• Creating an intelligent tutoring system that does not assume that the student modeling, the course materials, and the delivery strategies are always appropriate or correct

Summary: Current Status

• Prototype in place for two semesters in CS1, with five CS1 topics

• Learning mechanisms in place for learning how to deliver materials better

• Basic research work under way on multi-awareness, to determine the “inappropriate component”

• Online GUI tools
  – For content entry (completed)
  – For student-view statistics queries (completed)
  – For instructor-view statistics queries (completed)
  – For agent-view statistics queries (partially completed)
  – For content-view inspection (completed)