Choice-based assessments
Dan Schwartz
http://aaalab.stanford.edu

Assessment and Instruction

Tests often favor full-blown declarative and procedural knowledge. They miss the value of many other types of learning, and pull for the “mastery” of facts and steps. Preparation-for-future-learning (PFL) tests can lift out the value of non-paradigmatic learning experiences. The tests include opportunities to learn, and we measure whether prior learning experiences prepare people to learn from those opportunities.

Experience and Explanation

Hidden value of videogames. Dylan Arena, Kidaptive.com

Arena, D. A., & Schwartz, D. L. (2013). Experience and explanation: Using videogames to prepare students for formal instruction in statistics. Journal of Science Education and Technology.

Interactive PFL

A limitation of PFL research has been the static nature of the learning resources in the PFL tests: lecture, worked example, text. A good deal of learning comes through interaction. Self-directed, interactive learning involves a series of choices. Students who are prepared for future learning should make good choices about learning.

Choice as the proper outcome of education.

• No time here for a deeper discussion of the normative goals of education and the appropriateness of measuring choice.
• But see the book on this topic (free download at MIT Press).
• Here, we just present an extended example for consideration.

Choicelets: Measure learning choices.

Posterlet game flow:
1) Choose Booth
2) Design Poster
3) Choose Focus Group
4) Choose − or + Feedback
5) Read Feedback (Test Poster)
6) Re-design Poster (if the player chooses to revise; otherwise continue)
7) Post Poster (see Ticket Sales)

Posterlet: Doris Chin, Kristen Blair, Maria Cutumisu
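The flow above amounts to a short loop around the feedback and revision choices. A minimal sketch, assuming one revise decision per feedback cycle; the function names are hypothetical stand-ins, not the game's actual code:

```python
def play_round(design_poster, get_feedback, wants_revision, max_revisions=3):
    """One Posterlet cycle: design, read feedback, optionally revise, then post.

    The callables are hypothetical stand-ins for UI-driven steps;
    booth and focus-group choices are omitted for brevity.
    """
    poster = design_poster()
    revisions = 0
    while revisions < max_revisions:
        feedback = get_feedback(poster)   # steps 4-5: choose - or + feedback, read it
        if not wants_revision(feedback):  # Revise? No -> exit loop and post
            break
        poster = design_poster()          # step 6: re-design the poster
        revisions += 1
    return poster, revisions              # step 7: post poster

# Toy usage with stub callables: revise once, then stop.
choices = iter([True, False])
poster, n = play_round(lambda: "draft", lambda p: ["I don't like the font"],
                       lambda fb: next(choices))
print(n)  # 1
```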

Feedback

Both positive (“I like…”) and negative (“I don’t like…”) feedback are informative in the game. We built a system that evaluates 21 graphic design elements. Negative does not mean punishing.

The literature indicates that negative feedback is better for learning, but it carries a risk of ego threat.

To my knowledge, nobody has examined the choice to seek negative feedback. It is central to many “design thinking” pedagogies.

Research design

We managed to get about 450 children at two “game-themed” Quest schools, in NYC and Chicago.

Children played Posterlet for about 20 minutes. We tracked:
• Number of times they chose negative feedback (0–9)
• Number of times they revised (0–3)
• Quality of posters (based on 21 features)
• Performance on a posttest of 21 graphic design principles
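The tracked variables above can be captured as a simple per-child record. A sketch with illustrative field names (not the study's actual schema):

```python
from dataclasses import dataclass

@dataclass
class PosterletRecord:
    """One child's tracked play data (field names are illustrative)."""
    negative_feedback_count: int  # times negative feedback was chosen, 0-9
    revision_count: int           # times the poster was revised, 0-3
    poster_quality: float         # scored on 21 graphic design features
    posttest_score: float         # posttest of 21 graphic design principles

    def __post_init__(self):
        # Enforce the ranges described in the study design.
        assert 0 <= self.negative_feedback_count <= 9
        assert 0 <= self.revision_count <= 3

child = PosterletRecord(negative_feedback_count=5, revision_count=2,
                        poster_quality=14.0, posttest_score=17.0)
print(child.negative_feedback_count)  # 5
```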

Step 1: Internal Validation

For 2 + 2, there is no argument that 4 is the correct answer. With choices, the story is different. People can rightly argue, “Who are you to say that choosing negative feedback is the right choice for learning?” To address this, we examine correlations between choices and measures of learning or performance.

Evidence on Internal Validation (choosing negative feedback leads to better learning)

                     Revision   Poster Quality   Post-test
Negative Feedback    .48***     .28***           .23***
Revision                        .34***           .24***
Poster Quality                                   .39***

There were no differences on the first poster, so the effect is not due to some sort of prior knowledge of graphic design principles.
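The internal-validation step amounts to computing plain Pearson correlations between choice counts and learning measures. A minimal sketch of that computation on made-up play records (the numbers are illustrative, not the study's data):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-child records:
# (negative-feedback count 0-9, revisions 0-3, posttest score)
records = [(7, 3, 18), (2, 1, 10), (5, 2, 15), (0, 0, 8), (9, 3, 20), (4, 1, 12)]
neg, rev, post = (list(col) for col in zip(*records))

print(round(pearson_r(neg, post), 2))  # negative-feedback choices vs. posttest
print(round(pearson_r(rev, post), 2))  # revision choices vs. posttest
```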

Step 2: External Validation

Does behavior in the game world say anything about “real” world learning?

The Posterlet finding may not reach outside games. Motivations for various choices in a game may not reflect motivations in the world.

External Validation (we received standardized test scores for a subset of the students)

                     NY (n = 119)                    Chicago (n = 65)
                     Reading (ELA)   Math (NY Math)  Reading (ISAT)   Math (ISAT)
Negative Feedback    .33***          .39***          .41***           .33**
Revision             .08             .28**           .31*             .21

Nice to have a replication! Replication is missing from many data-mining efforts, not to mention the very idea of external validation.

Step 3: Experience Validation

Ultimately, we want to measure experiences, not individuals. We want to help improve instruction and to flag kids who could use some attention.

To know whether we can measure the effects of experience, we need to know whether experience has an influence on the assessment.

Experience Validation

Summary

The choice to seek negative feedback correlated with better learning. Is there another demonstration like this out there?

Seeking negative feedback seems to be a more stable correlate of learning than choosing to revise.

Like all assessments, if you do not have control over the experience, it is hard to know what experience you are measuring.

Current Activities

Now that we can measure learning choices (so far, at least), we want to show that it is possible to create experiences that influence choice behaviors.

We are working to build brief design-thinking curricula and a complementary suite of design-thinking choicelets, and to conduct an experiment in which some students get the curricula and some do not.

We’ll see. In the meantime…

Emerging Design Principles

PFL Principle
• There must be something new to learn within the assessment.

Choice Principles
• Learning must depend on making a choice.
• The game can be completed without choosing to learn. (It is an unforced choice.)

Typical Performance Principle
• The game should be designed to measure typical behavior, not test behavior.

Validation Principles
• Internal – show the choice leads to learning in the game.
• External – show the choice correlates with learning outside the game.
• Experimental – show experiences outside the game can influence choices.

Data Abstraction Principles
• Conceptualize the game as having different “regions” of choice.
• Aggregate regions of choices within rounds, and analyze rounds sequentially.
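The data-abstraction principle above can be sketched as a small aggregation routine: tag each logged choice with its round and region, count within (round, region), and then read the rounds in order. The event names and round structure here are hypothetical, not taken from the actual Posterlet logs:

```python
from collections import Counter

# Hypothetical event log: (round, choice_region, value).
# Regions group related choices, e.g. feedback-valence picks and revise decisions.
events = [
    (1, "feedback", "negative"), (1, "feedback", "positive"),
    (1, "feedback", "negative"), (1, "revise", "yes"),
    (2, "feedback", "positive"), (2, "feedback", "positive"),
    (2, "feedback", "negative"), (2, "revise", "no"),
    (3, "feedback", "negative"), (3, "feedback", "negative"),
    (3, "feedback", "negative"), (3, "revise", "yes"),
]

def aggregate_by_round(log):
    """Count choices per (round, region) so rounds can be analyzed sequentially."""
    rounds = {}
    for rnd, region, value in log:
        rounds.setdefault(rnd, Counter())[(region, value)] += 1
    return rounds

per_round = aggregate_by_round(events)
# Sequential view: negative-feedback choices per round.
neg_by_round = [per_round[r][("feedback", "negative")] for r in sorted(per_round)]
print(neg_by_round)  # [2, 1, 3]
```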