Transcript of "Design Patterns: Supporting Task Design by Scaffolding the Assessment Argument"

Page 1: Design Patterns: Supporting Task Design by Scaffolding the Assessment Argument

Robert J. Mislevy, University of Maryland
Geneva Haertel & Britte Haugan Cheng, SRI International

DR K-12 grant #0733172, "Application of Evidence-Centered Design to State Large-Scale Science Assessment."

NSF Discovery Research K-12 PI meeting, November 10, 2009, Washington, D.C.

This material is based upon work supported by the National Science Foundation under Grant No. DRL-0733172. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Page 2: Overview

• Design patterns
• Background: Evidence-Centered Design
• Main idea: Layers
• Assessment arguments
• Attributes of Design Patterns
• How they inform task design

Page 3: Design Patterns

• Design Patterns in Architecture
• Design Patterns in Software Engineering
• Polti's Thirty-Six Dramatic Situations

Page 4: Messick's Guiding Questions

What complex of knowledge, skills, or other attributes should be assessed?

What behaviors or performances should reveal those constructs?

What tasks or situations should elicit those behaviors?

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13-23.

Page 5: Evidence-Centered Assessment Design

• Organizing formally around the Messick quote
• Principled framework for designing, producing, and delivering assessments
• Conceptual model, object model, design tools
• Connections among design, inference, and processes to create and deliver assessments
• Particularly useful for new / complex assessments
• Useful to think in terms of layers

Page 6: Layers in the Assessment Enterprise

From Mislevy & Riconscente, in press

Domain Analysis: What is important about this domain? What work and situations are central in this domain? What knowledge representations (KRs) are central to this domain?
Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument? Conceptualization.
Conceptual Assessment Framework: Design structures: student, evidence, and task models. Generativity.
Assessment Implementation: Manufacturing "nuts & bolts": authoring tasks, automated scoring details, statistical models. Reusability.
Assessment Delivery: Students interact with tasks, performances are evaluated, feedback is created. Four-process delivery architecture.
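The slide names the four-process delivery architecture without listing the processes; in the ECD literature they are activity selection, presentation, evidence identification, and evidence accumulation. Below is a minimal runnable sketch of that cycle; all function and variable names are ours, for illustration only.

# A minimal sketch of the four-process delivery architecture named on this slide.
# The process names come from the ECD literature; all code names are illustrative.

def select_activity(task_pool, record):
    # 1. Activity selection: pick the next unadministered task
    #    (operational systems may select adaptively).
    return next(t for t in task_pool if t["id"] not in record["seen"])

def present(task):
    # 2. Presentation: the student interacts with the task; a work product results.
    #    Here the stand-in "student" always answers "B".
    return {"task_id": task["id"], "response": "B"}

def identify_evidence(task, work_product):
    # 3. Evidence identification: evaluate the work product into observable variables.
    return {"correct": work_product["response"] == task["key"]}

def accumulate_evidence(record, observations):
    # 4. Evidence accumulation: update the student record
    #    (operational systems update a psychometric model).
    record["score"] += int(observations["correct"])

record = {"seen": set(), "score": 0}
pool = [{"id": 1, "key": "B"}, {"id": 2, "key": "D"}]
for _ in pool:
    task = select_activity(pool, record)
    accumulate_evidence(record, identify_evidence(task, present(task)))
    record["seen"].add(task["id"])
print(record)  # {'seen': {1, 2}, 'score': 1}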

Page 7: Layers in the assessment enterprise (layer diagram repeated from Page 6)

Page 8: Layers in the assessment enterprise (cont.), highlighting Domain Modeling:

• Assessment argument structures
• Design Patterns

Page 9: Layers in the assessment enterprise (cont.), highlighting the Conceptual Assessment Framework:

• Psychometric models
• Automated scoring
• Task templates
• Object models
• Simulation environments

Page 10: Layers in the assessment enterprise (cont.), highlighting Assessment Implementation:

• Authoring interfaces
• Simulation environments
• Re-usable platforms & elements

Page 11: Layers in the assessment enterprise (cont.), highlighting Assessment Delivery:

• Interoperable elements
• IMS/QTI, SCORM
• Feedback / instruction / reporting

Page 12: Toulmin's Argument Structure

[Diagram: Data support ("so") a Claim, justified by ("since") a Warrant resting on Backing, unless Alternative explanations hold.]

Page 13: Assessment Argument Structure

[Diagram: Data concerning the performance support ("so") a Claim about the student, justified by ("since") a Warrant for the assessment argument, unless Alternative explanations hold.]

Page 14: Assessment Argument Structure (cont.)

[Diagram extends Page 13: the claim now rests on both Data concerning the performance and Data concerning the situation.]

Page 15: Assessment Argument Structure (cont.)

[Diagram extends Page 14: both kinds of data arise from the Student acting in the assessment situation; a Warrant for scoring ("since") connects the performance to the data about it, and a Warrant for task design ("since") connects the situation to the data about it.]

Page 16: Assessment Argument Structure (cont.)

[Diagram extends Page 15: adds Other information concerning the student vis-à-vis the assessment situation.]

E.g., near or far transfer, familiarity with tools, assessment format, representational forms, evaluation standards, task content & context.

Not in measurement models, but crucial to inference.
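One way to keep the full structure in view is to write it down as a record type. The following is a minimal illustrative sketch in Python; the class and field names are ours, mirroring the diagram labels, and are not an official ECD or PADI schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class AssessmentArgument:
    # Field names mirror the diagram labels above; illustrative only.
    claim_about_student: str
    data_concerning_performance: List[str]
    data_concerning_situation: List[str]
    warrant_for_assessment_argument: str   # "since": why such data support the claim
    warrant_for_scoring: str               # links the performance to data about it
    warrant_for_task_design: str           # links the situation to data about it
    alternative_explanations: List[str] = field(default_factory=list)  # "unless"
    other_information: List[str] = field(default_factory=list)  # student vis-à-vis situation

arg = AssessmentArgument(
    claim_about_student="Can form a model of an unfamiliar system",
    data_concerning_performance=["coherence of the model diagram produced"],
    data_concerning_situation=["system is unfamiliar; standard modeling tools provided"],
    warrant_for_assessment_argument="Forming a sound model requires the focal KSAs",
    warrant_for_scoring="Diagram coherence is evidence of model formation",
    warrant_for_task_design="An unfamiliar system demands forming, not recalling, a model",
    alternative_explanations=["unfamiliarity with the diagramming tool"],
    other_information=["student used the tool in prior instruction"],
)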

Page 17: PADI Design Patterns

• Structured around assessment arguments
• Substance based on recurring principles, ways of thinking, inquiry, etc.
• E.g., NSES on inquiry, unifying themes
• Science education & cognitive psychology research

Page 18: Some PADI Design Patterns

• Model-Based Reasoning: Model Formation; Evaluation; Revision; Use
• Model-Based Inquiry
• Design under Constraints
• Generate Scientific Explanations
• Troubleshooting (with Cisco)
• Assessing Epistemic Frames (in progress; with David Williamson Shaffer)

Page 19: The Structure of Assessment Design Patterns

ATTRIBUTE: DESCRIPTION
Focal Knowledge, Skills, Abilities: The primary knowledge / skills / abilities (KSAs) targeted by this design pattern.
Rationale: How/why this design pattern addresses evidence about the focal KSAs.
Additional KSAs: Other knowledge / skills / abilities that may be required by tasks.
Characteristic features of tasks: Aspects of assessment situations that are needed to evoke evidence about the focal KSAs.
Variable features of tasks: Aspects of assessment situations that can be varied to shift difficulty or focus.
Potential work products: What students actually say, do, or make to produce evidence.
Potential observations: Aspects of work products we might identify and evaluate as evidence about students' KSAs.
Potential rubrics: Ways of evaluating work products to produce values of observations.
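The attribute table reads naturally as a record type. Here is a hypothetical sketch along those lines; the field names track the table, the example values loosely echo the Model Formation design pattern mentioned on Page 18, and none of this is PADI's actual data model.

from dataclasses import dataclass
from typing import List

@dataclass
class DesignPattern:
    # Fields track the attribute table above; illustrative, not PADI's data model.
    focal_ksas: List[str]               # primary KSAs targeted by the design pattern
    rationale: str                      # how/why the DP yields evidence about focal KSAs
    additional_ksas: List[str]          # other KSAs tasks may also demand
    characteristic_features: List[str]  # situation features needed to evoke evidence
    variable_features: List[str]        # situation features that shift difficulty or focus
    potential_work_products: List[str]  # what students say, do, or make
    potential_observations: List[str]   # qualities of work products to evaluate
    potential_rubrics: List[str]        # ways of scoring work products into observations

model_formation = DesignPattern(
    focal_ksas=["form a model of a system from observations"],
    rationale="Forming models is central to model-based reasoning in inquiry",
    additional_ksas=["content knowledge of the system", "familiarity with modeling tools"],
    characteristic_features=["a system to be modeled", "data or observations to account for"],
    variable_features=["familiarity of the system", "scaffolding provided", "tool used"],
    potential_work_products=["constructed model", "written explanation"],
    potential_observations=["completeness and coherence of the model"],
    potential_rubrics=["rubric for scoring model quality against the data"],
)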

Page 20: How Design Patterns Support Thinking about the Assessment Argument

(This slide and those that follow show the attribute table from Page 19 alongside the assessment argument diagram from Page 16, highlighting one attribute's role at a time.)

Page 21: How Design Patterns Support Thinking about the Assessment Argument (cont.)

The design pattern is organized around Focal KSAs. They will be involved in the Claim, although there may be other KSAs that are included in the target of inference (e.g., Model Formation, but what models, what context?). Associated with Characteristic Features of Tasks.

Page 22: How Design Patterns Support Thinking about the Assessment Argument (cont.)

The Rationale provides background on the nature of the Focal KSAs and on the kinds of things people do, in what kinds of situations, that evidence them. It contributes to the Warrant in the assessment argument.

Page 23: How Design Patterns Support Thinking about the Assessment Argument (cont.)

Additional KSAs play multiple roles. You need to think about which ones you really DO want to include as targets of inference (validity) and which ones you really DON'T (invalidity).

Page 24: How Design Patterns Support Thinking about the Assessment Argument (cont.)

The Additional KSAs you DO want to include as targets of inference are part of the claim, e.g., knowing Mendel's laws as well as being able to formulate a model in an investigation. Connected with Variable Features of Tasks.

Page 25: How Design Patterns Support Thinking about the Assessment Argument (cont.)

The Additional KSAs you DON'T want to include as targets of inference introduce alternative explanations for poor performance. (Especially important for assessing special populations: UDL & accommodations.) Connected with Variable Features of Tasks & Work Products.
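To make the two roles concrete, the following small hypothetical sketch sorts Additional KSAs into those folded into the claim and those designed out (via Variable Features or Work Product choices) or left as alternative explanations. The example KSAs echo ones named in the talk (Mendel's laws, Stella); the mitigation is invented.

# Hypothetical sketch of the two roles of Additional KSAs discussed above.
# Example KSAs echo the talk (Mendel's laws, Stella); the mitigation is invented.

additional_ksas = {
    "knowing Mendel's laws": "construct-relevant",          # include in the claim
    "familiarity with the Stella tool": "construct-irrelevant",
    "reading fluency": "construct-irrelevant",
}

# Variable Features / Work Product choices that can remove an irrelevant demand.
mitigations = {
    "familiarity with the Stella tool": "allow a paper-and-pencil model as the work product",
}

claim_targets, alternative_explanations = [], []
for ksa, role in additional_ksas.items():
    if role == "construct-relevant":
        claim_targets.append(ksa)             # becomes part of the claim (Page 24)
    elif ksa in mitigations:
        print(f"Design out '{ksa}': {mitigations[ksa]}")
    else:
        alternative_explanations.append(ksa)  # remains an "unless" in the argument

print(claim_targets)             # one entry: knowing Mendel's laws
print(alternative_explanations)  # one entry: reading fluency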

Page 26: How Design Patterns Support Thinking about the Assessment Argument (cont.)

The Characteristic Features of Tasks help you think about critical data concerning the situation: what you need in order to get evidence about the Focal KSAs.

Page 27: How Design Patterns Support Thinking about the Assessment Argument (cont.)

Variable Features of Tasks also help you think about data concerning the situation, but now to influence difficulty, or to bring in or reduce demand for Additional KSAs to avoid alternative explanations.

Page 28: How Design Patterns Support Thinking about the Assessment Argument (cont.)

Some Variable Features of Tasks help you match features of tasks to students' backgrounds, knowledge, and characteristics: interests, familiarity, previous instruction.

Page 29: How Design Patterns Support Thinking about the Assessment Argument (cont.)

Potential Work Products help you think about what you want to capture from a performance: product, process, constructed model, written explanation, etc. They can also call attention to demand for Additional KSAs and help avoid alternative explanations (e.g., Stella).

Page 30: How Design Patterns Support Thinking about the Assessment Argument (cont.)

Potential Observations are possibilities for the qualities of Work Products, i.e., the data concerning the performance.

Page 31: How Design Patterns Support Thinking about the Assessment Argument (cont.)

And Potential Rubrics are algorithms/rubrics/rules for evaluating Work Products to get the data concerning the performance.
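Taken together, Pages 21-31 describe a mapping from design pattern attributes to assessment argument components. Continuing the hypothetical DesignPattern sketch from Page 19, that mapping can be transcribed directly:

# Continuing the hypothetical DesignPattern sketch from Page 19: a direct
# transcription of the attribute-to-argument mapping walked through on Pages 21-31.

def argument_skeleton(dp):
    return {
        "claim": dp.focal_ksas,                          # Focal KSAs anchor the claim (Page 21)
        "warrant": dp.rationale,                         # Rationale feeds the warrant (Page 22)
        "alternative_explanations": dp.additional_ksas,  # unwanted Additional KSAs (Page 25)
        "data_concerning_situation":                     # Pages 26-28
            dp.characteristic_features + dp.variable_features,
        "captured_from_performance": dp.potential_work_products,   # Page 29
        "data_concerning_performance": dp.potential_observations,  # Page 30
        "evaluation_procedures": dp.potential_rubrics,   # Page 31
    }

# e.g., argument_skeleton(model_formation) yields the skeleton of the argument
# that a task built from the Model Formation design pattern would support.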

Page 32: For more information…

• PADI: Principled Assessment Designs for Inquiry, http://padi.sri.com
  NSF project, a collaboration with SRI et al.; links to follow-on projects
• Bob Mislevy's home page: http://www.education.umd.edu/EDMS/mislevy/
  Links to papers on ECD and Cisco applications

Page 33: Now for the Good Stuff…

• Examples of design patterns with content: different projects, different grain sizes, different users
• How they evolved to suit the needs of users: same essential structure; representations, language, emphases, and affordances tuned to users and needs
• How they are being used

Page 34: Use of Design Patterns in STEM Research and Development Projects

Britte Haugan Cheng and Geneva Haertel

DRK-12 PI Meeting, November 2009

Page 35: Current Catalog of Design Patterns

• ECD/PADI-related projects have produced over 100 Design Patterns
• Domains include science inquiry, science content, mathematics, economics, and model-based reasoning
• Design Patterns span grades 3-16+
• Organized around themes, models, and processes, not surface features or formats of tasks
• Support the design of scenario-based, multiple-choice, and performance tasks
• The following examples show how projects have used and customized Design Patterns in ways that suit their needs and users

Page 36: Example 1, DRK-12 Project: An Application of ECD to a State, Large-Scale Science Assessment

• Challenge in the Minnesota Comprehensive Assessment of science: how to design scenario-based tasks and technology-enhanced interactions, grounded in standards, both EFFICIENTLY and VALIDLY
• Design Patterns support storyboard writing and task authoring
• Designers are a committee of MN teachers, supported by Pearson
• Project focuses on a small number of Design Patterns for "hard-to-assess" science content/inquiry
• Based on Minnesota state science standards and benchmarks and the NSES inquiry standards
• Design Patterns are Web-based and interactive

Page 37: Design Pattern: Observational Investigation

• Relates science content/processes to components of the assessment argument
• Higher-level, cross-cutting themes, ways of thinking, and ways of using science, rather than many finer-grained standards
• Related to relevant standards and benchmarks
• Interactive Features: examples and details
  • Activates pedagogical content knowledge
  • Presents exemplar assessment tasks
  • Provides selected knowledge representations
• Links among associated assessment argument components

Page 38: Design Pattern: Observational Investigation

Page 39: Design Pattern: Observational Investigation (cont.)

Page 40: Design Pattern: Observational Investigation (cont.)

Page 41: Interactive Feature: Details

Page 42: Interactive Feature: Linking assessment argument components

Page 43: Design Pattern Highlights: Observational Investigation

• Relates science content/processes to components of the assessment argument
• Higher-level, cross-cutting themes, ways of thinking, and ways of using science, rather than many fine-grained standards
• Interactive Features: examples and details
  • Activates pedagogical content knowledge
  • Presents exemplar assessment tasks
  • Provides selected knowledge representations
  • Relates relevant standards and benchmarks
• Links among associated assessment argument components

Page 44: Design Pattern: Reasoning about Complex Systems

• Relates science content/processes to components of the assessment argument
• Across scientific domains and standards
• Convergence among the design of instruction, assessment, and technology
• Interactive Feature: explicit support for designing tasks around a multi-year learning progression

Page 45: Design Pattern: Reasoning about Complex Systems

Page 46: Interactive Feature: Details

Page 47: Interactive Feature: Linking assessment argument components

Page 48: Design Pattern Highlights: Reasoning about Complex Systems

• Relates science content/processes to components of the assessment argument
• Across scientific domains and standards
• Convergence among the design of instruction, assessment, and technology
• Interactive Feature: explicit support for designing tasks around a multi-year learning progression

Page 49: Example 2, Principled Assessment Designs for Inquiry: Model-Based Reasoning Suite

• Relates science content/processes to components of the assessment argument
• A suite of seven related Design Patterns supports curriculum-based assessment design
• Theoretically and empirically motivated by Stewart and Hafner (1994), Research on problem solving: Genetics. In D. L. Gabel (Ed.), Handbook of research on science teaching and learning. New York: Macmillan.
• Aspects of model-based reasoning, including model formation, model use, model revision, and coordination among aspects of model-based reasoning
• Multivariate student model: scientific reasoning and science content
• Interactive Feature: supports the design of both independent tasks associated with an aspect of model-based reasoning and steps in a larger investigation comprising several aspects, including model conceptualization, model use, and model evaluation

Page 50: Design Pattern: Model Formation

Page 51: Design Pattern: Model Formation (cont.)

Page 52: Interactive Feature: Links among Design Patterns

Page 53: Design Pattern Highlights: Model-Based Reasoning Suite

• Relates science content/processes to components of the assessment argument
• Facilitates the integration of model-based reasoning skills into any science content area
• Serves as the basis of a learning progression
• Interactive Features: support the design of both independent tasks associated with an aspect of model-based reasoning and steps in a larger investigation comprising several aspects, from conceptualization of a model to its use and evaluation
• Explicit supports (links among Design Patterns) for designing both investigations and focused tasks

Page 54: Example 3, Principled Science Assessment Designs for Students with Disabilities: Designing and Conducting Scientific Investigations Using Appropriate Methodology

• Relates science content/processes to components of the assessment argument
• Guides refinement of science assessment tasks across multiple states by identifying and reducing sources of construct-irrelevant variance
• Integrates six categories of Universal Design for Learning (UDL) into the assessment design process: perceptual, linguistic, cognitive, motoric, executive, affective
• Interactive Feature: highlights relationships among Additional KSAs, Variable Task Features, and Potential Work Products to reduce construct-irrelevant variance in a systematic manner (see the sketch below)
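A hypothetical sketch of what that systematic linkage could look like: each construct-irrelevant Additional KSA is tagged with one of the six UDL categories named above, and the tag drives the choice of Variable Task Feature or Work Product adjustment. The category names are from the slide; the KSAs and adjustments are invented for illustration.

# Hypothetical sketch of the UDL linkage described above. The six category names
# are from the slide; the tagged KSAs and feature adjustments are invented.

UDL_CATEGORIES = {"perceptual", "linguistic", "cognitive", "motoric", "executive", "affective"}

# Tag each construct-irrelevant Additional KSA with a UDL category...
aksa_tags = {
    "decoding dense written directions": "linguistic",
    "fine motor control for drag-and-drop": "motoric",
    "holding multi-step directions in mind": "executive",
}

# ...and let the tag drive the Variable Task Feature / Work Product adjustment.
adjustments = {
    "linguistic": "plain-language directions with an optional read-aloud",
    "motoric": "keyboard (or single-switch) alternative to drag-and-drop",
    "executive": "persistent on-screen checklist of the task's steps",
}

for aksa, category in aksa_tags.items():
    assert category in UDL_CATEGORIES
    print(f"{aksa} [{category}] -> {adjustments[category]}")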

Page 55: Design Pattern: Designing and Conducting a Scientific Investigation Using Appropriate Methodology

Page 56: Design Pattern: Designing and Conducting a Scientific Investigation Using Appropriate Methodology (cont.)

Page 57: Interactive Feature: Linking Additional KSAs and Potential Work Products

Page 58: Design Pattern Highlights: Designing and Conducting a Scientific Investigation Using Appropriate Methodology

• Relates science content/processes to components of the assessment argument
• Integrates UDL into the assessment design process rather than applying accommodations to an existing task
• Supports the selection of task features that reduce construct-irrelevant variance and enhance the performance of all test takers
• Particular attention to knowledge representation and executive processing demands
• Further customization of Design Patterns to develop assessment tasks for students with particular disabilities
• Interactive Feature: relates the perceptual and expressive capabilities required to complete an assessment task to that task's features (Additional KSAs, Variable Task Features, and Potential Work Products)

Page 59: Example 4, Alternate Assessments in Mathematics: Describe, extend, and make generalizations about geometric and numeric patterns

• Relates math content/processes to components of the assessment argument
• Standards-based Design Patterns co-designed across three states to guide the development of statewide assessment tasks for students with significant cognitive disabilities
• Integration of six UDL categories into the design process
• Interactive Feature: for logistical reasons, a Word document was used to create the Design Patterns
• Attributes visualized in accordance with the assessment argument, resulting in increased efficiency and improved quality of argument
• A new arrangement is now under development for use in an online system

Page 60: Design Pattern: Describe, extend, and make generalizations about geometric and numeric patterns

Page 61: Design Pattern: Describe, extend, and make generalizations about geometric and numeric patterns (cont.)

Page 62: Design Pattern: Describe, extend, and make generalizations about geometric and numeric patterns (cont.)

Page 63: Design Pattern: Describe, extend, and make generalizations about geometric and numeric patterns (cont.)

Page 64: Interactive Feature: Horizontal View, Aligning Focal KSAs, Potential Observations, and Potential Work Products

Page 65: Interactive Feature: Horizontal View, Aligning Additional KSAs and Variable Task Features

Page 66: Design Pattern Highlights: Describe, extend, and make generalizations about geometric and numeric patterns

• Relates math content/processes to components of the assessment argument
• Deconstruction of NCTM expectations to identify KSAs that are less difficult, or tasks that assess related cognitive background knowledge
• Supports the principled alignment of task difficulty and scope with challenges to accessibility
• Interactive Feature: use of multiple views of the Design Pattern to support understanding of the relationships among components of the assessment argument
• Increased efficiency of design and validity of the assessment argument

Page 67: Summary

• Design Patterns are organized around assessment arguments and key ideas in science and math, as opposed to surface features of assessment tasks
• Support designing tasks that move in the directions NSES and NCTM advocate, in ways that build on research and experience
• Design Patterns support task design for different purposes and different formats (e.g., learning, summative, classroom, large-scale, hands-on, paper-and-pencil, simulations)
• Especially important for newer forms of assessment:
  • Technology-based, scenario-based tasks in Minnesota
  • Scenario-based learning & assessment (Foothill-DeAnza project)
  • Simulation-based tasks (network troubleshooting, with Cisco)
  • Games-based assessment (just starting, with MacArthur project)

Page 68: Summary (cont.)

• Design Patterns are eclectic: they are not associated with any particular underlying theory of learning or cognition; all psychological perspectives can be represented
• Document design decisions
• Represent hierarchical relationships among Focal KSAs, sequential steps required for the completion of complex tasks, or superordinate, subordinate, and coordinate relations among concepts
• Re-usable: a family of assessment tasks can be produced from a single Design Pattern
• Enhance the integration of UDL with the evidence-centered design process
• Technology makes evident the relationships among Design Pattern attributes and their role in the assessment argument