
Assessment in Support of Learning

Planning Instructional Interventions

Pearson Assessment Training Institute
23rd Annual Summer Conference, July 7, 2016

Jan Chappuis
Portland, OR
www.janchappuis.com
@janchappuis


Seven Strategies of Assessment for Learning 

“Innovations that include strengthening the practice of formative assessment produce significant and often substantial learning gains.”

--Black & Wiliam, 1998b, p. 140  

D. R. Sadler’s “Indispensable Conditions” for Improvement in Learning 

The student: 

Comes to hold a concept of quality roughly similar to that of the teacher 

Is able to monitor continuously the quality of work produced during the act of production 

Has a repertoire of alternative moves or strategies to employ when faced with incomplete mastery

‐‐Sadler, 1989, p. 21 

 

Where Am I Going? 

Strategy 1: Provide a clear and understandable vision of the learning target. 

Strategy 2: Use examples and models of strong and weak work.  

Where Am I Now? 

Strategy 3: Offer regular descriptive feedback during the learning.  

Strategy 4: Teach students to self‐assess and set goals for next steps.  

How Can I Close the Gap? 

Strategy 5: Use evidence of student learning needs to determine next steps in teaching. 

Strategy 6: Design focused instruction, followed by practice with feedback.  

Strategy 7: Provide opportunities for students to track, reflect on, and share their learning progress. 

 Chappuis, J. 2015. Seven Strategies of Assessment for Learning, 2e. Upper Saddle River, NJ: Pearson, p.11. 

 References and Resources 

Black, P. & D. Wiliam. 1998a. Assessment and classroom learning. Assessment in Education, 5(1), 7–74.

Black, P. & D. Wiliam. 1998b. Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139–148.

Chappuis, J. 2015. Seven strategies of assessment for learning, 2e. Upper Saddle River, NJ: Pearson.

Chappuis, J., R. Stiggins, S. Chappuis, & J. Arter. 2012. Classroom assessment for student learning: Doing it right—using it well, 2e. Upper Saddle River, NJ: Pearson.

Sadler, D.R. 1989. Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144.


©2016 J. Chappuis/Assessment in Support of Learning/www.janchappuis.com Page 3

Planning Instructional Interventions  

When we teach thoughtfully, we actively seek evidence of what students do not “get.” We use assessment processes and instruments with sufficient instructional traction to identify specific learning needs for each student throughout a unit or teaching cycle. And we make sure our diagnostic assessments don’t just tell us “Do something.” We ensure they help answer the question “Do what?”

--Chappuis, 2015, p. 203

Learning is an unpredictable process, and instructional correctives are part of the normal flow of attaining mastery in any field. If we are still teaching as though “plan, instruct, and assess” will cause learning, we will miss the heart of teaching and the whole intent of formative assessment. In this session, we will examine ways to elicit evidence of student achievement to guide the selection of instructional next steps.

Effective teachers build a “feedback loop” into their instructional sequence by doing three things:

1. They figure out where students are in their learning and what students’ learning needs are throughout the instructional sequence for a given learning target.

2. They have a repertoire of teaching strategies from which to choose to develop students’ capabilities. And they apply these strategies based upon the specific learning needs of their students.

3. They plan time in their instructional sequence to take action on the basis of that information.

Discussion Question:

Think about one unit, topic, or concept you have taught that you can predict students will encounter difficulty with. What problems will they typically have?

Information in this handout comes from Strategies 5 and 6 in the text Seven Strategies of Assessment for Learning, 2e, by Jan Chappuis, published by Pearson in 2015.


Three Types of Learning Needs  

Errors Due to Incomplete Understanding 

Students have partial understanding. They haven’t learned it wrong, they just haven’t learned it yet. 

Errors Due to Flaws in Reasoning 

Students carry out the pattern of reasoning with one or more missteps. They have taken a reasoning “wrong turn.” 

Errors Due to Misconceptions 

Students have internalized a concept or explanation of a phenomenon that they believe to be true, but that does not match current best thinking. 

  1.  Incomplete Understanding 

The. sore. beGis. The. PEPLo. in. The. TEnt. VApir. bAT

Your example of an error due to incomplete understanding:

 How to address errors due to incomplete understanding: 

From the section titled “Do What?” on page 207 in the Seven Strategies text: When student work exhibits incomplete understanding, next step instruction takes into account that a student hasn’t necessarily inaccurately learned what’s being taught, she just hasn’t learned it completely yet. This type of learning need is fairly straightforward to address. For example, with the primary students using periods between words, the instructional next step is not to mark the periods as wrong, but rather to move students into stringing words together as units of thought and then helping them to use periods where an idea ends. Basically, they know as much about a period as they can deal with and the next step is to teach them more about getting ideas into writing before circling back to periods. So in general, next steps with errors due to incomplete understanding involve further instruction rather than reteaching or correction.  

   


2.  Flaws in Reasoning 

Patterns of reasoning adhere to rules for their successful application and also have inherent in them typical missteps. For example, when students are asked to summarize, they may include details that are not central to the main idea. When they are asked to generalize, they may make a statement that is true for the observed evidence but does not extend to cover a broader array of instances, missing the essence of generalizing (e.g., “All dogs I have seen have four legs”). Or, they may make a statement that is true for the observed evidence but covers too broad an array of other instances, thus overgeneralizing (e.g., “All dogs have four legs”). When asked to determine the cause for an effect, they may offer another effect instead of a cause. These are examples of typical flaws in reasoning.

Your example of an error due to a flaw in reasoning:

How to address errors due to flaws in reasoning:  

From the section titled “Do What?” on page 208 in the Seven Strategies text: To overcome flaws in reasoning, it is important to help students first identify the flaw. A good way to do this is to use Strategies 1 and 2. Share with students a complete definition of the pattern of reasoning, introduce them to the rubric you will use to assess that pattern of reasoning, and then have them examine some strong and not‐so‐strong examples and decide, based on the rubric, what the strengths and flaws are in those examples. Practice with Strategies 1 and 2 helps them recognize a well‐executed reasoning process and avoid or correct reasoning flaws they may otherwise demonstrate in their initial attempts at mastery.

 

3.  Misconceptions 

Misconceptions, also known as alternative conceptions, arise when students have either learned something inaccurately or internalized an explanation for a phenomenon that does not fit with current best thinking.

For example, when primary students are learning about the solar system, one misconception they typically bring to the table is that the phenomenon of night and day is a result of the movement of the sun. (“The sun comes up and the sun goes down.”) You can teach that it is the earth’s rotation that causes our side of the planet to face the sun during the day and then face away from the sun at night. You can demonstrate this using playground balls and a flashlight, papier-mâché heavenly bodies strung from the ceiling, and simulations (“Partner A‐‐you are the sun; Partner B‐‐you are the earth.”), but if you don’t also address the misconception, when you ask students to draw the process of how we get night and day, you will still see drawings of the sun coming up and the sun going down. Students have simply put the new concept on top of an existing understanding, and the existing understanding has not gone away.

An example of a typical misconception for secondary students is that support for an opinion can be adequately provided by further opinions.

Misconceptions can also present themselves as misapplication of a rule. For example, in language arts one very common misapplication of a rule concerns correct use of subject and object pronouns. Students learn that “Him and me went to the store” is not correct, but the rule they tend to internalize is that “me” should be used sparingly. And as a result they often use “I” when they should use “me”: “You can give it to him and I.” They are operating with rules, just not the correct ones.

Your example of an error due to a misconception:

 

How to address errors due to misconceptions:  

From the section titled “Do What?” on pages 209–210 in the Seven Strategies text: The challenge with misconceptions is to correctly identify them and then plan lessons to dislodge them. Misconceptions are stubborn. They can’t be corrected by papering over them. To illustrate this, let’s work with a misconception from middle school science. Newton’s first law of motion states that a force is not needed to keep an object in motion, yet many students (and adults) will tell you that if an object is in motion, it will require a force to stay in motion, which seems like common sense, and is what Aristotle thought, by the way. Memorizing the principles of the first law‐‐“an object at rest will stay at rest” and “an object will continue with constant velocity unless acted on by an unbalanced force”‐‐is generally not enough to overcome what our senses tell us about force and motion: if you want a book to keep moving across a table, you have to keep pushing it.

To help students understand why a force is not needed to keep an object in motion, we need to use a teaching strategy specifically designed to dig out misconceptions. One approach derived from research on science misconceptions (Hattie, 2009, p. 148) is to do the following:

1. Create an initial awareness of the misconception by generating cognitive dissonance in the students’ minds—provide them with an experience (a demonstration, a reading passage, or some combination of the two) that runs counter to or contradicts the misconception in some way.

2. Engage students in discussion to uncover the misconception and to contrast it with the correct interpretation.

3. Finish by having students explain why the misconception is incorrect, when they are able to do so.

Misconceptions, whether in science, social studies, mathematics, language arts, or any other discipline, require an intentional approach tailored to the nature of the misconception, because the teaching challenge is to cause conceptual change—to have students give up the inaccurate conception they currently hold in favor of an accurate one. Students have to consciously dismantle the old rule or understanding to ensure that the new one takes hold in long‐term memory.

In the case of misconceptions, we can minimize the need to reteach by helping students to recognize and correct them at the outset of instruction or during the learning—whichever is best suited to the misconception you are working with. One simple approach is to list the misconceptions, hand the list out to students, and, as instruction confronts them, periodically ask students to mark, date, and explain those they can now correct (Figure 5.6). Or, you might make a list of major conceptual understandings you will address in a given unit, mixing in statements reflecting misconceptions students typically have. Ask students to mark “True,” “False,” or “Unsure” next to each one. Periodically, hand out a fresh copy of the list and have students revisit the statements related to what you have taught to that point, marking the statements as either true or false, accompanied by an explanation: “I think it is true/false because ...” Figure 5.7 shows a variation of this activity with a seventh‐grade mathematics unit.
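The “mark, date, and explain” checklist routine described above can be pictured as a simple record structure. The sketch below is a hypothetical Python illustration (the class and field names are mine, not from the text): each statement holds the teacher’s key plus a dated trail of student marks and explanations, and counts as corrected once the latest mark matches the key and carries an explanation.

```python
from dataclasses import dataclass, field

@dataclass
class Statement:
    text: str
    is_true: bool            # teacher's answer key
    responses: list = field(default_factory=list)  # (date, mark, explanation)

    def record(self, date, mark, explanation=""):
        self.responses.append((date, mark, explanation))

    def resolved(self):
        """Corrected once the latest mark matches the key with an explanation."""
        if not self.responses:
            return False
        _, mark, explanation = self.responses[-1]
        return mark == self.is_true and bool(explanation)

# Hypothetical unit list with one typical physics misconception
unit = [
    Statement("A force is needed to keep an object in motion.", False),
    Statement("An object at rest stays at rest unless acted on by a force.", True),
]

unit[0].record("10/03", True)   # initial response reflects the misconception
unit[0].record("10/17", False, "Friction, not the absence of force, slows the book.")
print([s.resolved() for s in unit])  # → [True, False]
```

The second statement stays unresolved simply because no response has been recorded yet, mirroring the “revisit periodically” step in the handout.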

Correcting Misconceptions 

 

 Source: Chappuis, J. 2015. Seven Strategies of Assessment for Learning, 2e. Upper Saddle River, NJ: Pearson Education, p. 222.   


Sources of Information for Diagnosing Types of Learning Needs

Informal Options

Note types of problems in student work 

Elicit explanations from students 

Ask probing questions 

Formal Options 

Items on an assessment that have “instructional traction”

Rubrics with “instructional traction” 

Discussions with Colleagues

  

Learning Team Activity: Discussions with Colleagues

Activity Directions:

1. List the learning targets for a unit you are preparing to teach.
2. Think of the difficulties students encounter (or will encounter based on past experience) with each learning target. List the difficulties for each learning target.
3. Classify the cause of each difficulty as Incomplete Understanding (IU), Flaw in Reasoning (FR), or Misconception (M).
4. For each difficulty, estimate its frequency of occurrence. Does this always happen? Sometimes? Once in a while? Also determine importance—how significant a hindrance to learning is the difficulty?
5. Ask one or more colleagues teaching the same content to do this same analysis.
6. Meet with your colleague(s) to compare lists. Add their difficulties to your list if you think your students also have them.

Once you have a complete list, select the incomplete understandings, flaws in reasoning, and misconceptions you want to address in either whole‐class or grouped instruction. Consider frequency and importance in your deliberation. Discuss ways to address those problems for which you won’t design whole‐class or grouped instruction.

 

Learning Target: ______________________

Description of Difficulty | Type (IU / FR / M) | Frequency | Importance (High / Med / Low)
______________________ | ______ | ______ | ______
______________________ | ______ | ______ | ______
______________________ | ______ | ______ | ______

 Source: Chappuis, J. 2015. Seven Strategies of Assessment for Learning, 2e. Upper Saddle River, NJ: Pearson Education, p. 229 (Activity 5.4) 
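The classify-then-prioritize steps in the activity above can be sketched as data: record each difficulty with its type, frequency, and importance, then rank to choose candidates for whole-class or grouped instruction. A minimal Python illustration with made-up entries (the scoring scheme is my own, not from the text):

```python
# Each entry: (description, type, frequency 0-2, importance 0-2).
# The entries and the 0-2 scales are hypothetical illustrations.
difficulties = [
    ("Uses 'I' where 'me' belongs",          "M",  2, 1),
    ("Summary includes non-central details",  "FR", 1, 2),
    ("Periods placed between words",          "IU", 2, 2),
]

# Highest combined frequency + importance first
ranked = sorted(difficulties, key=lambda d: d[2] + d[3], reverse=True)
for desc, kind, freq, imp in ranked:
    print(f"{kind}: {desc} (freq={freq}, imp={imp})")
```

Sorting is only a stand-in for the team’s deliberation; the handout asks you to weigh frequency and importance in discussion, not mechanically.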


Diagnostic Assessments with Instructional Traction

An assessment itself can yield solid diagnostic information when its results provide evidence that points to specific learning needs. Such assessments are known as instructionally tractable (Andrade, in McMillan, ed., 2013, p. 29). Not all assessments will work this way, even some purported to be formative in nature. Many assessments yield only a score, which can be used to indicate whether initial instruction was effective and, if it was not, that further work on the part of the teacher and the student is needed (Wiliam, in McMillan, ed., 2013, p. 211). Simply identifying that a particular topic needs attention (e.g., “these students have problems with fractions”) does not give an assessment sufficient instructional traction, because it is not capable of diagnosing the specific learning need. Selecting or developing assessments with instructional traction requires that we be able to recognize it when we see it.

Answer this question with a partner:

1. Which fraction is largest?

a) 1/3 b) 2/5 c) 7/9 d) 40/43

What are typical ways students can get a wrong answer to this item?

Next, answer this one:

2. Which fraction is largest?

a) 2/1 b) 3/8 c) 4/3

Which answer choices reflect which typical misconceptions?   

What would happen if the answer choices looked like this?

3. Which fraction is largest?

a) 2/1 b) 3/8 c) 4/3 d) 40/72

Assessments of all types—selected response, written response, performance assessment, and personal communication—can be designed to provide results that are instructionally tractable, both as a formal assessment event and as an informal lesson‐embedded activity. There are commercially available assessments that have been designed specifically for this purpose. For example, the DIAGNOSER Web‐based assessment system designed by Minstrell and his colleagues is constructed so that each answer choice points to a different science conception (Cowie, in McMillan, ed., 2013, p. 479). Researchers at Educational Testing Service have developed an item bank known as Diagnostic Items in Mathematics and Science that is composed entirely of items whose wrong answer choices reflect research‐based misconceptions and incomplete understandings. If we intend to use assessment results formatively, we should require this feature of the assessments provided to us or design them with this feature ourselves.
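One way to picture an instructionally tractable item is as a mapping from each answer choice to the learning need it suggests. The sketch below uses the third fraction item; the diagnosis labels attached to the distractors are my own illustrative guesses, not taken from the text.

```python
# Hypothetical encoding of a selected-response item where every wrong
# choice is tied to a suggested learning need (labels are illustrative).
item = {
    "stem": "Which fraction is largest?",
    "key": "a",
    "choices": {
        "a": ("2/1",   None),  # correct answer
        "b": ("3/8",   "misconception: a larger denominator means a larger fraction"),
        "c": ("4/3",   "incomplete understanding: compares numerators without denominators"),
        "d": ("40/72", "misconception: bigger numbers mean a bigger fraction"),
    },
}

def diagnose(item, answer):
    """Return None for a correct response, else the suggested learning need."""
    _, need = item["choices"][answer]
    return need

print(diagnose(item, "d"))  # prints the learning need tied to choice d
```

A score-only item would stop at right/wrong; here the wrong answer itself carries the “Do what?” information the handout calls for.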


Creating Distractors with Instructional Traction

One way to elicit the distractors that indicate typical misunderstandings and misconceptions is to pose the question as a fill‐in exercise or an open‐ended question. Say, for example, I wanted to write some good diagnostic items for the learning target “Select the types of portfolio(s) best suited to your students’ needs” for part of a course on classroom assessment. I have deconstructed the learning target into the following components and created the following assessment blueprint.

Construct | Practice Item | Quiz Item | Exercise
Definition | 1 | | 10.2
Purpose | 2 | 1 |
Types of portfolios | 3, 4, 5 | 2, 3 |
Annotations | | 4 |
Student involvement in content selection | 6 | |

 

To come up with typical misconceptions or instances of partial understanding for the definition of a portfolio, I could ask students to define the term portfolio. I would then examine their responses and look for commonalities in wrong or partially wrong answers. Students would have an opportunity to read and discuss the course material related to what a portfolio is and isn’t, and then I would create a multiple‐choice item with distractors that diagnose the type of problem the students might still have.

1. Which of the following is the best definition of a portfolio?

a) An organizing tool for a collection of artifacts
b) An authentic assessment method for products and performances
c) A showcase of most important accomplishments
d) A part of proficiency requirements to qualify for graduation

One problem teachers typically have in setting up portfolios is in not first considering the type of portfolio best suited to what they want it to do. A related problem is not being clear about what they want it to do in the first place. These are both problems related to the portfolio’s purpose. To get at this potential problem, I might write an item like this:

2. Which is the foremost consideration when selecting the contents for a portfolio?

a) How the portfolio will be used
b) The types of learning targets you are teaching
c) What the student wants in the portfolio
d) Which samples of the student’s work are best


Explanation of the Distractors

If I write a short explanation of why each incorrect answer is incorrect, the answer key can function as a teaching tool—students can use it to understand why their wrong answers are wrong.

I can also ask them to answer the question “How do you know your answer is correct?” after they complete each item to get at the thinking behind their answer choices. This helps them articulate the reason they have selected a particular answer and it helps me appropriately address errors due to incomplete understanding, flaws in reasoning, and misconceptions. 

Question #1:

Feedback to answer choice a (the correct answer): A portfolio is simply a repository of artifacts chosen purposefully to tell a story.

Feedback to answer choice b (an example of a misconception):

Portfolios are not an assessment method. A portfolio is simply a repository of artifacts chosen purposefully to tell a story.

Feedback to answer choice c (an example of an incomplete understanding):

One purpose for portfolios is to showcase important accomplishments, but that is not the definition. A portfolio is simply a repository of artifacts chosen purposefully to tell a story.

Feedback to answer choice d (an example of an incomplete understanding):

One purpose for portfolios is to demonstrate proficiency, but that is not the definition. A portfolio is simply a repository of artifacts chosen purposefully to tell a story.

Question #2:

Feedback to answer choice a (the correct answer):

Form follows function. Which story you want the portfolio to tell (its purpose) is the first determiner of what its contents will be.

Feedback to answer choice b (a misconception):

Form follows function. Which story you want the portfolio to tell (its purpose) is the first determiner of what its contents will be. From that decision point, you can decide which learning targets will be the focus of the story.

Feedback to answer choice c (an instance of incomplete understanding):

Form follows function. Which story you want the portfolio to tell (its purpose) is the first determiner of what its contents will be. For some types of portfolios, the teacher will provide guidelines for student choices.

Feedback to answer choice d (an instance of incomplete understanding):

Form follows function. Which story you want the portfolio to tell (its purpose) is the first determiner of what its contents will be. The student’s best work is included in an achievement status portfolio, but not always in the other types. 
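An answer key built this way is just a lookup from choice to feedback text, paired with the student’s “How do you know your answer is correct?” explanation. A minimal sketch using the Question #1 feedback above (the function name and return structure are hypothetical):

```python
# Feedback for every choice on item 1, condensed from the handout's wording.
FEEDBACK_Q1 = {
    "a": "Correct. A portfolio is simply a repository of artifacts chosen "
         "purposefully to tell a story.",
    "b": "Portfolios are not an assessment method. A portfolio is simply a "
         "repository of artifacts chosen purposefully to tell a story.",
    "c": "One purpose for portfolios is to showcase important accomplishments, "
         "but that is not the definition.",
    "d": "One purpose for portfolios is to demonstrate proficiency, but that "
         "is not the definition.",
}

def respond(choice, reasoning=""):
    """Pair the student's 'How do you know?' explanation with keyed feedback."""
    return {"feedback": FEEDBACK_Q1[choice], "reasoning": reasoning}

r = respond("b", "A portfolio assesses products and performances.")
print(r["feedback"])
```

Keeping the student’s reasoning alongside the keyed feedback is what lets the teacher sort the error into incomplete understanding, flaw in reasoning, or misconception.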


Creating an Item with Instructional Traction Using an Item Formula

Source: Chappuis, J. 2015. Seven Strategies of Assessment for Learning, 2e. Upper Saddle River, NJ: Pearson, page 216. 

   


Marching Off to War

From The Boys’ War by Jim Murphy

A. The excitement of enlisting was soon replaced by the reality of serving. A sad and ironic truth about the Civil War was that neither side had expected the disagreements to turn into actual fighting. Both were certain that another compromise could be worked out. As a result, neither side had an army in place or the arms or materials to outfit one.

B. The North was in a slightly better position at the start. It had a strong industrial base from which to produce materials and it had a standing army and arms. The trouble was that it would take almost a full year before its manufacturing plants could retool to supply what the army needed. As for the army itself, it numbered only sixteen thousand when the war began, and many of its officers and soldiers would resign to fight for their Southern states. And while the North did have a sizeable number of cannon and rifles, a surprising proportion of these weapons dated back to the Revolutionary War!

C. Early in the war, recruits often found themselves marching in their street clothes and using wooden guns and swords and even cornstalks for training. A lucky unit might find itself outfitted by the proud citizens of its town. This, however, produced a rainbow of uniform colors and styles on both sides. One regiment called itself the Highlands and proudly marched off to battle wearing kilts. The most outlandish uniforms were patterned after those worn by a celebrated French fighting unit known as the Zouaves and consisted of baggy red breeches, purple blouses, and red fezzes.

D. The South’s economic power lay in its production of cotton and not in manufacturing. When Lincoln ordered a complete blockade of all southern ports, problems in the South multiplied. No wonder recruiting posters made it very clear that “the volunteers shall furnish their own clothes.” The result was a hodgepodge of colors for both armies.

E. When the two sides gathered during the first months of the fighting, it looked like a hastily assembled circus parade and not two serious armies. Worse, the variety of uniforms often led to fatal mistakes. With smoke blurring vision and emotions running high, inexperienced troops often fired at anyone wearing the enemy’s color.

F. At least one battle was decided by the color of the uniforms. During the Battle of Bull Run in July 1861, Union artillery was ripping apart the Confederate lines from the top of Henry House Hill. When the blue-uniformed soldiers emerged from the right, Union officers took them to be friendly infantry and did not fire on them. The blue-clad arrivals (who were really Confederate troops from Virginia) marched up and routed the Union forces without much opposition.

G. Deciding on a uniform design and color could be done quickly. But manufacturing uniforms for hundreds of thousands of soldiers would take a great deal of time. Before the war, every soldier had to be measured and the cloth for his uniform cut and sewed by hand. Obviously, outfitting two armies this way would take months and months. To speed things up, manufacturers were given a series of graduated standard measurements for uniforms and shoes, and soon they were mass-producing these items by the thousands. This concept of sizes would be used in the production of civilian clothes after the war.


  Item Formula for “Identify Cause and Effect”  

   

Activity Directions:

Work with a partner to craft a multiple choice item to assess “Identify cause and effect.”

Our Question

Our Answer Choices

Right answer:

Distractor:

Distractor:

Distractor:

 


Instruction and Practice with Multiple-Choice Items

Item Formula for “Infer”

Partner Activity

 

 


Item Formula for “Make Generalizations” 

 

Example of Activity with Item Created from the Item Formula 


Source: Chappuis, J., R. Stiggins, S. Chappuis, & Arter, J. 2012. Classroom Assessment for Student Learning: Doing It Right—Using It Well, 2e. Upper Saddle River, NJ: Pearson, p. 146.    


Rubrics with Instructional Traction  

Both general and task‐specific rubrics can be used diagnostically, with a few cautions. 

Task‐Specific Rubrics. Task‐specific rubrics cannot be handed out to students in advance, because they tell students exactly what to do and not do, thus negating the accuracy of the item in determining level of understanding. However, if constructed correctly, they can work well to provide information about misconceptions and flaws in reasoning. Task‐specific rubrics are usually holistic in structure (having only one scoring scale) because the learning targets they are intended to assess are not usually complex enough to require an analytical structure.

Task‐specific Rubric Used Diagnostically 

Level 3:  

The response shows that the earth turns so that one side is facing the sun during the day and away from the sun at night. 

Level 2:  

The response shows the sun on one side of the earth and the moon on the other side of the earth. It also shows that the earth turns to face the sun during the day and to face the moon during the night. It does not show the sun moving relative to the earth. 

Level 1:  

The response shows that the sun moves relative to the earth.

General Rubrics. A general rubric is one that describes features of quality for a given learning target in such a way that it can be used across all tasks you may assign for the given content standard. For a general rubric to have instructional traction, it must do a good job of describing strengths and weaknesses.

As you may have experienced, not all rubrics are suited to formative use. Three essential characteristics make a rubric work formatively: (1) it is descriptive of quality; (2) it is generalizable across tasks, if students are also going to use it; and (3) it is analytic rather than holistic in structure, if it describes a complex learning target.


 


Rubrics That Are Descriptive of Quality

To be used as an assessment for learning tool, a rubric must be diagnostic: it must describe strengths and weaknesses. Rubrics that use descriptive, rather than evaluative or quantitative, language generally do a far better job of yielding diagnostic information. To illustrate this, we will look at three versions of a rubric for Display of Information, one criterion in a rubric for science investigation. Only the last version approximates a true description of quality; the first two versions are examples of what not to do.

Version 1: Evaluative Language for “Display of Information”

4:  Excellent display of information
3:  Good display of information
2:  Fair display of information
1:  Poor display of information

 Version 2: Quantitative Language for “Display of Information” 

4:  Displays four pieces of information
3:  Displays three pieces of information
2:  Displays two pieces of information
1:  Displays one piece of information

 Version 3: Descriptive Language for “Display of Information” 

4:  Display of information is accurate, labeled correctly, complete, and organized so that it is easy to interpret. 

3:  Display of information is accurate and labeled correctly. It has one or both of the following problems: it is mostly complete with a small amount of missing information, and/or its organization causes some difficulty in interpretation. 

2:  Display of information is partially accurate and may have some labels missing. It may be only partially complete and may have significant organization problems. 

1:  Display of information is inaccurate, mislabeled (or absent labels), incomplete, and difficult to interpret due to organization problems. 

   


Example of a General Rubric That Is Analytic in Structure and Descriptive of Quality 

 


Oral Presentation Rubric—Criterion 3: DELIVERY 

 

 5:  Strong 

I maintained eye contact with the audience throughout my speech. 

My voice was loud enough for the audience to hear.  

I varied my voice level and intonation to emphasize meaning. 

I articulated clearly so the audience was able to understand every word. 

I spoke at a pace that kept the audience engaged without racing through my speech. 

I avoided repeatedly using “filler” words between my ideas (e.g., “and,” “uh,” “um,” “you know,” “like,” “well”). 

I used gestures and movement to enhance the meaning of my words.  

I knew my speech well enough so that I could just glance at my notes to help me remember what to say.  

If I used visual aids or props, they helped make my meaning clearer.  

 3:  Part‐Way There 

I made eye contact with my audience part of the time. Or, I only made eye contact with a few people in the audience and I forgot to look around at everyone. 

My voice was loud enough for the audience to hear part of the time, but it also was too quiet at times.  

I varied my voice level and intonation a few times to emphasize meaning, but I may have spoken in a monotone part of the time, too. 

I articulated clearly some of the time, but some of the time I mumbled. 

I spoke at a fairly good pace, but there were times when I spoke too quickly. 

Sometimes I used “filler” words between my ideas (e.g., “and,” “uh,” “um,” “you know,” “like,” “well”). 

My gestures and movement might have been a little stiff or unnatural, but they didn’t distract the audience from the meaning of my presentation. 

I gave parts of my presentation without having to read my notes, but had to read them quite a bit in places. 

If I used visual aids, they were understandable, but they may not have added much to my meaning.  

 1:  Just Beginning 

I had a hard time making eye contact with my audience. I mostly looked up, away, or down.  

My voice was too quiet for everyone to hear me.  

I may have spoken in a monotone, with no variance in intonation. Or I may have tried to vary my voice level and intonation on certain words, but I wasn’t sure which ones to emphasize. 

I mumbled frequently, so the audience had a hard time understanding what I was saying. 

I had a hard time with the speed of my talking—I either raced or dragged through my presentation. 

I used a lot of “filler” words between my ideas (e.g., “and,” “uh,” “um,” “you know,” “like,” “well”). 

My gestures and movement seemed stiff or unnatural, or I moved around so much it distracted the audience from the meaning of my presentation. 

I had to read my notes for most or all of my presentation. 

If I used visual aids, they were confusing. I wasn’t sure how to explain them or how to link them to the ideas I was talking about. 

  Source: The complete Oral Presentation Scoring Rubric can be found on the DVD accompanying Seven Strategies of Assessment for Learning, 2e.