
Page 1: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Assessment Workshop I

Creating and Evaluating High Quality Assessments

Dr. Deborah Brady

Page 2: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Do Now

Good Morning!
- Please make sure you sign in (table at left with the printer).
- Look over the 2 handouts: the PowerPoint and the Agenda/Handout.
- Sit with team members if possible.
- Please set your cell phones to vibrate.
- Coffee and… are at the table at the back; help yourselves.

Thank you, Central Mass Readiness Center and Tahanto District.

Page 3: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Norms

- Processing partners
- Movement
- Exit slips
- Planning the next class
- Questions, resources
- Deborah Brady, [email protected]

Page 4: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Information Overload Ahead!

Page 5: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Agenda

I. Introductions: Overview
   - Break at about 10:00, lunch at about 11:30, session ends at about 3:00
   - Morning presentation (with frequent processing breaks) and afternoon time for beginning to plan
II. High quality assessments (DESE criteria)
   - Tools to evaluate assessments
   - Tools to track all educators' DDMs: Quality Tracking Tool and Educator Alignment Tool
III. Measuring student growth
   - Direct measures
     - Local alternatives to determine growth: pre-/post-test, holistic rubrics, measures over time, post-test only
     - "Standardization" is an alternative, but not required
   - Indirect measures
IV. Piloting, preparing for full implementation in SY 2015
V. TIME to work

Page 6: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Where Are You in This Journey? "Living" Likert Scale

Developing Assessments
• Adapting present assessments
• Creating new assessments
• Writing to text?

Assessing Quality
• Alignment of content
• Rigorous and appropriate expectations
• Plus

Piloting
• Security
• Calibration of standards
• Rubric quality
• Analysis of results: High-Moderate-Low growth

2015 Full Implementation
• 2 DDMs per educator
• Directions for teachers
• Directions for students
• Organizing for the actual assessments
• Storing and tracking the information

Interpreting the Results / Student Impact
• Data storage
• Data analysis
• Low-Moderate-High growth

Page 7: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Carousel Walk / Living Likert Scale

1) CAROUSEL WALK
1. Take a walk past each of the phases in this process.
2. Put a check to the left of each area that you have addressed (even partially).
3. Put an ! next to each bullet/category that you have some concerns about.
4. Put a ? next to any area that seems problematic or is unfamiliar to you.
5. Add a + if you see something missing that is a concern.

2) LIVING LIKERT SCALE
6. After your walk, stand by the stage of DDM development where you (and your team, school, or district) are:

Developing | Assessing Quality | Piloting | Fully Implement | Interpreting the Results

Page 8: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Potential as a Transformative Process

When curriculum, instruction, or assessment is changed, the others are affected as well (Elmore, Instructional Rounds: "the task predicts performance").

[Diagram: Curriculum, Instruction, and Assessment shown as interconnected.]

Page 9: Assessment Workshop I Creating and Evaluating  High Quality Assessments

District Determined Measures

DEFINITION
DDMs are defined as "measures of student learning, growth, and achievement related to the Curriculum Frameworks, that are comparable across grade or subject level district-wide."

TYPES OF MEASURES
- Portfolio assessments
- Approved commercial assessments
- District-developed pre- and post-unit and course assessments
- Capstone projects

Page 10: Assessment Workshop I Creating and Evaluating  High Quality Assessments

The Role of DDMs

To provide educators with an opportunity to:
- Understand student knowledge and learning patterns more clearly
- Broaden the range of what knowledge and skills are assessed and how learning is assessed
- Improve educator practice and student learning
- Receive feedback about their performance with respect to professional practice and student achievement
- Provide evidence of an educator's impact on student learning

Bottom Line: Time to do this is critically important!

Page 11: Assessment Workshop I Creating and Evaluating  High Quality Assessments

District Determined Measures

Regulations: Every educator will need data from at least 2 different measures.
- Trends must be measured over a course of at least 2 years.
- One measure must be taken from state-wide testing data such as MCAS, if available (grades 4-8 ELA and Math SGP for classroom educators).
- One measure must be taken from at least one District Determined Measure, which can include Galileo or normed assessments (DRA, MAP, SAT).

Page 12: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Timeline: The Development of DDMs

2013-2014: District-wide training, development of assessments, and pilot
2014-2015: All educators must have 2 DDMs in place and collect the first year's data
2015-2016: Second-year data is collected and all educators receive an impact rating that is sent to DESE

Page 13: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Performance & Impact Ratings

Performance RatingRatings are obtained through data collected from observations, walk-throughs and artifacts Exemplary Proficient Needs Improvement Unsatisfactory

4 Standards plus 2 Goals

Impact RatingRatings are based on trends and patterns in student learning, growth and achievement over a period of at least 2 years of data gathered from DDM’s and State-wide testing High Moderate Low

Page 14: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Student Impact Rating Determines Plan Duration for PTS Educators
(not future employment)

Summative Performance Rating (rows) by Rating of Impact on Student Learning (Low / Moderate / High):

- Exemplary: 1-year Self-Directed Growth Plan (Low impact); 2-year Self-Directed Growth Plan (Moderate or High impact)
- Proficient: 1-year Self-Directed Growth Plan (Low impact); 2-year Self-Directed Growth Plan (Moderate or High impact)
- Needs Improvement: Directed Growth Plan
- Unsatisfactory: Improvement Plan

Massachusetts Department of Elementary and Secondary Education

Page 15: Assessment Workshop I Creating and Evaluating  High Quality Assessments

What kinds of assessments will work for administrators, guidance, nurses, and school psychologists?

- Use school-wide growth measures.
- Use MCAS growth measures and extend them to all educators in a school.
- Use "indirect measures" such as dropout rates, attendance, etc., as measures.
- Use Student Learning Objectives (SLOs).
- Or create measures: pre- and post-tests are generally required to measure growth, except with normed assessments.

Page 16: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Indirect Measures

Indirect measures of student learning, growth, or achievement provide information about students from means other than student work.

These measures may include student record information (e.g., grades, attendance or tardiness records, or other data related to student growth or achievement such as high school graduation or college enrollment rates).

To be considered for use as DDMs, a link (relationship) between indirect measures and student growth or achievement must be established.

For some educators such as district administrators and guidance counselors, it may be appropriate to use one indirect measure of student learning along with other direct measures;

ESE recommends that at least one of the measures used to determine each educator’s student impact rating be a direct measure.

Page 17: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Indirect Measure Examples

Consider the Student Support Team (SST) process for a team:
- High school SST team example: increase in-depth studies
- Child Study Team example: make the process consistent district-wide
- RTI team example: follow the referral process
- High school guidance example
- Subgroups of students can be studied (School Psychologist group example): school anxiety
- Social-emotional growth is appropriate (Autistic/Behavioral Program example): saying hello
  - Number of times each student says hello to a non-classroom adult on his or her way to gym or class
  - Number of days (or classes) a student with school anxiety participates
- Assess level of participation in a class ("spot-check," for example, every Friday for 15 minutes)
- Increase applications to college
- IEP goals can be used as long as they are measuring growth (academic or social-emotional)

Page 18: Assessment Workshop I Creating and Evaluating  High Quality Assessments

GROWTH SCORES for Educators Will Need to Be Tabulated for All Locally Developed Assessments

[Chart: sample student results paired with student growth percentiles, e.g., 244 (25 SGP), 230 (35 SGP), 225 (92 SGP). MCAS SGP (for students) is used in this example.]
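The tabulation itself can be simple. Below is a minimal Python sketch of one way a district might turn student growth scores into a Low/Moderate/High category for each educator; the scores, the use of the median, and the cut points are illustrative assumptions, not DESE requirements.

```python
from statistics import median

# Hypothetical student growth scores (for example, post-test minus pre-test,
# or a locally computed growth percentile), grouped by educator MEPID.
# All values here are illustrative.
growth_by_educator = {
    "07350": [12, 5, 18, 9, 22, 7],
    "07352": [3, 1, 6, 4, 2, 5],
    "07353": [15, 20, 11, 25, 18, 30],
}

# Assumed local cut points applied to the educator's median student growth.
# A district would set its own cut points after examining the results.
LOW_CUT = 5
HIGH_CUT = 15

def impact_category(scores):
    """Label an educator's student growth Low, Moderate, or High
    from the median of the students' growth scores."""
    m = median(scores)
    if m < LOW_CUT:
        return "Low"
    if m > HIGH_CUT:
        return "High"
    return "Moderate"

for mepid, scores in growth_by_educator.items():
    print(mepid, impact_category(scores))
```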

Page 19: Assessment Workshop I Creating and Evaluating  High Quality Assessments

What are the requirements?

1. Is the measure aligned to content?
- Does it assess what is most important for students to learn and be able to do?
- Does it assess what the educators intend to teach?

Bottom Line: "substantial" content of the course
- At least 2 standards (ELA: reading/writing; Math: a unit exam)
- Not necessarily a "final" exam (unless it is a high-quality exam)

Page 20: Assessment Workshop I Creating and Evaluating  High Quality Assessments

2. Is the measure informative?

- Do the results of the measure inform educators about curriculum, instruction, and practice?
- Does it provide valuable information to educators about their students?
- Does it provide valuable information to schools and districts about their educators?

Bottom Line: Time to analyze is essential.

Page 21: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Five Considerations (DESE)

1. Measure growth

2. Employ a common administration procedure 

3. Use a common scoring process

4. Translate these assessments to an Impact Rating

5. Assure comparability of assessments (rigor, validity).


Page 22: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Comparability

- Comparable within a grade, subject, or course across schools within a district: identical measures are recommended across a grade, department, or course.
- Comparable across grade or subject level district-wide: Impact Ratings should have a consistent meaning across educators; therefore, DDMs should not have significantly different levels of rigor.

Page 23: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Two Considerations for Local DDMs

1. Comparable across schools
- Where possible, measures are identical; identical measures are easier to compare.
- Do identical measures provide meaningful information about all students?
- Exceptions: When might assessments not be identical?
  - Different content (different sections of Algebra I)
  - Differences in untested skills (reading and writing on a math test for ELL students)
  - Other accommodations (fewer questions for students who need more time)

NOTE: Roster verification and group size will be considerations by DESE.

Page 24: Assessment Workshop I Creating and Evaluating  High Quality Assessments

"Common Sense"

The purpose of DDMs is to assess teacher impact.
- The student scores and the Low, Moderate, and High growth rankings are totally internal.
- DESE (in two years) will see MEPIDs with L, M, or H next to each MEPID.

The important part of this process needs to be the focus:
- Your discussions about student learning with colleagues
- Your discussions about student learning with your evaluator
- An ongoing process

Page 25: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Writing to Text and PARCC: The Next Step?

The 2011 MA Frameworks shift to the Common Core:
- Complex texts
- Complex tasks
- Multiple texts
- Increased writing

A giant step? An increase in cognitive load.
- Mass Model Units: PBL with performance-based assessments (CEPAs)
- PARCC assessments require matching multiple texts

Page 26: Assessment Workshop I Creating and Evaluating  High Quality Assessments

2. Comparable across the district

- Aligned to your curriculum (comparable content), K-12 in all disciplines
- Appropriate for your students
- Aligned to your district's content
- Informative and useful to teachers and administrators

"Substantial" assessments (comparable rigor):
- "Substantial" units with multiple standards and/or concepts assessed (DESE recently began describing finals/midterms as preferable)
- Quarterly exams, benchmarks, mid-terms, and common end-of-year exams
- See the Core Curriculum Objectives (CCOs) on the DESE website if you are concerned: http://www.doe.mass.edu/edeval/ddm/example/

NOTE: All of this data stays in your district. Only H/M/L goes to DESE, with a MEPID for each educator.

Page 27: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Approaches to Measuring Student Growth

- Pre-Test/Post-Test
- Repeated Measures
- Holistic Evaluation
- Post-Test Only

Page 28: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Pre/Post Test

Description: The same or similar assessments administered at the beginning and at the end of the course or year.

Example: A grade 10 ELA writing assessment aligned to the College and Career Readiness Standards, given at the beginning and end of the year with the passages changed.

Measuring Growth: Difference between pre- and post-test.

Considerations: Do all students have an equal chance of demonstrating growth?
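As a concrete illustration of the pre/post approach and of the "equal chance to grow" consideration, here is a minimal Python sketch; the student names, scores, maximum score, and gain definition are all assumptions for illustration, not part of the DESE guidance.

```python
# Minimal sketch of pre/post growth for one class. Student names, scores,
# the maximum score, and the gain calculation are illustrative assumptions.
pre_scores = {"student_a": 40, "student_b": 72, "student_c": 55}
post_scores = {"student_a": 58, "student_b": 80, "student_c": 70}
MAX_SCORE = 100

for student, pre in pre_scores.items():
    post = post_scores[student]
    gain = post - pre            # growth: difference between post- and pre-test
    headroom = MAX_SCORE - pre   # how much room the student had to grow
    print(f"{student}: gain = {gain} out of a possible {headroom}")
    # A student who starts near the top has little headroom (a ceiling effect),
    # which is why the slide asks whether all students have an equal chance
    # of demonstrating growth on this measure.
```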

Page 29: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Repeated Measures

Description: Multiple assessments given throughout the year. Examples: running records, attendance, the mile run.

Measuring Growth: Graphically, ranging from the sophisticated to the simple.

Considerations: Less pressure on each administration. Authentic tasks.

Page 30: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Repeated Measures Example: Running Record

[Chart: "Running Record Error Rate," plotting the number of errors (0 to 70) against the date of administration (9/24/2012 through 4/28/2013), with trend lines illustrating Low, Moderate, and High Growth.]
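For a repeated measure like this one, growth can be summarized by the trend across administrations. The Python sketch below fits a simple least-squares slope to made-up running-record error counts and labels the result; the dates, values, and cut points are illustrative assumptions, not part of the DESE guidance.

```python
from datetime import date

# Illustrative running-record error counts across several administrations.
# The dates and values are made up for this sketch.
records = [
    (date(2012, 9, 24), 62),
    (date(2012, 11, 8), 50),
    (date(2013, 1, 10), 41),
    (date(2013, 3, 5), 30),
    (date(2013, 4, 28), 22),
]

# Express each administration as days since the first one, then fit a
# least-squares slope (errors per day; negative means errors are dropping).
x = [(d - records[0][0]).days for d, _ in records]
y = [errors for _, errors in records]
n = len(records)
mean_x, mean_y = sum(x) / n, sum(y) / n
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))

# Assumed cut points on the rate of improvement (errors per month).
errors_per_month = slope * 30
if errors_per_month <= -6:
    category = "High growth"
elif errors_per_month <= -3:
    category = "Moderate growth"
else:
    category = "Low growth"
print(f"{errors_per_month:.1f} errors/month change -> {category}")
```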

Page 31: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Holistic

Description: Assess growth across student work collected throughout the year.

Example: Tennessee Arts Growth Measure System

Measuring Growth: Growth rubric (see example)

Considerations: An option for multifaceted performance assessments; rating can be challenging and time-consuming.

Page 32: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Holistic Example: Growth Rubric for "Details"

Level 1: No improvement in the level of detail (one is true)
* No new details across versions
* New details are added, but not included in future versions
* A few new details are added that are not relevant, accurate, or meaningful

Level 2: Modest improvement in the level of detail (one is true)
* There are a few details included across all versions
* Many added details are included, but they are not included consistently, or none are improved or elaborated upon
* There are many added details, but several are not relevant, accurate, or meaningful

Level 3: Considerable improvement in the level of detail (all are true)
* There are many examples of added details across all versions
* There is at least one example of a detail that is improved or elaborated in future versions
* Details are consistently included in future versions
* The added details reflect relevant and meaningful additions

Level 4: Outstanding improvement in the level of detail (all are true)
* On average, there are multiple details added across every version
* There are multiple examples of details that build and elaborate on previous versions
* The added details reflect the most relevant and meaningful additions

Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts

Page 33: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Post-Test Only

Description: A single assessment, or data that is paired with other information.

Example: AP exam

Measuring Growth (where possible): Use a baseline, or assume an equal beginning.

Considerations: May be the only option for some indirect measures. What is the quality of the baseline information?

Page 34: Assessment Workshop I Creating and Evaluating  High Quality Assessments

MCAS Has 2 Holistic Rubrics

Topic/Idea Development (scored 6 to 1):
6: Rich topic/idea development; careful, subtle organization; effective, rich use of language
5: Full topic/idea development; logical organization; strong details; appropriate use of language
4: Moderate topic/idea development and organization; adequate, relevant details; some variety in language
3: Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
2: Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
1: Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task

Standard English Conventions (scored 4 to 1):
4: Control of sentence structure, grammar, usage, and mechanics (the length and complexity of the essay provide opportunity for the student to show control of standard English conventions)
3: Errors do not interfere with communication, and/or few errors relative to the length of the essay or the complexity of sentence structure, grammar and usage, and mechanics
2: Errors interfere somewhat with communication, and/or too many errors relative to the length of the essay or the complexity of sentence structure, grammar and usage, and mechanics
1: Errors seriously interfere with communication AND little control of sentence structure, grammar and usage, and mechanics

Page 35: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Post-Test Only: A Challenge to Tabulate Growth

- Portfolios: measuring achievement vs. growth
- Unit assessments: looking at growth across a series
- Capstone projects: may be a very strong measure of achievement

Page 36: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Selecting DDMs: "Borrow, Buy, or Build"

PRIORITY: Use the Quality Tool to assess each potential DDM to pilot this year for your school (one district final copy on a computer).
- CCOs will help if this is a district-developed tool.
- If there is additional time, use the Educator Alignment Tool to begin looking at developing 2 assessments for all educators for next year.

Page 37: Assessment Workshop I Creating and Evaluating  High Quality Assessments

"Tools" to Support the Process

- For determining what is important: Core Curriculum Objectives (CCOs)
- For determining adequacy for use as a DDM: Quality Tool
- "Shifts" of the Common Core: examples and rubrics
- For making sure each educator has 2 DDMs: Educator Alignment Tool
- For assessing rigor: Cognitive Complexity Rubric, CEPA Rubric

Page 38: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Checklist Tracker: Assessment Quality Checklist Tool

Grade and Subject or Course: _____________________
Potential DDM Name: _____________________________
Potential DDM Source: Developed within district / From another district (indicate which one) / Commercial (indicate publisher)
Type of assessment: On-demand (specific time for administration) / Performance or project / Portfolio / Hybrid / Other
Item types: Selected response (multiple choice) / Constructed response (written, oral) / Performance or portfolio / Two or more / Other
Alignment to curriculum: Well-aligned / Moderately aligned / Poorly aligned / Not yet aligned
Alignment to intended rigor: Well-aligned / Moderately aligned / Poorly aligned / Not yet aligned

Page 39: Assessment Workshop I Creating and Evaluating  High Quality Assessments

MCAS and PARCC: The Curriculum/Assessment Shifts

MCAS ORQs:
- Math: application of concepts
- ELA: ONLY comprehension, not writing quality
- Personal narrative, persuasive essay, literary analysis of any novel
- MC questions, some application
- Emphasis on content

PARCC shifts to the Common Core:
- MC at a MUCH HIGHER cognitive level
- All writing is assessed as writing (unlike ORQs)
- NEW text types, with writing at a far higher level: narratives, informational text, arguments
- Math: processes and depth of understanding, beyond application
- Emphasis on content plus literacy in ELA, math, social sciences, science, and technology

Page 40: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Critically Important! 1) Rigor and 2) Alignment to Curriculum

Rigorous (2011 Massachusetts Frameworks, Common Core shifts):
- Complex texts
- Complex tasks
- Writing to text
- Shift in persuasive essay (formal argument)
- Shift in narrative (more substantial and linked to content)
- Shift in informational text (organization, substantiation)
- Math, Science, and History/Social Studies frameworks

Aligned to district curriculum:
- Shifted to new expectations
- Shifted from MCAS expectations
- Consider PARCC; this is a district decision
- Gradual increments? Giant steps?

Page 41: Assessment Workshop I Creating and Evaluating  High Quality Assessments
Page 42: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Understanding the Research Simulation Task

• Students begin by reading an anchor text that introduces the topic.
• EBSR and TECR items ask students to gather key details about the passage to support their understanding.
• Students read two additional sources and answer a few questions about each text to learn more about the topic, so they are ready to write the final essay and to show their reading comprehension.
• Finally, students mirror the research process by synthesizing their understanding of the sources into a piece of writing that uses textual evidence from them.

Page 43: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Grade 10 Prose Constructed-Response Item

Use what you have learned from reading "Daedalus and Icarus" by Ovid and "To a Friend Whose Work Has Come to Triumph" by Anne Sexton to write an essay that provides an analysis of how Sexton transforms Daedalus and Icarus.

As a starting point, you may want to consider what is emphasized, absent, or different in the two texts, but feel free to develop your own focus for analysis. Develop your essay by providing textual evidence from both texts. Be sure to follow the conventions of standard English.

Thus, both comprehension of the two texts and the author's craft are being assessed, along with the student's ability to craft a clear argument with substantiation from two texts.

Page 44: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Texts Worth Reading?

Range: An example of assessing reading across the disciplines and helping to satisfy the 70%-30% split of informational text to literature at the 9-11 grade band. (Note: Although the split is 70%-30% in grades 9-11, disciplines such as social studies and science focus almost solely on informational text. English Language Arts teachers will have more of a 50%-50% split between informational and literary text, with informational text including literary non-fiction such as memoirs and biographies.)

Quality: The texts in this set about Abigail Adams represent content-rich nonfiction on a topic that is historically significant.

Complexity: Quantitatively and qualitatively, the passages have been validated and deemed suitable for use at grade 11.

Page 45: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Text Types, Their Shifts with the Common Core, and Rubrics for Each

CC rubric links (the same for all three text types):
http://www.doe.k12.de.us/aab/English_Language_Arts/writing_rubrics.shtml
or The 1.0 Guidebook to LDC
or http://www.parcconline.org/samples/english-language-artsliteracy/grades-6-11-generic-rubrics-draft

Narrative
- Shift with the Common Core: no longer a personal story; content-bearing
- Essential elements: bears content; story elements support content

Informational Text
- Shift with the Common Core: content-area articles, non-fiction, biography, even literary historical texts
- Essential elements: provides information; many genres (scientific article, feature story, biography, speech)

Argument
- Shift with the Common Core: not a persuasive essay with one voice, but a more academically balanced, multiple-perspective piece, still with claims and evidence by the writer
- Essential elements: balanced presentation of multiple points of view; claims/evidence; citations

Page 46: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Shifted Analytical Writing

- Claims
- Evidence
- Use of textual evidence
- Multiple perspectives

Page 47: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Template for the Argument from They Say/I Say

Columns: They Say (major claims, quoted) | I Say (what does this mean?) | Your analysis as it connects to the thesis of the paper

Template: "The character says..." | "This means... / More simply, this means..." | Connecting what they say to a paragraph; connecting your interpretation to a paragraph

Example:
- They Say: When Sydney Carton says, "It is a far, far better…."
- I Say: He is declaring that his sacrifice is something new for him, and this martyrdom will bring him to a better place, his own resurrection, than he has ever experienced in his corrupt life before this final act.
- Analysis: The hope that Dickens sees for social justice is shown in Carton's selfless act to save Darnay.
- Woven into a paragraph: In A Tale of Two Cities, Dickens uses the characters to represent the corruption and the hope for social justice in England and France. The final chapter shows the hope that Dickens sees despite the corruption. When Sydney Carton says, "It's a far, far….known" ( ), he symbolizes the possibilities for reform and redemption. Carton is declaring that his sacrifice is new for him and that he will find a better place, his own resurrection, than he has ever experienced in his corrupt life.

Page 48: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Templates to Scaffold a Smoothly Written Analysis or Argument (James Burke)

They Say
- What others say about this claim and topic
- Quoted appropriately
- Cited appropriately
- Worked into the whole essay smoothly

I Say
- I make a claim for the whole argument.
- I explain what "they say."
- I am responsible for organizing the claims, the evidence, and my explanations.
- I am responsible for making links between/among the sources using transitional sentences and transitional words: "In contrast,…" "Like…" "Somewhat similar to…"

Page 49: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Shifted Informational/Explanatory Writing

- Conveys information accurately
- Serves one or more of the following purposes: to increase a reader's knowledge about the subject, to help readers understand a procedure or process, or to provide readers with an enhanced comprehension of a concept

(Common Core Appendix A, p. 23)

Page 50: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Shifted Narrative Examples: In the Service of Information

- Science: Read an article and retell the story from the perspective of the scientist who disagreed with the evidence.
- Math: Look at the solution to this problem, which contains some errors. Create a dialogue between you and this student, as in a peer discussion, in which you tell the peer what is good about the work and what he needs to do to improve it.
- History: Read the newspaper article written during Lincoln's time by one of his rivals. Write a narrative of a meeting between this person and President Lincoln in which Lincoln answers some of the rival's objections to his policy, based upon the information in the Gettysburg Address.

Page 51: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Delaware Rubrics
http://www.doe.k12.de.us/aab/English_Language_Arts/writing_rubrics.shtml

- K-12 Argument Rubrics
- K-12 Informational Writing Rubrics
- K-12 Narrative Writing Rubrics

NOTE:
- A holistic rubric allows faster scoring.
- Multiple criteria provide more points.
- No point system for rubrics is perfect; you'll need to validate the results with student work.

Page 52: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Standards-Based versus Common Core Gatsby Unit

How Great Is Gatsby?
- Living Likert Scale
- Partnered evidence and counter-argument
- Thesis
- Argument with rating (1-10)
- Counter-argument
- Conclusion
- Academic critique

Examining Author's Purpose and Point of View
- Living Likert; partnered evidence gathering
- Individual essay
- Fitzgerald's purpose in writing the novel
- Filtered through Nick's perspective
- Interpreted by movies: scripts, words, images, modifications
- Authentic writing: Rotten Tomatoes

Page 53: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Writing to Text Sample

Classroom Side of Notebook: How Great Was Gatsby?
- Fitzgerald's purpose
- Nick's point of view
- Daisy as dream/foil
- Evidence from the novel
- Three movie versions: select a scene, image, chapter, or a series of scenes. Which of the images portrays Fitzgerald's Gatsby?
- Thesis: 3-5 examples. So what?

Student Side of Notebook: Daisy as distant dream / as foil
- Gatsby is distant and never truly close to his dream, though he doesn't realize it.
- DiCaprio's intensity is too strong for the cool dreamer.

Page 54: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Example of a Strong and a Weak Text Set

Strong Text Set
Anchor Text: Fahrenheit 451, Ray Bradbury
Related Texts:
- "You Have Insulted Me: A Letter," Kurt Vonnegut (Informational)
- "Burning a Book," William Stafford (Poem)
- "The Book Burnings," United States Holocaust Memorial Museum (Informational)
- Excerpts from The Book Thief, Markus Zusak (Appendix B Exemplar)
- "Learning to Read and Write," Frederick Douglass (Informational)
- "Learning to Read," Malcolm X (Informational)
- "Unto My Books So Good to Turn," Emily Dickinson (Poem)
- "The Portable Phonograph," Walter Van Tilburg Clark

Weak Text Set
Anchor Text: Fahrenheit 451, Ray Bradbury
Related Texts:
- "'Chaos:' Gunman Ambushes, Kills Two Firefighters at New York Blaze," Catherine Shoichet and Greg Botelho (CNN) (Informational)
- "Johannes Gutenberg and the Printing Press," Mary Bellis (About.com) (Informational)
- Fahrenheit 451, Francois Truffaut (Film)
- "About Ray Bradbury: Biography" (Informational)
- "The Pedestrian," Ray Bradbury (Literary)
- The Children's Story, James Clavell (Literary)

Page 55: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Other Subjects and Courses

These assessments include both traditionally tested and non-tested grades. Districts may choose to select a DDM that meets the traditionally non-tested grade/subject or course minimum pilot requirement from this collection.

- ELA: ELA Literacy Assessments; ELA CCOs
- Math: Math Assessments; Math CCOs
- History and Social Sciences: History & Social Studies Assessments; History & Social Studies CCOs
- Science and Technology: Science and Technology Assessments; Science and Technology CCOs
- Arts: Arts Literacy Assessments; Arts CCOs
- Foreign Language: Foreign Language Assessments; Foreign Language CCOs
- Comprehensive Health: Comprehensive Health Assessments; Comprehensive Health CCOs
- Communications & Information Sciences and other subjects: Other Subjects Assessments; Other Subjects CCOs

Page 56: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Core Curriculum Objectives (CCOs: partial list for Writing to Text)

1 Students analyze how specific details and events develop or advance a theme, characterization, or plot of a grade 9 literary text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.

2 Students analyze how the structure, syntax, diction, and connotative or figurative meanings of words and phrases inform the central idea or theme of a grade 9 literary text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.

3 Students analyze how specific details, concepts, or events interact to develop or advance a central idea of a grade 9 informational text, and they support their analysis with strong and thorough textual evidence that includes inferences drawn from the text.

4 Students analyze how cumulative word choice, rhetoric, syntax, diction, and the technical, connotative, or figurative meanings of words and phrases support the central idea or author’s purpose of a grade 9 informational text.

5 Students produce clear and coherent writing to craft an argument, in which the development, organization, and style are appropriate to their task, purpose, and audience, using such techniques as the following:

introducing precise claim(s), distinguishing the claim(s) from alternate or opposing claims, and creating an organization that establishes clear relationships among claim(s), counterclaims, reasons, and evidence;

developing claim(s) and counterclaims fairly, supplying evidence for each while pointing out the strengths and limitations of both in a manner that anticipates the audience’s knowledge level and concerns;

using words, phrases, and clauses to link the major sections of the text, create cohesion, and clarify the relationships between claim(s) and reasons, between reasons and evidence, and between claim(s) and counterclaims;

establishing and maintaining a formal style and objective tone while attending to the norms and conventions of the discipline in which they are writing;

providing a concluding statement or section that follows from and supports the argument presented; and

demonstrating command of the conventions of Standard English.

Page 57: Assessment Workshop I Creating and Evaluating  High Quality Assessments

ELA-Literacy, Grade 9 / English 9-12
https://wested.app.box.com/s/pt3e203fcjfg9z8r02si

Assessment: Hudson High School Portfolio Assessment for English Language Arts and Social Studies
Publisher: Website/Sample

Description: Designed to be a measure of student growth over time in high school ELA and social science courses. The student selects work samples to include and uploads them to an electronic site. Includes guiding questions for students and scoring criteria. The scoring rubric for the portfolio can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio. Could be aligned to a number of CCOs, depending on the specification of assignments.

Classification checklist (Traditional Assessment | Non-Traditional Assessment | Administration/Scoring):
- Traditional end-of-grade assessment | Pre/post or repeated measures | Paper/pencil
- Traditional end-of-course assessment | Performance task rubric | Computer supported
- Selected response | Portfolio or work sample rubric | Computer adaptive
- Short constructed response | Project-based rubric | Machine scored
- Writing prompt/essay | Observation rubric or checklist | Scored locally
- Other | | Scored off-site

Page 58: Assessment Workshop I Creating and Evaluating  High Quality Assessments
Page 59: Assessment Workshop I Creating and Evaluating  High Quality Assessments
Page 60: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Other Tools: MA Model Curricula and Rubrics (CEPAs)
(Also, Delaware rubrics for specific text types)

Topic development (the writing and artwork identify the habitat and provide details), scored 1 to 6:
1: Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task
2: Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
3: Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
4: Moderate topic/idea development and organization; adequate, relevant details; some variety in language
5: Full topic/idea development; logical organization; strong details; appropriate use of language
6: Rich topic/idea development; careful and/or subtle organization; effective/rich use of language

Evidence and content accuracy (the writing includes academic vocabulary and characteristics of the animal or habitat, with details), scored 1 to 6:
1: Little or no evidence is included and/or content is inaccurate
2: Use of evidence and content is limited or weak
3: Use of evidence and content is included but is basic and simplistic
4: Use of evidence and accurate content is relevant and adequate
5: Use of evidence and accurate content is logical and appropriate
6: A sophisticated selection and inclusion of evidence and accurate content contribute to an outstanding submission

Artwork (identifies special characteristics of the animal or habitat, to an appropriate level of detail), scored 1 to 6:
1: Artwork does not contribute to the content of the exhibit
2: Artwork demonstrates a limited connection to the content (describing a habitat)
3: Artwork is basically connected to the content and contributes to the overall understanding
4: Artwork is connected to the content of the exhibit and contributes to its quality
5: Artwork contributes to the overall content of the exhibit and provides details
6: Artwork adds greatly to the content of the exhibit, providing new insights or understandings

Page 61: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Sample DDMs: Local Digital Portfolio (Hudson, MA)

Buy, Borrow, or Build: each sample DDM is evaluated.

Hudson's evaluation: Designed to be a measure of student growth over time in high school ELA and social science courses. The student selects work samples to include and uploads them to an electronic site. Includes guiding questions for students and scoring criteria. The scoring rubric for the portfolio can be adapted for use in all high school ELA and social science courses. Generalized grading criteria for a portfolio. Could be aligned to a number of CCOs, depending on the specification of assignments. Many are standardized assessments.

Page 62: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Educator Alignment Tool (www.doe.mass.edu/)

Columns: School | MEPID | Last Name | First Name | Grade | Subject | Course | Course ID | Potential DDM 1 | Potential DDM 2 | Potential DDM 3

Example rows:
- HS | 07350 | Smith, Abby | Grade 10 | ELA | Grade 10 ELA | 01051 | MCAS ELA 10: NO
- HS | 07350 | Smith, Abby | Grade 9 | ELA | World Studies | 01058 | Writing to text 9
- HS | 07350 | Smith, Abby | Grade 9 | ELA | Grade 9 ELA | 01051 | Writing to text 9
- HS | 07352 | Smith, Brent | Grade 10 | Math | IMM 2 | (none yet)
- HS | 07352 | Smith, Brent | Grade 10 | Math | IMM 1 | MCAS MATH 10: NO
- HS | 07353 | Smith, Cathy | Grade 11 | Science | Physics (singleton) | (none yet)
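A district tracking sheet like the one above can be checked automatically against the "2 DDMs per educator" requirement. Below is a minimal Python sketch of such a check; the field names and rows are assumptions modeled loosely on the example, not a DESE format.

```python
from collections import defaultdict

# Illustrative roster rows modeled loosely on the alignment-tool example above.
# The field names and values are assumptions for this sketch, not a DESE format.
rows = [
    {"mepid": "07350", "educator": "Abby Smith", "course": "Grade 10 ELA", "ddm": "MCAS ELA 10"},
    {"mepid": "07350", "educator": "Abby Smith", "course": "World Studies", "ddm": "Writing to text 9"},
    {"mepid": "07352", "educator": "Brent Smith", "course": "IMM 1", "ddm": "MCAS Math 10"},
    {"mepid": "07353", "educator": "Cathy Smith", "course": "Physics", "ddm": None},
]

# Collect the distinct potential DDMs identified for each educator.
ddms_by_educator = defaultdict(set)
for row in rows:
    if row["ddm"]:
        ddms_by_educator[row["mepid"]].add(row["ddm"])

# Flag any educator who does not yet have at least 2 potential DDMs listed.
educators = {row["mepid"]: row["educator"] for row in rows}
for mepid, name in educators.items():
    count = len(ddms_by_educator[mepid])
    if count < 2:
        print(f"{name} (MEPID {mepid}): only {count} potential DDM(s) identified so far")
```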

Page 63: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Next Class: Protocols to Use Locally for Inter-Rater Reliability; Looking at Student Work

Possible focus:
- Developing "text sets": resources, online and in texts
- Developing effective rubrics for large-scale assessment
- Developing exemplars
- Calibrating scores
- Looking at Student Work (LASW): http://Nsfharmony.org/protocol/a_z.html
- Sample for developing rubrics from an assessment

Page 64: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Pilot Steps:

1. Prepare to pilot
   - Build your team
   - Identify content to assess
   - Identify the measure (aligned to content, informative)
   - Decide how to administer and score
2. Test
   - Administer
   - Score
3. Analyze
4. Adjust

Page 65: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Where to Begin Today

Quality Checklist Tool / Quality Alignment Tool
- Alignment to content/curriculum
- Alignment to rigor
- If the assessment passes these criteria: then validity and reliability; then instructions, procedures for assessment, etc.

Educator Alignment Tool
- Preparing for the June 1, 2014 report
- All educators: 2 DDMs

Page 66: Assessment Workshop I Creating and Evaluating  High Quality Assessments

Exit Slips: Tentative Topics

- Testing protocols for consistency across grades, teams, departments, and schools
- Protocols to maintain inter-rater reliability (blind assessment)
- Mock assessment using rubric/exemplars
- Rubric quality (I've found this to be a concern when I've looked closely at some assessments.)
- Data organization, analysis, assigning "scores" to teachers
- Determining "cut" scores with local assessments
- Organizing for the June report to DESE (2 assessments per educator)
- Planning for implementation of these many assessments in 2015
- When? Accommodations, windows for assessment, security
- Do singleton teachers score their own assessments?
- Time to develop local protocols and directions

What is your priority? What do you need to be successful?

A pause to remember what we are doing here (how changing assessment positively can bring a positive change in instruction and curriculum).

Page 67: Assessment Workshop I Creating and Evaluating  High Quality Assessments

"Perfect is the enemy of good." (Voltaire)