
Transcript of Part I : Assessment September 14, 2010 PaTTAN Lana Santoro

Page 1: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Pennsylvania Training and Technical Assistance Network

Promising Practices to Improve Reading Performance of Students who are Deaf or Hard of Hearing: Grades 3-6 Pilot Research Project

Part I: Assessment

September 14, 2010PaTTAN

Lana Santoro

Page 2: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Agenda

• Our Research Pilot with Grades 3-6
• Diagnostic Assessment
• Reading Progress Monitoring Assessment
• Depth of Vocabulary Knowledge
• Next Steps

2

Page 3: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Learning Community and Study Group Approach

• Collective Participation
• Shared Inquiry
• Formative Evaluation
• Action Research

7

Page 4: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Sample Information

2008 sample (n=19): K (n=8), Grade 1 (n=4), Grade 2 (n=1), Grade 3 (n=6)
2009 sample (n=25): K (n=7), Grade 1 (n=13), Grade 2 (n=3), Grade 3 (n=2)

Demographics (counts by grade K, 1, 2, 3, with total):
 White (Non-Hispanic)     2008: 7, 3, 1, 5 (16)    2009: 6, 11, 2, 2 (21)
 Hispanic                 2008: 1, 1, 0, 0 (2)     2009: 0, 1, 1, 0 (2)
 Black/African American   2008: 0, 0, 0, 1 (1)     2009: 0, 0, 0, 0 (0)
 Asian/Pacific Islander   2008: 0, 0, 0, 0 (0)     2009: 1, 1, 0, 0 (2)

Gender (counts by grade K, 1, 2, 3, with total):
 Male                     2008: 2, 0, 1, 3 (6)     2009: 2, 2, 0, 1 (5)
 Female                   2008: 6, 4, 0, 3 (13)    2009: 5, 11, 3, 1 (20)

Page 5: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Beginning Reading Skills (Kindergarten): LNF

[Line chart: LNF mean scores, kindergarten (y-axis 0-90), across fall, winter, and spring, comparing the 2008 and 2009 cohorts with the DIBELS goal and a PA school comparison.]

Page 6: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Beginning Reading Skills (First Grade): WIF

[Line chart: WIF mean scores, first grade (y-axis 0-90), across fall, winter, and spring, showing the 2009 cohort against the goal.]

Page 7: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Reading Connected Text - Fluency (First-Third): TPRI

[Charts: TPRI fluency rates at beginning of year (BOY) and end of year (EOY), 2009.
 - First grade: Students 1-13, with mean and goal (y-axis 0-120).
 - Second grade: Students 1-3, with mean and goal (y-axis 0-100).
 - Third grade: Students 1-2, with mean and goal (y-axis 0-140).]

Page 8: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Reading Connected Text - Comprehension (Second): MAZE

Second Grade     Spring Outcome   Gain per Week   Gain per Month
Benchmarks       8                .23             .88
2008             7.33             .20             .84
2009             11.11            NA              NA

[Line chart: MAZE mean scores, second grade (y-axis 0-12), across fall, winter, and spring, showing 2008, 2009, the goal, and slope lines for 2008 and 2009.]

Page 9: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Reading Connected Text - Comprehension (Third): MAZE

Third Grade      Spring Outcome   Gain per Week   Gain per Month
Benchmarks       13               .26             1.12
2008             9.50             .40             1.15
2009             12.00            .28             1.22

[Line chart: MAZE mean scores, third grade (y-axis 0-20), across fall, winter, and spring, showing 2008, 2009, the goal, and slope lines for 2008 and 2009.]

Page 10: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Reading Connected Text - Comprehension (Third): MAZE

Third Grade      Spring Outcome   Gain per Week   Gain per Month
Benchmarks       13               .26             1.12
2008             9.50             .40             1.15
2009             12.00            .28             1.22

[Bar chart: MAZE mean gains, third grade. 2008: 0.40 gain/week, 1.15 gain/month; 2009: 0.28 gain/week, 1.22 gain/month. Goals: .26/week and 1.12/month.]

Page 11: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Reading Simulation

• Find a partner.
• Decide who will be the teacher (examiner) and student.
• Implement the reading assessment as directed.

15

Page 12: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

16

What do Skilled Readers do?

• Skilled readers differ from unskilled readers in “their use of general word knowledge to comprehend text literally as well as to draw valid inferences from texts, in their comprehension of words, and in their use of comprehension monitoring and repair strategies.” (Snow, Burns, & Griffin, 1998, p. 62)

Page 13: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

17

What we Know About the Factors that Impact Reading Comprehension

• Accurate and fluent word reading skills.

• Language skills (receptive and expressive vocabulary, linguistic comprehension)

• Extent of conceptual and factual knowledge

• Knowledge and skill in use of cognitive strategies to improve comprehension or repair it when it breaks down

• Knowledge of text structure and genre

• Reasoning and inferential skills

• Motivation to understand and interest in task and materials

Page 14: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

18

What Reading Does for the Mind

• Reading comprehension requires knowledge - of words and the world.
  - E.D. Hirsch, American Educator (Spring 2003)

Page 15: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

19

Selected Statistics for Major Sources of Spoken and Written Language (Sample Means)
(Adapted from Hayes and Ahrens, 1988)

                                        Rank of Median Word   Rare Words per 1000
PRINTED TEXTS
 Abstracts of scientific articles              4389                 128.0
 Newspapers                                    1690                  68.3
 Popular magazines                             1399                  65.7
 Adult books                                   1058                  52.7
 Comic books                                    867                  53.5
 Children's books                               627                  30.9
 Preschool books                                578                  16.3
TELEVISION TEXTS
 Popular prime-time adult shows                 490                  22.7
 Popular prime-time children's shows            543                  20.2
 Cartoon shows                                  598                  30.8
 Mr. Rogers and Sesame Street                   413                   2.0
ADULT SPEECH
 Expert witness testimony                      1008                  28.4
 College graduates to friends, spouses          496                  17.3

Page 16: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Exposure to Print

• A student in the 20th percentile reads books 0.7 minutes a day.
• This adds up to 21,000 words read per year.
• A student in the 80th percentile reads books 14.2 minutes a day.
• This adds up to 1,146,000 words read per year.

Percentile Rank    Minutes Per Day (Books / Text)    Words Read Per Year (Books / Text)
98                 65.0 / 67.3                       4,358,000 / 4,733,000
90                 21.2 / 33.4                       1,823,000 / 2,357,000
80                 14.2 / 24.6                       1,146,000 / 1,697,000
70                  9.6 / 16.9                         622,000 / 1,168,000
60                  6.5 / 13.1                         432,000 / 722,000
50                  4.6 /  9.2                         282,000 / 601,000
40                  3.2 /  6.2                         200,000 / 421,000
30                  1.8 /  4.3                         106,000 / 251,000
20                  0.7 /  2.4                          21,000 / 134,000
10                  0.1 /  1.0                           8,000 / 51,000
 2                  0   /  0                                 0 / 8,000

20

Page 17: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

What "gap" do we want to close? What do we mean by proficient reading?

• We want students to close the gap and become proficient in reading comprehension.

• “Acquiring meaning from written text” (Gambrell, Block, & Pressley, 2002)

• “the process of extracting and constructing meaning through interaction and involvement with written language” (Sweet & Snow, 2002)

• “thinking guided by print” (Perfetti, 1985)

(Torgesen, 2003)

Page 18: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

22

Thinking Guided by Print

• In middle and high school, reading can be increasingly defined as “thinking guided by print.”

Page 19: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

23

What works for late elementary and intermediate level students?

• Learning to Read
  – Decoding and Word Study ("Phonics")
  – Fluency
• Reading to Learn
  – Comprehension and Vocabulary
  – Content Engagement
  – Content Enhancement

Page 20: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Discussion Guide:

1. What research questions do you have about grade 3-6 students who are deaf or hard of hearing?

2. What are your students' instructional needs and learning challenges with reading?

3. What are your instructional challenges when teaching reading?

4. What are some things that you do instructionally that seem to work well?

24

Page 21: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Assessment

• Assessment tells you what to teach
• Assessment tells you where you're going
• Assessment tells you how to teach and what instructional adjustments to make so teaching is more effective
• Assessment tells you when you get there
• Assessment tells you what to do next
• Assessment is instruction

25

Page 22: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Standards-Aligned

• Diagnostic Assessment
  – Ascertain, prior to instruction, each student's strengths, weaknesses, knowledge, and skills. Using diagnostic assessments enables the instructor to remediate students and adjust the curriculum to meet the student's unique needs.

• Formative Assessment (Progress Monitoring)
  – Allows teachers to monitor and adjust their instructional practice in order to meet the individual needs of their students. The key is how the results are used. The results should be used to shape teaching and learning.

26

Page 23: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Diagnostic Assessment

• Individually administered tests designed to determine specific academic areas of strength and weakness (in some cases a detailed error analysis can be provided)
• Instructional levels can be determined
• Assist with instructional planning and educational goals
• Assist in determining areas for future assessment and progress monitoring

28

Page 24: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

29

Stages of Learning

Entry Level -> Initial Acquisition -> Advanced Acquisition -> Proficiency -> Maintenance -> Generalization -> Problem Solving

(The slide diagram also marks accuracy ranges of 0-25% and 65-80% along the acquisition stages.)

(Rivera & Smith, 1997)

Page 25: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Guidelines for Selecting a Diagnostic Assessment

• Aligned with the core content of reading (“Big Ideas” of Reading)

• Provides grade-level and instructional-level information

• Provides diagnostic/instructional profile, error analysis, helps determine what to teach

• Has documented technical adequacy (reliable and valid)

30

Page 26: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Guidelines for Selecting a Diagnostic Assessment

• Items are developed from specific performance objectives directly linked to an instructional domain

• The score is based on an absolute, not a relative standard

• The test measures mastery by using specific standards

• The focus is criterion-referenced evaluation: what the student can do and cannot do on specific skills and knowledge tasks.

31

Page 27: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

What it looks like. . .

Items Missed: On the Word Attack subtest, the long a-e pattern in nonsense words - gaked, straced; the long i-e pattern in the nonsense word - quiles

Deficit Skill: Decoding words with the long vowel-consonant-silent e pattern

Probe: Decoding words orally to the teacher - cake, make, snake, rate, lake, fake, like, bike, kite

Criterion:
 - Decode 10/10 words for mastery
 - Decode 8/10 to 6/10 words for instructional level
 - Decode 5/10 words or less for failure level; assess prerequisite skill level: discrimination of long/short vowels (vowels: a, i)

32

Page 28: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Test Construction Considerations

Must consider. . .

(1) Which specific tasks should be included

(2) How performance should be judged as mastered or not mastered

33

Page 29: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Setting Standards

(1) Judgment of Test Content

(2) Judgment of Individual Test Takers

(3) Assessing Error Rates

34

Page 30: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Judgment of Test Content

Standard setters (judges) analyze test items to determine how many should be passed to reflect minimal proficiency --consider the “borderline” test taker

35

Page 31: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Discussion Guide:

• Based on the guidelines just presented, consider what diagnostic assessment you could administer to your students.
  – Do you currently use a diagnostic assessment? If so, which one? Does it meet the criteria we just discussed?
  – What are the challenges of using diagnostic assessments with your students?

36

Page 32: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Woodcock Reading Mastery Test-Revised

• Purpose:
  – Norm-referenced assessment that provides diagnostic information for instructional decision making
• Content:
  – Letter identification, word identification, word comprehension (antonyms, synonyms, analogies), general reading vocabulary, science-mathematics vocabulary, social studies vocabulary, humanities vocabulary, passage comprehension
• Evaluation:
  – BOY: October/November
  – EOY: March/April

37

Page 33: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Appropriate for: Grades K-16; ages 5 years 0 months through 75

Time: 10-30 minutes for each cluster of individually administered tests

Page 34: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Reliability

Internal reliability
• Tests: median = .91 (range .68 to .98)
• Clusters: median = .95 (range .87 to .98)
• Total: median = .97 (range .86 to .99)

Page 35: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Total Reading Full Scale

• Readiness
  – Visual Auditory Learning
  – Letter Identification
• Basic Skills
  – Word Identification
  – Word Attack
• Comprehension
  – Word Comprehension
  – Passage Comprehension

Page 36: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Readiness (Form G Only)

• Visual Auditory Learning
  – A task to determine if the student can associate symbols with words
  – Tests memory, attention, grouping of word parts (i.e., "ing" with verbs)
• Letter Identification
  – Alphabet recognition
  – Different fonts
  – Print and cursive

Page 37: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Basic Skills

• Test 3: Word Identification
  – Reading words
  – Begins with one word on a page and advances to multiple words
  – 106 items in increasing difficulty
  – The student does not need to know what any of the words mean
  – Average score for a kindergarten student is 1
  – Average score for a student in 12th grade is 96

Page 38: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Basic Skills

• Test 4: Word Attack
  – Reading two types of words: nonsense words and words with very low frequency usage
  – Measures the ability to apply phonic and structural analytic skills
  – Training is provided so the student will know how to approach the test

Page 39: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Comprehension

• Test 5: Word Comprehension
  – 3 subtests
  – Each begins with sample items
  – Training continues until the student completes the sample item correctly

• Subtest 5A: Antonyms
  – Measures ability to read a word and respond orally with a word opposite in meaning

Page 40: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Comprehension

• Subtest 5B: Synonyms
  – Comprehension of reading vocabulary
  – Read a word and state another word similar in meaning
  – Synonyms are "a more difficult cognitive processing task than Antonyms" (p. 7)

Page 41: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Comprehension

• Subtest 5C: Analogies
  – Read a pair of words, ascertain the relationship, read the first word of the second pair, and use the same relationship to supply a word to complete the analogy
• Demonstrates content-embedded word knowledge

Page 42: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Word Comprehension Reading Vocabularies

• General reading
• Science-mathematics
• Social studies
• Humanities

Page 43: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Comprehension

• Test 6: Passage Comprehension
• Modified cloze procedure
  – Short passage with a blank line
  – Student supplies a word that "fits" in the blank
  – The first 1/3 of the passages are one sentence long and have a picture related to the text

Page 44: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Materials

• Examiner protocol
• Test record booklet
• Stimulus book/easel pages for student
• Clipboard
• Pencil

49

Page 45: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Administration

• Administer:
  – Test 3: Word Identification
  – Test 4: Word Attack (if possible)
  – Test 5A: Word Comprehension (Antonyms subtest)
  – Test 5B: Word Comprehension (Synonyms subtest)
  – Test 5C: Word Comprehension (Analogies subtest)
• The test battery will take an experienced tester about 45 minutes
• Test by complete pages

Page 46: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Basal Rules

• Start at the points indicated in the tables in the test easel.
• If the student is correct on the first 6 items, a basal is established.
• If fewer than 6 are correct, go back a page and administer the whole page.
• Continue to test backwards, starting with the first item on a page, until the first 6 items on a page are correctly answered.

Page 47: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Ceiling Rules

• 6 or more consecutively failed items that end with the last item on a test page.

• See page 22 for an example of basal and ceiling scoring
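To make the basal and ceiling rules above concrete, here is a minimal Python sketch. It is not the publisher's scoring procedure; the page lists and helper names are illustrative, with item scores recorded as 1 (correct) and 0 (incorrect).

```python
# Illustrative sketch of the basal and ceiling rules described on the slides above.

def basal_established(page_scores):
    """Basal rule: the first 6 items on a page are all answered correctly."""
    return len(page_scores) >= 6 and all(s == 1 for s in page_scores[:6])

def find_basal_page(pages, start_index):
    """Test backwards from the suggested starting page until a basal is found;
    stop at the first page if no earlier page qualifies."""
    i = start_index
    while i > 0 and not basal_established(pages[i]):
        i -= 1                      # go back a page and administer the whole page
    return i

def ceiling_reached(page_scores):
    """Ceiling rule: 6 or more consecutively failed items that end with the
    last item on a test page."""
    run = 0
    for s in page_scores:
        run = run + 1 if s == 0 else 0
    return run >= 6                 # the failure run must reach the end of the page

# Example: an error among the first 6 items, so the examiner drops back one page.
pages = [[1, 1, 1, 1, 1, 1, 1], [1, 1, 0, 1, 1, 1]]
print(find_basal_page(pages, start_index=1))   # -> 0
print(ceiling_reached([1, 0, 0, 0, 0, 0, 0]))  # -> True
```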

Page 48: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Word Identification

• MUST know how to pronounce the words in the test (pp. 28-29)
• A table of suggested starting points is provided in the easel
  – If the student does not respond to the first item, score it 0, say the word, and ask the student to repeat it
• NO OTHER WORDS WILL BE READ TO THE STUDENT
• WRITE what the student said for incorrect responses
• Write down any comments the student makes

Page 49: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Word Attack

• If the student scores 0 or 1 on Word Identification, a score of 0 can be recorded for Word Attack
  – (For our practice, don't do this)
• Begin with the 2 sample items; then proceed to item 1
• Study the pronunciation guide (pp. 28-29)
• The student must answer within 5 seconds
• The "word" must be read naturally, not sounded out, for the final reading
• WRITE what the student says

Page 50: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Word Comprehension

• For all three subtests, the student reads the item aloud and responds orally
• Only single-word responses are acceptable
• Mispronunciations are not errors
• WRITE what the student says
• Begin with the practice item in each subtest

Page 51: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Scoring

• Score as you administer the test
• Score 1 or 0 by the item
• Write any comments and erroneous responses
• Raw score is the sum of correct responses, plus 1 point for every item below the basal
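A small illustrative sketch of the raw-score rule above, assuming item scores are recorded as 1/0 for the items actually administered and the count of items below the basal is known:

```python
# Illustrative raw-score computation: correct responses plus one point per item below the basal.

def raw_score(item_scores, items_below_basal):
    return sum(item_scores) + items_below_basal

print(raw_score([1, 1, 1, 1, 1, 1, 0, 1, 0], items_below_basal=24))  # -> 31
```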

Page 52: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Scoring Word Comprehension

• Antonyms and Synonyms combined score
  – Calculate the score for each subtest
  – Add them
  – Convert this raw score to a part score
  – Record in the box labeled 5A+5B part score
  – Convert the Analogies raw score to a part score
  – Sum both part scores for a Word Comprehension W score

Page 53: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Reading Vocabularies

• Count the correct responses for the Test 5 subtests
• The designation for each response is coded on the test record:
  – G: general reading
  – SM: science and math
  – SOC: social studies
  – H: humanities

Page 54: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Levels of Interpretive Information

• Level 1: Analysis of Errors
• Level 2: Level of Development
• Level 3: Quality of Performance
• Level 4: Standing in a Group

59

Page 55: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Level 1: Analysis of Errors

• Individual item responses
• Descriptive of a subject's performance on precisely defined skills

60

Page 56: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Level 2: Level of Development

• Sum of item scores
• Raw score
• Rasch ability score (test W score, subtest part score, cluster W score)
• Grade equivalent
• Age equivalent

61

Page 57: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Age and Grade Calculations

• Age is standard; use the AGS calculator if you wish
• Grade placement is by tenths of the school year
  – See table in the test protocol or on page 32

Page 58: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Chronological Age

                        Years   Months   Days
• Date of Test:          2000     11      22
• Date of Birth:         1992     04      03
• Chronological Age:        8      7      19   ->  8-8

*If the number of days exceeds 15, round up to the nearest month. If the number of days is 15 or less, round down to the nearest month.

63
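The same hand computation as a short Python sketch. This is an illustration of the rounding convention on the slide, not the test publisher's calculator; it assumes a 30-day borrow when days must be borrowed, so check the manual for the exact convention.

```python
# Illustrative chronological-age calculation with the >15-day rounding rule.

def chronological_age(test_ymd, birth_ymd):
    (ty, tm, td), (by, bm, bd) = test_ymd, birth_ymd
    if td < bd:                 # borrow days from the month column (30-day assumption)
        td += 30
        tm -= 1
    if tm < bm:                 # borrow months from the year column
        tm += 12
        ty -= 1
    years, months, days = ty - by, tm - bm, td - bd
    if days > 15:               # round up to the nearest month
        months += 1
        if months == 12:
            years, months = years + 1, 0
    return f"{years}-{months}"

print(chronological_age((2000, 11, 22), (1992, 4, 3)))   # -> 8-8
print(chronological_age((2000, 2, 17), (1990, 8, 5)))    # -> 9-6
```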

Page 59: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Chronological Age

                        Years   Months   Days
• Date of Test:          2000     02      17   (after borrowing: 1999 years, 14 months)
• Date of Birth:         1990     08      05
• Chronological Age:        9      6      12   ->  9-6

*If the number of days exceeds 15, round up to the nearest month. If the number of days is 15 or less, round down to the nearest month.

64

Page 60: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Chronological Age: Your Turn

                        Years   Months   Days
• Date of Test:          2000     05      15
• Date of Birth:         1989     09      24
• Chronological Age:     ____    ____    ____

*If the number of days exceeds 15, round up to the nearest month. If the number of days is 15 or less, round down to the nearest month.

65

Page 61: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Raw Scores

The number of items the student has answered correctly or incorrectly on a given test.

Calculation:

(1) Count the number of correct test items

(2) Divide the number of correct items by the total number of test items to obtain the percentage correct

Page 62: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Raw Scores

When to Use:

• Raw score is a starting point for all norm-referenced scores

• Only appropriate when comparisons to other students or other (nonalternate form) tests are not needed

• Only way raw score can be used is in reference to criterion or individual-referenced evaluations, not norm-referenced ones

• Raw scores can provide a better basis for interpretation when they are summarized as percentages (e.g., Summarizing a raw score as 90% correct is more informative than stating that a student had 75 items correct).

Page 63: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Raw Scores

Advantages and Disadvantages:

• Advantages
  – We can express them as the number or percentage correct
  – Can be used to measure mastery or improvement
• Disadvantages
  – Limited interpretability: it is not possible to use them to compare performance across time, students, tests, or content

Page 64: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Grade- and Age-Equivalent Scores

Grade and age-equivalent scores express the student’s performance developmentally in terms of a corresponding age or grade level

• Usually, age scores are reported in years and months

If Jan, who is 10 years and 3 months old, has an age score of 12-5, her performance theoretically is the same as that of a child who is 12 years and 5 months old.

• Grade scores are reported in grade levels to the nearest tenth, which corresponds to academic months

If Jack, a 4th grader, has a grade equivalent score of 6.1, he is performing at the 6th grade, first month level.

Page 65: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Grade- and Age-Equivalent Scores

When to Use:

• The American Psychological Association (APA) and the National Council on Measurement in Education (NCME) have recommended against the use of grade- and age-equivalent scores in making any educational decisions

Page 66: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Grade- and Age-Equivalent Scores

Advantages and Disadvantages:

Critical disadvantages outweigh their main advantage, simplicity:

(1) Both scores, especially grade scores, are based upon the assumption that learning occurs consistently across the year, which has not been proven

(2) We cannot say with accuracy that a 4th grader with a grade score of 6.1 performs like all 6th graders in their first month

(3) Age- and grade-equivalent scores are not measured in equal interval units; therefore they cannot be added or subtracted to examine improvement

(4) Age- and grade-equivalent scores are often derived by extrapolating and interpolating from normative data

Page 67: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Level 3: Quality of Performance

• Performance on a reference task
• W-difference score (DIFF); instructional range; Relative Performance Index (RPI)

72

Page 68: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Level 4: Standing in a Group

• Deviation from a reference point in a group
• Rank order
• Standard score
• Percentile rank

73

Page 69: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro
Page 70: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Percentiles

• Percentile: a point in a distribution at or below which a given percent of the observations lie.
  – For example: 10% of observations lie at or below the 10th percentile.

Page 71: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Percentiles

• The term percentile refers to the percentage of individuals in a fixed standardization sample with equal or lower scores

• Percentile rank represents the area of the normal curve, expressed as a percentage, that falls below a certain value

  If Sue's raw score of 13 has a percentile rank of 85, then 85% of the population upon which the test is based scored at or below 13; 15% of the standardization sample scored above 13.

• Percentiles range from 1 to 99, never 0 or 100

• The 50th percentile is equal to the median

• For increased accuracy, percentiles may be reported in decimals, so some tests may range from .1 to 99.9.
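A minimal sketch of the percentile-rank idea above, using the "at or below" definition from the Sue example; the small norm sample is made up for illustration:

```python
# Illustrative percentile rank: percent of the standardization sample at or below a score,
# kept inside the 1-99 range described on the slide.

def percentile_rank(score, norm_sample):
    at_or_below = sum(1 for s in norm_sample if s <= score)
    pr = round(100 * at_or_below / len(norm_sample))
    return min(99, max(1, pr))      # percentiles never reach 0 or 100

norm_sample = [5, 7, 9, 10, 11, 12, 13, 13, 14, 18]
print(percentile_rank(13, norm_sample))   # -> 80 (8 of the 10 scores are at or below 13)
```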

Page 72: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Standard Scores

• Standard scores represent a linear transformation of raw scores into standard deviation units.

• Standard score reflects the student’s standing relative to others in the distribution on the basis of variation

• Translating raw scores into a set of equal interval, standard scores means that there will always be a consistent mean and standard deviation

• Three types:
  (1) Z-score
  (2) T-score
  (3) Standard scaled scores

Page 73: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Standard Scores: Z-Scores

• Z-Score transformation changes raw scores into deviation units, where the group or test mean is equal to 0.0 and the standard deviation is 1.0.

• Z-Score is a measure of the number of standard deviation units away from the test mean

• Important interpretive indices: (a) the sign (+ or -) and (b) the size of the score (the greater the absolute value, the farther the score is below or above the mean)

  z = (x - X) / SD

  where x = raw score, X = mean, SD = standard deviation
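A short worked example of the z-score formula above (the raw score, mean, and SD values are illustrative):

```python
# Illustrative z-score: number of standard deviation units from the test mean.

def z_score(x, mean, sd):
    return (x - mean) / sd

print(z_score(60, 50, 10))   # -> 1.0  (one SD above the mean)
print(z_score(35, 50, 10))   # -> -1.5 (one and a half SDs below the mean)
```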

Page 74: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Standard Scores: T-Scores

• T-Score transformation takes raw scores and changes them to equal interval units, where the mean is 50 and the standard deviation is 10.

• Once a raw score is converted to a z-score, the teacher multiplies each score by 10 then adds 50.

T = 10z + 50

• Virtually all T-scores are positive since it would take a z-score of less than -5.0 to convert to a T-score less than zero.

Page 75: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Standard Scores:Standard Scaled Scores (SS)

• With standard scaled scores (SS), raw scores are transformed to a scale where the mean is 100 and the standard deviation is 15

• The standard scaled score has become popular on recent tests of intelligence and achievement

SS = 15z + 100
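Continuing the same illustrative z-scores through the two transformations above, T = 10z + 50 and SS = 15z + 100:

```python
# Illustrative conversions from z-scores to T-scores (mean 50, SD 10)
# and standard scaled scores (mean 100, SD 15).

def t_score(z):
    return 10 * z + 50

def standard_scaled_score(z):
    return 15 * z + 100

print(t_score(1.0), standard_scaled_score(1.0))     # -> 60.0 115.0
print(t_score(-1.5), standard_scaled_score(-1.5))   # -> 35.0 77.5
```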

Page 76: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Comparison of raw scores, z-scores, T-scores, and standard scores (Tindal & Marston, 1990, p. 340)

Page 77: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Standard Scores

When to Use:

• Can be used to summarize student performance on a range of different measures because they place all measures on a common scale

• If we want to perform arithmetic operations on the measures (e.g., the pretest performance is to be subtracted from the posttest performance), then we must use standard scores

• If we want to identify a student’s real position in the distribution, then we must use standard scores

Page 78: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Standard Scores

• All three types of standard scores are equal interval units so they may be added, subtracted, and arithmetically transformed.

• Can show at a glance how far a student is from the mean and his or her position on a normal distribution

• When scores from different tests are compared to each other, standard scores are scale-free and, therefore, can be directly compared

• Can be used to examine absolute changes in relative performance; we can calculate them by subtracting the pretest score from the posttest score

• Can be used to determine growth and can reflect improvements in relative standing (this cannot be done with raw scores)

Page 79: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Profiles and Error Analysis

• Instructional Level Profile
• Percentile Rank Profile
• Diagnostic Profiles
• Word Attack Error Analysis

84

Page 80: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

86

Individual-Referenced Evaluation

• Measures of student performance over time and comparison of successive values to previous values

• Progress Monitoring!

Page 81: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

87

Components

• Direct measurement
• Repeated measurement
• Time-series graphic displays
• Data utilization and decision rules

Page 82: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

88

Interpreting Data: Making Instructional Decisions

• Summarizing Performance
• Decision-Making
  – Goal-oriented decisions
  – Intervention-oriented decisions

Page 83: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Discussion Guide:

• Based on the guidelines just presented, consider what individual-referenced evaluation/progress monitoring measures you currently administer to your students (3-6).
  – What assessments, if any, do you currently use?
    • What works/what doesn't work (e.g., what are some of the challenges with the assessments you use)?
    • How do you use the assessment information? What kinds of instructional decisions do you make?
  – If you don't currently use individual-referenced evaluation/progress monitoring measures, what types of progress monitoring measures would be helpful to your instruction?

89

Page 84: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

90

MAZE

Page 85: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

MAZE

• Multiple choice
• Cloze task
• 150-400-word grade-level passage
• First sentence intact
• Thereafter, every 7th word replaced with 3 words inside parentheses
• 2.5 minutes
• Group administered

92

Page 86: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Target Levels for Passage-Level Placement

Grade Level   Level           WCPM     Errors
1-2           Frustration     <40      >4
              Instructional   40-60    4 or less
              Mastery         >60      4 or less
3-6           Frustration     <70      >6
              Instructional   70-100   6 or less
              Mastery         >100     6 or less

93
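A minimal sketch of the placement table above as a decision function; the function name and the way errors are combined with WCPM are illustrative assumptions:

```python
# Illustrative passage-level placement from WCPM and errors, per the table above.

def placement_level(grade, wcpm, errors):
    low, high, max_err = (40, 60, 4) if grade <= 2 else (70, 100, 6)
    if wcpm < low or errors > max_err:
        return "frustration"
    return "instructional" if wcpm <= high else "mastery"

print(placement_level(grade=2, wcpm=55, errors=3))    # -> instructional
print(placement_level(grade=4, wcpm=110, errors=2))   # -> mastery
```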

Page 87: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Materials

• 2 sources
  – Pro-Ed Macintosh platform
    • Administered, scored, and maintained on Macintosh
  – Aimsweb
    • Graded passages available for purchase
    • Data storage and graphing services also available

94

Page 88: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Aimsweb Sample Passage

• Once upon a time there was a merchant whose wife died leaving him with three daughters. The two older daughters were good-looking (but, stand, then) very disagreeable. They cared only for (until, themselves, himself) and for their appearance; they spent (palace, wicked, most) of the time admiring their reflections (in, of, turned) a looking glass.

95

Page 89: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Materials

• Student copy of MAZE probe
• Examiner copy of MAZE probe
• Stopwatch (time for 3 minutes)
• Pencil/Pen
• Ruler or marking card to help children with line tracking (optional)

96

Page 90: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Directions

• Tell students to read the passage and select the correct word from among the 3 options
• Decide if a practice test is needed
• Time for 3 minutes
• Collect the MAZE probes

97

Page 91: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Accommodations

• Retest (different day, different probe, different conditions)
• Alternate setting
• Check for understanding of directions
• Explicit instruction (comprehension strategies) + retest
• Practice activities, "mini lessons," use MAZE as lesson warm-up

98

Page 92: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Accommodations

• Enlarge print
• Reduce amount of text on page
• Color code and highlight choices
• Ruler

99

Page 93: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Have you ever had nothing to do? Sometimes when I have nothing to (do/can/so), I take a walk. That’s when (to/I/do) kick stones. I look for cans (to/bet/kit) kick.

100

Page 94: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Have you ever had nothing to do? Sometimes when I have nothing to (________), do can so

101

Page 95: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Administration

• Practice administering the MAZE with a partner. Role play the roles of teacher and student. Follow the administration guidelines as you practice.

• Evaluate fidelity by using the “Checking Out Accuracy in Test Administration” checklist.

102

Page 96: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Scoring

• Total number correct in 3 minutes
• Total number of errors in 3 minutes
• % Accuracy in 3 minutes ( = total correct / total completed )

• Students receive 1 point for each correctly circled answer. Items left blank (nothing circled) are counted as errors. Scoring is discontinued if 3 consecutive errors are made. The number of correct answers within 3 minutes is the student's score.

103
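The scoring rules above in a minimal Python sketch. This is an illustration only (not AIMSweb's published scoring routine); responses are coded 1 for a correct circle, 0 for an incorrect circle, and None for an item left blank.

```python
# Illustrative MAZE scoring: one point per correct circle, blanks count as errors,
# and scoring stops once 3 consecutive errors occur.

def score_maze(responses):
    correct = errors = consecutive_errors = 0
    for r in responses:
        if r == 1:
            correct += 1
            consecutive_errors = 0
        else:                           # incorrect circle or blank
            errors += 1
            consecutive_errors += 1
            if consecutive_errors == 3:
                break                   # discontinue scoring
    completed = correct + errors
    accuracy = correct / completed if completed else 0.0
    return correct, errors, accuracy

print(score_maze([1, 1, 0, 1, None, 0, 0, 1, 1]))  # -> (3, 4, ~0.43); items after the 3rd consecutive error are not scored
```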

Page 97: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Kicking Stones

Have you ever had nothing to do? Sometimes when I have nothing to (do/can/so), I take a walk. That’s when (to/I/do) kick stones. I look for cans (to/bet/kit) kick. If I can’t find any (you/cans/pal) to kick, I just kick stones. (He/I/Be) look for big stones to kick. (Be/I/Me) walk down the road kicking one (kiss/stone/from) after another. This means I have (wishful/nothing/pressed) else I can think of doing.

104

Page 98: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Prorating

• Some students may finish all of the items before the 3 minutes is up. Remember to prorate the score.

• See AIMSWeb Scoring Directions

105

Page 99: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Scoring

• Practice scoring the MAZE. Determine MAZE scores for Rick and Dave.
  – Total number correct
  – Total number of errors
  – % Accuracy ( = total correct / total completed )

106

Page 100: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Alternate Scoring

• Practice scoring the MAZE. Determine Juan's MAZE score using the 3-consecutive-errors approach.
  – Total correct (with scoring discontinued prior to 3 consecutive errors):

107

Page 101: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Alternate Scoring

• ANSWER
  – Juan circled 16 correct answers in 2.5 minutes. He circled 7 incorrect answers. However, Juan did make 3 consecutive mistakes, and 5 of his correct answers came after his 3 consecutive mistakes. Juan's score on the MAZE would be 10.

108

Page 102: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

MAZE Benchmarks

• Typical goal: 0.4 correct choices per week
• Ambitious goal: 0.7 correct choices per week (~18 words in 25 weeks)
• Analyze data after at least 6 scores and/or at least 3 weeks
• Make an instructional change if the most recent 4 scores are all below the goal line

(Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993)

109
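A minimal sketch of the decision rule above. The goal-line construction (baseline plus 0.4 correct choices per week) and the data format are illustrative assumptions, not a prescribed formula:

```python
# Illustrative check of the "4 most recent scores below the goal line" decision rule.

def needs_instructional_change(scores_by_week, baseline, weekly_goal=0.4):
    if len(scores_by_week) < 6:
        return False                           # not enough data points to analyze yet
    recent = scores_by_week[-4:]
    return all(score < baseline + weekly_goal * week for week, score in recent)

# (week, MAZE score) pairs with a baseline of 6 correct and the typical 0.4/week goal
data = [(1, 6), (2, 6), (3, 7), (4, 7), (5, 7), (6, 8)]
print(needs_instructional_change(data, baseline=6))   # -> True (last 4 scores fall below the aim line)
```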

Page 103: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

MAZE Benchmarks

Grade            Spring Benchmark (at or above 85% accuracy)
First Grade      8 correct
Second Grade     8 correct
Third Grade      13 to 15 correct
Fourth Grade     20 correct replacements per 2.5 minutes
Fifth Grade      25 correct replacements per 2.5 minutes
Sixth Grade      30 correct replacements per 2.5 minutes

110

Page 104: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Guidelines and Decision Rules

• Move up to the next instructional level on the MAZE when students receive a score at or above the target benchmark on 3 consecutive probes.
  – NOTE: Use instructional judgment. If students are not getting 85% or above accuracy, continue until errors are reduced.

111

Page 105: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Guidelines and Decision Rules

• Administer 2 x per month if the MAZE is the only comprehension progress monitoring measure used

• Administer 1 x per month if another comprehension monitoring measure is used along with the MAZE (e.g., www.easycbm.com; multiple choice comprehension)

112

Page 106: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Look at the Data!

[Line chart: MAZE scores, second grade (y-axis 0-16), across 6 assessments for Students 1-3.]

113

Page 107: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Some Difficulties Revealed Through Assessment

• Decoding
• Word Type/Use
• Vocabulary
• Identifying the main idea
• Distinguishing between relevant & extraneous information
• Following the sequence of information
• Summarizing information presented
• Making inferences/applying concepts presented beyond the scope of a particular passage

114

Page 108: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Linking Assessment and Instruction: Error Analysis

115

Page 109: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Linking Assessment and Instruction

• Use the MAZE Error Analysis to determine if there are any error patterns with Juan's MAZE.
  – What, if any, error patterns do you note?
  – Based on your analysis, what instructional intervention could be used to help Juan improve his reading comprehension?

116

Page 110: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Linking Assessment and Instruction

• Look at the MAZE scores of two third-graders, a typically performing student and a low-performing student. What do you observe?
  – What differences did you observe in Emma's and Abby's MAZE performance?
  – What other conclusions can you draw?
  – What interventions would help improve Emma's and Abby's performance? Identify the intervention and the strategy/skill the intervention will address.

117

Page 111: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

118

Multiple Choice

Reading Comprehension

http://www.easycbm.com

Page 112: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Materials

• Student copy of probe (http://www.easycbm.com/)
• Students can also take the assessment online
• Untimed

120

Page 113: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

121

Page 114: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

122

Page 115: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Directions

• Please read the story and answer the questions that come after it.

123

Page 116: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Getting Passages

124

Page 117: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Entering a Student Score

125

Page 118: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Entering a Student Score

126

Page 119: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Viewing a Score Report

127

Page 120: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Viewing a Score Report

128

Page 121: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Viewing a Score Report

129

Page 122: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Sample Score Report

130

Page 123: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Sample Score Report

131

Page 124: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Scoring

• Total number correct

132

Page 125: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Multiple Choice Comprehension Benchmarks

Grade            Spring Benchmark
Second Grade     10
Third Grade      15
Fourth Grade     15
Fifth Grade      16
Sixth Grade      16

133

Page 126: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Guidelines and Decision Rules

• Use 1 x per month (MAZE is also used 1 x per month)

• Move up to the next instructional level when students receive a score at or above the target benchmark on 3 consecutive probes.

134

Page 127: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Linking Assessment and Instruction

• Look at the Multiple Choice Comprehension materials. How could you use the information from the assessment for instructional decision making? What are some ways that you could analyze student errors?

135

Page 128: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Error Analysis

136

Page 129: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

137

Depth of Vocabulary Knowledge

Page 130: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Progress Monitoring

• DIBELS: LNF, WIF, or MAZE/other comprehension
  – LNF: PreK-K
  – WIF: Grade 1
  – MAZE: Grade 1-Grade 3 / other comprehension
• AND Depth of Knowledge Vocabulary (DOK)

138

Page 131: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Vocabulary Depth of Knowledge (DOK) Assessment

• Purpose:
  – Assess depth of vocabulary knowledge
    • Definition Knowledge: No/Faulty Knowledge, Developing Knowledge, Accurate Knowledge
    • Use Knowledge: No/Faulty Use, Basic Use, Complex Use
  – Use diagnostically for instructional decision making

139

Page 132: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Importance of Vocabulary

• “Our knowledge of words. . .determines how we understand texts, define ourselves for others, and define the way we see the world.” – Stahl (1999)

• Expressive vocabulary at the end of first grade is a significant predictor of reading comprehension ten years later (Cunningham & Stanovich, 1997)

140

Page 133: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

What the research says:

• The relationship between reading comprehension and vocabulary knowledge is strong and unequivocal (Baumann & Kame’enui, 1991; Stanovich, 1986).

• Even weak readers’ vocabulary knowledge is strongly correlated with the amount of reading they do (Cunningham & Stanovich, 1998).

141

Page 134: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Materials

• Student: NO PROBE!
• Examiner copy of DOK directions and probe
• Scoring Rubric
• Probe Definition Sheet
• Pencil/Pen
• Stopwatch/Timer
• Clipboard (optional)
• Administration Partner for Co-Administration (optional)

142

Page 135: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Sample Probe

     Word        Response   Define   Use   Total
 1   excite
 2   talk
 3   head
 4   trip
 5   peaceful
 6   daughter
 7   pay
 8   white
 9   pants

143

Page 136: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

When Do You Administer the DOK?

• Depends on your instructional purpose!
• For progress monitoring, 1x per month is recommended (minimum); 2x per month is ideal
• Or, use as a pretest and posttest to assess student knowledge of vocabulary from an instructional unit or series of lessons

144

Page 137: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

145

Example: Variability

[Line chart: test scores (y-axis 0-100) plotted across successive days, illustrating how much scores can vary from one administration to the next.]

Page 138: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Word Lists: K-1 and 2-3

• Probes (K-1 and 2-3) provided by the PaTTAN D/HH Promising Practices Project are based on words from the DIBELS Word Use Fluency measure.
  – Words for each probe were randomly selected from the pool of DIBELS Word Use Fluency words.
  – Expert consultants with extensive experience working with students who are deaf or hard of hearing identified the final words used on the probes (e.g., considerations were made for signing, multiple use, etc.)

146

Page 139: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Word Lists: 3-4 and 5-6

• Probes (3-4 and 5-6) provided by the PaTTAN D/HH Promising Practices Project are based on words from A. Biemiller's Words Worth Teaching (2010).
  – Words for each probe were randomly selected from the pool of "Easy" words (4 words) and "Words Worth Teaching" words (6 words).
  – Expert consultants with extensive experience working with students who are deaf or hard of hearing identified the final words used on the probes (e.g., considerations were made for signing, multiple use, etc.)

147

Page 140: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Maintain High Expectations When Thinking About Word Selection

• Consider sources like:
  – A. Biemiller (2010). Words Worth Teaching: Closing the Vocabulary Gap. Columbus, OH: McGraw-Hill/SRA.
  – SAS (Pennsylvania's Standards Aligned System)
  – Your reading program, school/district curriculum, or instructional unit

148

Page 141: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Word Selection Guidelines

• Select taught and untaught words.
• Select words that are critical to comprehension.
• Select words that students are likely to encounter in the future and are generally useful.
• Select words that will give students a more precise understanding of something they already know (e.g., plain-ordinary; rested-refreshed; shiny-glisten)
• Use the Goldilocks Approach (Stahl & Stahl, 2004):
  – Not too easy, not too difficult... just right.
  – Sometimes "just right" can be hard and challenging!

149

Page 142: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Directions

Sign or say these specific directions to the student:

"I'm going to ask you about some words. I'll ask you to tell me what each word means, then I'll ask you to use the word in a sentence. For example, if I say 'what does sad mean?' you could say, 'Sad is when you are not happy.' If I say 'use the word sad in a sentence' you could say, 'I was sad when my ice cream fell on the floor.'"

"Now it's your turn. What is a chair?"

CORRECT RESPONSE: If the student gives a correct response, say: "Very good."
INCORRECT RESPONSE: If the student does not respond or gives an incorrect definition, say: "A chair is something you sit in."

150

Page 143: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Directions

• "Now use the word 'chair'."
• "If you don't know what a word means, or how to use a word, it is OK to say, 'I don't know.'"
• "OK. Here is your first word." [Time for 15 minutes]
• For each item, say "What does ______ mean?" or "What is a _____?" After the student responds, say "Now use the word _____."

CORRECT RESPONSE: If the student gives a correct response, say: "Very good."
INCORRECT RESPONSE: If the student does not respond or uses the word incorrectly, say: "I sat in my chair all day at school."

151

Page 144: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Directions

3. During directions, you may also give additional examples to help explain definition and use. Do not take more than 5-minutes to present additional examples.

4. Give the student the first word and start your stopwatch.

5. Minor Prompts/Major Prompts

152

Page 145: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Major Prompts

• Definition: "Remember, tell me what the word _______ means. Describe the word the best that you can."
• Use: "Now use the word. Use the word in a sentence or try to use the word as you communicate."
• Use: "Put the word into your sentence, like this [visual]. Use the word, _______, with some of your own words."

* You may use a major prompt for both definition and use 2x during the assessment (e.g., 2 total major prompts for definition and 2 total major prompts for use across all items).

153

Page 146: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Minor Prompts

• Definition: "What does ____ mean?"
• Use: "Now use the word ______."

There is no limit on minor prompts. Use as many as needed per item.

154

Page 147: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Sample Probe

     Word        Define                            Use
 1   excite      D: student response [D-mp1]       U: student response
 2   talk        D: student response [D-mp2]       U: student response
 3   head        D: student response               U: student response
 4   trip        D: student response               U: student response
 5   peaceful    D: student response               U: student response [U-mp1]
 6   daughter    D: student response               U: student response
 7   pay         D: student response               U: student response
 8   white       D: student response               U: student response
 9   pants       D: student response               U: student response

(Minor prompts are noted next to the response they followed, e.g., [D-mp1], [U-mp1].)

155

Page 148: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Directions

6. Provide the next word promptly or when the student hesitates or pauses for 5 seconds.

7. Record the student’s response in the space provided. Use “NR” for “no response,” and “DK” for “I don’t know.”

8. For each word, it’s okay to elicit (sign) multiple meanings. If the word has several meanings, try to prompt for just two to three alternate meanings.

9. If the student gives a partial or ambiguous definition, follow-up by signing or saying, “Tell me more about what _____ means” or “Tell me more about a ______.”

156

Page 149: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Follow-up Prompts

Follow-up prompts may be used once to clarify the definition and use for each item.

157

     Word        Response   Define   Use   Total
 1   excite
 2   talk
 3   head
 4   trip
 5   peaceful
 6   daughter
 7   pay
 8   white
 9   pants

Example notation with a follow-up prompt: D: student response [D-mp1] F / U: student response F

Page 150: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Directions

10. Discontinue Rule: If a student has not given any correct response for the first 4 words, discontinue the task. A student would score a 0 for the assessment.

11.Encourage response with a neutral praise (Examples: I like how hard you are working. Good job with your thinking.). If the student becomes frustrated, tell them it’s okay if they don’t know all of the words.

12. If a student acts out or dramatizes the word (e.g., acts out or dramatizes a word like "snore" without signing), prompt the student by saying, "Tell me what ______ means using words/sign." (If the student is still not able to provide the definition in words, write "acted out" on the score sheet.)

13. If the student begins to ramble or becomes off task, redirect the student back to the task.

158

Page 151: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Administration

• Practice administering the DOK with a partner. Role play the roles of teacher and student. Follow the administration guidelines as you practice.

159

Page 152: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Scoring

     Word        Response   Define   Use   Total
 1   excite
 2   talk
 3   head
 4   trip
 5   peaceful
 6   daughter
 7   pay
 8   white
 9   pants

160

Page 153: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Scoring

• Definition
  – No/Faulty Knowledge = 0 points
  – Developing Knowledge = 1 point
  – Accurate Knowledge = 2 points

• Use
  – No/Faulty Use = 0 points
  – Basic Use = 1 point
  – Complex Use = 2 points

161

Page 154: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Alternate Definitions and Use

• Definitions
  – If a student provides more than one definition of a word, score each alternate definition.
  – For example, if a student defines "fly" as an insect and as "an aircraft flies," score each of the definitions using the No/Faulty Knowledge - Developing Knowledge - Accurate Knowledge criteria.
  – Add BOTH definition scores for the word's overall Define Score.

• Use
  – If a student provides more than one contextual use of a word, score each alternate use.
  – For example, if a student uses "fly" as an insect and "an aircraft flies" in context (e.g., two different sentences), score each of the uses using the No/Faulty Use - Basic Use - Complex Use criteria.
  – Add BOTH use scores for the word's overall Use Score.

162
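The arithmetic side of the DOK scoring above in a minimal sketch. The 0-2 ratings themselves are judgment calls against the rubric; the code only illustrates how alternate definitions and uses are summed, and the function name and data layout are illustrative.

```python
# Illustrative DOK item scoring: each definition and each use is rated 0, 1, or 2;
# when a student offers alternates, the ratings are added together.

def dok_item_score(definition_ratings, use_ratings):
    define_score = sum(definition_ratings)   # add all definition ratings
    use_score = sum(use_ratings)             # add all use ratings
    return define_score, use_score, define_score + use_score

# "coat" example on the next slide: developing definition (1) and complex use (2)
print(dok_item_score([1], [2]))        # -> (1, 2, 3)
# a student who gives two accurate definitions of "fly" plus one basic use
print(dok_item_score([2, 2], [1]))     # -> (4, 1, 5)
```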

Page 155: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Examples

• Target word: "coat"
  – (1) A piece of clothing with long sleeves which you wear over your clothes when you go outside (2) An outer covering of an animal (3) A thin layer of a substance (a coat of paint)

• Student response:
  – "A coat is a jacket. I have a really thick coat that I only wear when it's really cold."

• Definition Score: 1
• Use Score: 2
• Total Score: 3

163

Page 156: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Examples

• Target Word: "proud"
  – (1) If you feel proud, you feel pleased about something good you have done, or about something good that a person close to you has done (2) Someone who is proud has respect for themselves (3) Someone who is proud feels that they are better or more important than other people

164

Page 157: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Examples

• Student response:
  – "Proud is when you feel good. I am proud."

• Definition Score: 1
• Use Score: 1
• Total Score: 2

165

Page 158: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Examples

• Student response:
  – "Proud is when you feel really good about something you did. I felt proud when I won the race."

• Definition Score: 2
• Use Score: 2
• Total Score: 4

166

Page 159: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Scoring Tips

Word: Coat

          0-point responses    1-point responses    2-point responses
Define
Use

167

Page 160: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Scoring

• Practice scoring the DOK.

168

Page 161: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Guidelines and Decision Rules

• Purpose:
  – Diagnostic instructional decision making, progress monitoring
• Content:
  – Depth of vocabulary and word knowledge
• Implemented:
  – 1 x per month for progress monitoring
  – Also okay to use 2 x per month for progress monitoring and/or as a pretest-posttest for instructional units.

169

Page 162: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Look at the Data!

[Line chart: DOK total scores by student, kindergarten (2009), November through February, Students 1-7 (y-axis 0-35).]

170

Page 163: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Look at the Data!

[Line chart: DOK total scores by student, kindergarten (2009), November through February, Students 1-7 (y-axis 0-35); same data as the previous slide.]

171

Page 164: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Look at the Data!

[Line chart: DOK total scores by student, kindergarten (2009), November through February, Students 1-7 (y-axis 0-35); same data as the previous slide.]

172

Page 165: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Linking Assessment and Instruction

• Defining Knowledge: None, Developing, Accurate
• Use (Contextual Knowledge): None, Basic, Complete
• Define vs. Use
• Alternate Understanding and Use (Multiple Meanings)
• Fluency: Timed vs. Untimed
• Word Types: Nouns, verbs, etc.; Levels

173

Page 166: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Practice: Linking Assessment and Instruction

• Look at the sample DOK probes for the two students in Grades 2-3. The probe has the words reader, afraid, lady, remember, nut, etc.

• What instructional decisions can you make based on the students' DOK responses? What will you teach?

• Do students seem to have a more challenging time with word use, definitions, or both? How will this inform your instruction?

174

Page 167: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Conclusions

• Assessment tells you what to teach
• Assessment tells you where you're going
• Assessment tells you how to teach and what instructional adjustments to make so teaching is more effective
• Assessment tells you when you get there
• Assessment tells you what to do next
• Assessment is instruction

175

Page 168: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Discussion Guide:

• What are the most common reading difficulties that you observe with your students? What are some of the biggest challenges you have with reading instruction?

• What questions, if any, do you have about instructional interventions for students in grades 3-6? What areas of instruction would be most helpful to you?

• Do you currently use any instructional strategies or interventions that work well? If yes, what are some of these strategies/interventions?

177

Page 169: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Assignment

• Use the assessments discussed today with your students. Assessment should include:
  – Diagnostic Assessment (your choice): administer 1x in the fall
  – Reading Progress Monitoring Assessment: administer 2x per month
  – Depth of Vocabulary Knowledge Assessment: administer 1x per month
• Bring your student data to the next training.
• See you on November 18th for our Instructional Interventions training!

178

Page 170: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Any Questions?

• Lana Edwards Santoro, Ph.D.
  Research Associate and Project Research Consultant
  University of Oregon, Center for Teaching and Learning
  Alexandria, VA 22310
  (703) [email protected]@pacificir.org
  (NOTE: email will be changing to a "uoregon" account)

179

Page 171: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Contact Information www.pattan.net

Marlene [email protected]
1-800-446-5607, x6862

Sue Ann [email protected], x7243

Jane [email protected], x3106

Commonwealth of Pennsylvania
Edward G. Rendell, Governor

Pennsylvania Department of Education
Gerald L. Zahorchak, D.Ed., Secretary

Diane Castelbuono, Deputy Secretary
Office of Elementary and Secondary Education

John J. Tommasini, Director
Bureau of Special Education

Patricia Hozella, Assistant Director
Bureau of Special Education

180

Page 172: Part I :  Assessment September 14, 2010 PaTTAN Lana Santoro

Edward G. Rendell, Governor
Gerald L. Zahorchak, D.Ed., Secretary

Diane Castelbuono, Deputy Secretary
Office of Elementary and Secondary Education

John J. Tommasini, Director
Bureau of Special Education

Bureau of Special Education
Pennsylvania Training and Technical Assistance Network

Contact Information: Name of Consultant, Email address
www.pattan.net