
Pennsylvania Training and Technical Assistance Network

Promising Practices to Improve Reading Performance of Students who are Deaf or Hard of Hearing: Grades 3-6 Pilot Research Project

Part I: Assessment

September 14, 2010
PaTTAN

Lana Santoro

Agenda

• Our Research Pilot with Grades 3-6
• Diagnostic Assessment
• Reading Progress Monitoring Assessment
• Depth of Vocabulary Knowledge
• Next Steps

2

Learning Community and Study Group Approach

• Collective Participation
• Shared Inquiry
• Formative Evaluation
• Action Research

7

Sample Information

                                 2008                              2009
                        K     1     2     3   Total      K     1      2     3   Total
                      (n=8) (n=4) (n=1) (n=6) (n=19)   (n=7) (n=13) (n=3) (n=2) (n=25)

Demographics:
White (Non-Hispanic)     7     3     1     5    16        6    11     2     2    21
Hispanic                 1     1     0     0     2        0     1     1     0     2
Black/African American   0     0     0     1     1        0     0     0     0     0
Asian/Pacific Islander   0     0     0     0     0        1     1     0     0     2

Gender:
Male                     2     0     1     3     6        2     2     0     1     5
Female                   6     4     0     3    13        5    11     3     1    20

Beginning Reading Skills
Kindergarten: LNF

[Chart: LNF mean scores, kindergarten, fall to spring: 2008 and 2009 cohorts vs. the DIBELS goal and a PA school comparison]

Beginning Reading Skills
First Grade: WIF

[Chart: WIF mean scores, first grade, fall to spring: 2009 cohort vs. goal]

Reading Connected Text - Fluency
First-Third: TPRI

[Chart: Fluency rate, first grade (2009), BOY to EOY: Students 1-13, mean, and goal]

[Chart: Fluency rate, second grade (2009), BOY to EOY: Students 1-3, mean, and goal]

[Chart: Fluency rate, third grade (2009), BOY to EOY: Students 1-2, mean, and goal]

Reading Connected Text – Comprehension
Second: MAZE

Second Grade    Spring Outcome   Gain per Week   Gain per Month
Benchmarks            8               .23              .88
2008                 7.33             .20              .84
2009                11.11             NA               NA
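Where the gain columns come from is simple arithmetic. A minimal sketch, assuming a 36-week instructional year, a 4-weeks-per-month conversion, and an illustrative fall mean (none of these values come from the study tables):

```python
# Hypothetical sketch: deriving gain rates from two benchmark scores.
# The 36-week year and 4-weeks-per-month conversion are assumptions.

def gain_rates(fall_score, spring_score, weeks=36):
    """Return (gain per week, gain per month) between two benchmarks."""
    per_week = (spring_score - fall_score) / weeks
    per_month = per_week * 4  # assumes ~4 instructional weeks per month
    return per_week, per_month

# Illustrative fall mean chosen so the weekly rate matches the 2008 row
week, month = gain_rates(fall_score=0.13, spring_score=7.33)
print(f"{week:.2f}/week, {month:.2f}/month")  # 0.20/week, 0.80/month
```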

[Chart: MAZE mean scores, second grade, fall to spring: 2008 and 2009 cohorts vs. goal, with 2008 and 2009 slopes]

Reading Connected Text – Comprehension
Third: MAZE

Third Grade     Spring Outcome   Gain per Week   Gain per Month
Benchmarks           13               .26             1.12
2008                 9.50             .40             1.15
2009                12.00             .28             1.22

[Chart: MAZE mean scores, third grade, fall to spring: 2008 and 2009 cohorts vs. goal, with 2008 and 2009 slopes]

Reading Connected Text – Comprehension
Third: MAZE (continued)

[Chart: MAZE mean gains, third grade. 2008: 0.40/week, 1.15/month; 2009: 0.28/week, 1.22/month; goals: .26/week, 1.12/month]

Reading Simulation

• Find a partner.
• Decide who will be the teacher (examiner) and student.
• Implement the reading assessment as directed.

15

16

What do Skilled Readers do?

• Skilled readers differ from unskilled readers in “their use of general word knowledge to comprehend text literally as well as to draw valid inferences from texts, in their comprehension of words, and in their use of comprehension monitoring and repair strategies.” (Snow, Burns, & Griffin, 1998, p. 62)

17

What we Know About the Factors that Impact Reading Comprehension

• Accurate and fluent word reading skills.

• Language skills (receptive and expressive vocabulary, linguistic comprehension)

• Extent of conceptual and factual knowledge

• Knowledge and skill in use of cognitive strategies to improve comprehension or repair it when it breaks down

• Knowledge of text structure and genre

• Reasoning and inferential skills

• Motivation to understand and interest in task and materials

18

What Reading Does for the Mind

• Reading comprehension requires knowledge: of words and the world.
  – E.D. Hirsch, American Educator (Spring 2003)

19

Selected Statistics for Major Sources of Spoken and Written Language (Sample Means)

                                         Rank of        Rare Words
                                         Median Word    per 1000
PRINTED TEXTS
Abstracts of scientific articles            4389          128.0
Newspapers                                  1690           68.3
Popular magazines                           1399           65.7
Adult books                                 1058           52.7
Comic books                                  867           53.5
Children's books                             627           30.9
Preschool books                              578           16.3

TELEVISION TEXTS
Popular prime-time adult shows               490           22.7
Popular prime-time children's shows          543           20.2
Cartoon shows                                598           30.8
Mr. Rogers and Sesame Street                 413            2.0

ADULT SPEECH
Expert witness testimony                    1008           28.4
College graduates to friends, spouses        496           17.3

(Adapted from Hayes and Ahrens, 1988)

Exposure to Print

• A student in the 20th percentile reads books .7 minutes a day.
• This adds up to 21,000 words read per year.
• A student in the 80th percentile reads books 14.2 minutes a day.
• This adds up to 1,146,000 words read per year.

Percentile     Minutes Per Day        Words Read Per Year
Rank           Books      Text        Books         Text
98             65.0       67.3        4,358,000     4,733,000
90             21.2       33.4        1,823,000     2,357,000
80             14.2       24.6        1,146,000     1,697,000
70              9.6       16.9          622,000     1,168,000
60              6.5       13.1          432,000       722,000
50              4.6        9.2          282,000       601,000
40              3.2        6.2          200,000       421,000
30              1.8        4.3          106,000       251,000
20              0.7        2.4           21,000       134,000
10              0.1        1.0            8,000        51,000
 2              0          0                  0         8,000

20

What “gap” do we want to close?What do we mean by proficient reading?

• We want students to close the gap and become proficient in reading comprehension.

• “Acquiring meaning from written text” (Gambrell, Block, & Pressley, 2002)

• “the process of extracting and constructing meaning through interaction and involvement with written language” (Sweet & Snow, 2002)

• “thinking guided by print” (Perfetti, 1985)

(Torgesen, 2003)

22

Thinking Guided by Print

• In middle and high school, reading can be increasingly defined as “thinking guided by print.”

23

What works for late elementary and intermediate level students?

• Learning to Read
  – Decoding and Word Study ("Phonics")
  – Fluency
• Reading to Learn
  – Comprehension and Vocabulary
  – Content Engagement
  – Content Enhancement

Discussion Guide:

1. What research questions do you have about grade 3-6 students who are deaf or hard of hearing?

2. What are your students' instructional needs and learning challenges with reading?

3. What are your instructional challenges when teaching reading?

4. What are some things that you do instructionally that seem to work well?

24

Assessment

• Assessment tells you what to teach
• Assessment tells you where you're going
• Assessment tells you how to teach and what instructional adjustments to make so teaching is more effective
• Assessment tells you when you get there
• Assessment tells you what to do next
• Assessment is instruction

25

Standards-Aligned

• Diagnostic Assessment
  – Ascertain, prior to instruction, each student's strengths, weaknesses, knowledge, and skills. Using diagnostic assessments enables the instructor to remediate students and adjust the curriculum to meet each student's unique needs.
• Formative Assessment (Progress Monitoring)
  – Allows teachers to monitor and adjust their instructional practice in order to meet the individual needs of their students. The key is how the results are used: they should shape teaching and learning.

26

Diagnostic Assessment

• Individually administered tests designed to determine specific academic areas of strength and weakness (in some cases a detailed error analysis can be provided)
• Instructional levels can be determined
• Assist with instructional planning and educational goals
• Assist in determining areas for future assessment and progress monitoring

28

29

Stages of Learning

Entry Level
Initial Acquisition (0-25% accuracy)
Advanced Acquisition (65-80% accuracy)
Proficiency
Maintenance
Generalization
Problem Solving

(Rivera & Smith, 1997)

Guidelines for Selecting a Diagnostic Assessment

• Aligned with the core content of reading (“Big Ideas” of Reading)

• Provides grade-level and instructional-level information

• Provides diagnostic/instructional profile, error analysis, helps determine what to teach

• Has documented technical adequacy (reliable and valid)

30

Guidelines for Selecting a Diagnostic Assessment

• Items are developed from specific performance objectives directly linked to an instructional domain

• The score is based on an absolute, not a relative standard

• The test measures mastery by using specific standards

• The focus is criterion-referenced evaluation: what the student can do and cannot do on specific skills and knowledge tasks.

31

What it looks like. . .

Items Missed: On the Word Attack subtest, the long a-e pattern in nonsense words (gaked, straced); the long i-e pattern in nonsense words (quiles)

Deficit Skill: Decoding words with the long vowel-consonant-silent e pattern

Probe: Decoding words orally to teacher: cake, make, snake, rate, lake, fake, like, bike, kite

Criterion:
– Decode 10/10 words for mastery
– Decode 8/10 to 6/10 words for instructional level
– Decode 5/10 words or less for failure level; assess prerequisite skill level: discrimination of long/short vowels (vowels: a, i)

32

Test Construction Considerations

Must consider. . .

(1) Which specific tasks should be included

(2) How performance should be judged as mastered or not mastered

33

Setting Standards

(1) Judgment of Test Content

(2) Judgment of Individual Test Takers

(3) Assessing Error Rates

34

Judgment of Test Content

Standard setters (judges) analyze test items to determine how many should be passed to reflect minimal proficiency; consider the "borderline" test taker.

35

Discussion Guide:

• Based on the guidelines just presented, consider what diagnostic assessment you could administer to your students.
  – Do you currently use a diagnostic assessment? If so, which one? Does it meet the criteria we just discussed?
  – What are the challenges of using diagnostic assessments with your students?

36

Woodcock Reading Mastery Test-Revised

• Purpose:
  – Norm-referenced assessment that provides diagnostic information for instructional decision making
• Content:
  – Letter identification, word identification, word comprehension (antonyms, synonyms, analogies), general reading vocabulary, science-mathematics vocabulary, social studies vocabulary, humanities vocabulary, passage comprehension
• Evaluation:
  – BOY October/November
  – EOY March/April

37

Appropriate for: Grades K-16; ages 5 years 0 months through 75

Time: 10-30 minutes for each cluster of individually administered tests

Reliability (internal):
• Tests median = .91 (range .68 to .98)
• Clusters median = .95 (range .87 to .98)
• Total median = .97 (range .86 to .99)

Total Reading Full Scale

• Readiness
  – Visual-Auditory Learning
  – Letter Identification
• Basic Skills
  – Word Identification
  – Word Attack
• Comprehension
  – Word Comprehension
  – Passage Comprehension

Readiness (Form G Only)

• Visual-Auditory Learning
  – A task to determine if the student can associate symbols with words
  – Tests memory, attention, grouping of word parts (e.g., -ing with verbs)
• Letter Identification
  – Alphabet recognition
  – Different fonts
  – Print and cursive

Basic Skills

• Test 3: Word Identification
  – Reading words
  – Begins with one word on a page and advances to multiple words
  – 106 items in increasing difficulty
  – The student does not need to know what any of the words mean
  – Average score for a kindergarten student is 1
  – Average score for a student in 12th grade is 96

Basic Skills

• Test 4: Word Attack
• Reading two types of words:
  – Nonsense words
  – Words with very low frequency usage
• Measures the ability to apply phonic and structural analysis skills
• Training is provided so the student will know how to approach the test

Comprehension

• Test 5: Word Comprehension
  – 3 subtests
  – Each begins with sample items
  – Training continues until the student completes the item correctly

• Subtest 5A: Antonyms
  – Measures ability to read a word and respond orally with a word opposite in meaning

Comprehension

• Subtest 5B: Synonyms
  – Comprehension of reading vocabulary
  – Read a word and state another word similar in meaning
  – Synonyms are "a more difficult cognitive processing task than Antonyms" (p. 7)

Comprehension

• Subtest 5C: Analogies
  – Read a pair of words, ascertain the relationship, read the first word of the second pair, and use the same relationship to supply a word to complete the analogy
• Demonstrates content-embedded word knowledge

Word Comprehension Reading Vocabularies

• General reading
• Science-mathematics
• Social studies
• Humanities

Comprehension

• Test 6: Passage Comprehension
• Modified cloze procedure
  – Short passage with a blank line
  – Student supplies a word that "fits" in the blank
  – The first third of the passages are one sentence long and have a picture related to the text

Materials

• Examiner protocol
• Test record booklet
• Stimulus book/easel pages for student
• Clipboard
• Pencil

49

Administration

• Administer:
  – Test 3: Word Identification
  – Test 4: Word Attack (if possible)
  – Test 5A: Word Comprehension (Antonyms subtest)
  – Test 5B: Word Comprehension (Synonyms subtest)
  – Test 5C: Word Comprehension (Analogies subtest)

• The test battery will take an experienced tester about 45 minutes

Basal Rules

• Test by complete pages
• Start at the points indicated in the tables in the test easel
• If the student is correct on the first 6 items, a basal is established.
• If fewer than 6 are correct, go back a page and administer the whole page.
• Continue to test backwards, starting with the first item on a page, until the first 6 items on a page are correctly answered.

Ceiling Rules

• 6 or more consecutively failed items that end with the last item on a test page.
• See page 22 for an example of basal and ceiling scoring.

Word Identification

• You MUST know how to pronounce the words in the test (pp. 28-29)
• A table of suggested starting points is provided in the easel
  – If the student does not respond to the first item, score it 0, then say the word and ask the student to repeat it
• NO OTHER WORDS WILL BE READ TO THE STUDENT
• WRITE what the student said for incorrect responses
• Write any comments the student makes

Word Attack

• If the student scores 0 or 1 on Word Identification, a score of 0 can be recorded for Word Attack
  – (For our practice, don't do this)
• Begin with the 2 sample items; then proceed to item 1
• Study the pronunciation guide (pp. 28-29)
• The student must answer within 5 seconds
• The "word" must be read naturally, not sounded out, for the final reading
• WRITE what the student says

Word Comprehension

• For all three subtests, the student reads the item aloud and responds orally
• Only single-word responses are acceptable
• Mispronunciations are not errors
• WRITE what the student says
• Begin with the practice item in each subtest

Scoring

• Score as you administer the test
• Score each item 1 or 0
• Write any comments and erroneous responses
• Raw score is the sum of correct responses, plus 1 point for every item below the basal
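The raw-score rule above is easy to mechanize. A minimal sketch, assuming testing begins at the basal item and items are numbered from 1 (the item numbers and scores below are hypothetical):

```python
# Sketch of the WRMT-R raw-score rule: correct responses plus one
# point for every item below the basal. Assumes the first item
# administered sits just above the credited items.

def raw_score(first_item_administered, item_scores):
    """item_scores: 1/0 for each administered item, in order."""
    credited_below_basal = first_item_administered - 1
    return credited_below_basal + sum(item_scores)

# Testing began at item 21; 8 administered items correct: 20 + 8 = 28
print(raw_score(21, [1, 1, 1, 1, 1, 1, 1, 0, 1, 0, 0, 0, 0, 0, 0]))
```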

Scoring Word Comprehension

• Antonyms and Synonyms combined score
  – Calculate the score for each subtest
  – Add them
  – Convert this raw score to a part score
  – Record in the box labeled 5A+5B part score
  – Convert the Analogies raw score to a part score
  – Sum both part scores for a Word Comprehension W score

Reading Vocabularies

• Count the correct responses for the Test 5 subtests
• The designation for each response is coded on the test record:
  – G: general reading
  – SM: science and math
  – SOC: social studies
  – H: humanities

Levels of Interpretive Information

• Level 1: Analysis of Errors
• Level 2: Level of Development
• Level 3: Quality of Performance
• Level 4: Standing in a Group

59

Level 1: Analysis of Errors

• Individual item responses
• Descriptive of a subject's performance on precisely defined skills

60

Level 2: Level of Development

• Sum of item scores
• Raw score
• Rasch ability score (test W score, subtest part score, cluster W score)
• Grade equivalent
• Age equivalent

61

Age and Grade Calculations

• Age is standard; use the AGS calculator if you wish
• Grade placement is by tenths of the school year
  – See table in the test protocol or on page 32

Chronological Age

                     Years   Months   Days
Date of Test:         2000     11      22
Date of Birth:        1992     04      03
Chronological Age:       8      7      19   =  8-8*

*If the number of days exceeds 15, round up to the nearest month. If the number of days is 15 or less, round down to the nearest month.

63

Chronological Age

                     Years   Months   Days
Date of Test:         2000     02      17
Date of Birth:        1990     08      05
Chronological Age:       9      6      12   =  9-6*

*If the number of days exceeds 15, round up to the nearest month. If the number of days is 15 or less, round down to the nearest month.

64

Chronological Age: Your Turn

                     Years   Months   Days
Date of Test:         2000     05      15
Date of Birth:        1989     09      24
Chronological Age:    ____     ____    ____*

*If the number of days exceeds 15, round up to the nearest month. If the number of days is 15 or less, round down to the nearest month.

65
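The borrowing-and-rounding procedure in these examples can be written as a short routine. A sketch that mirrors the slides' arithmetic, assuming 30-day months when borrowing:

```python
# Chronological age with the rounding rule from the slides:
# days > 15 round up to the next month; 15 or fewer round down.

def chronological_age(test_ymd, birth_ymd):
    (ty, tm, td), (by, bm, bd) = test_ymd, birth_ymd
    if td < bd:              # borrow 30 days from the month column
        td += 30
        tm -= 1
    if tm < bm:              # borrow 12 months from the year column
        tm += 12
        ty -= 1
    years, months, days = ty - by, tm - bm, td - bd
    if days > 15:            # round up to the nearest month
        months += 1
        if months == 12:
            years, months = years + 1, 0
    return f"{years}-{months}"

print(chronological_age((2000, 11, 22), (1992, 4, 3)))   # 8-8
print(chronological_age((2000, 5, 15), (1989, 9, 24)))   # 10-8 (Your Turn)
```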

Raw Scores

The number of items the student answered correctly on a given test.

Calculation:

(1) Count the number of correct test items

(2) Divide the number of correct items by the total number of test items to obtain the percentage correct
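The percentage step above, as a one-line check:

```python
# Percentage correct from a raw score, as in step (2) above.

def percent_correct(num_correct, num_items):
    return 100 * num_correct / num_items

print(round(percent_correct(75, 83), 1))  # hypothetical: 75/83 -> 90.4
```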

Raw Scores

When to Use:
• The raw score is the starting point for all norm-referenced scores
• Only appropriate when comparisons to other students or other (nonalternate form) tests are not needed
• Raw scores can only be used in reference to criterion- or individual-referenced evaluations, not norm-referenced ones
• Raw scores provide a better basis for interpretation when they are summarized as percentages (e.g., summarizing a raw score as 90% correct is more informative than stating that a student had 75 items correct)

Raw Scores

Advantages and Disadvantages:
• Advantages
  – Can be expressed as the number or percentage correct
  – Can be used to measure mastery or improvement
• Disadvantages
  – Limited interpretability: it is not possible to use them to compare performance across time, students, tests, or content

Grade- and Age-Equivalent Scores

Grade- and age-equivalent scores express the student's performance developmentally in terms of a corresponding age or grade level.

• Usually, age scores are reported in years and months.

  If Jan, who is 10 years and 3 months old, has an age score of 12-5, her performance theoretically is the same as that of a child who is 12 years and 5 months old.

• Grade scores are reported in grade levels to the nearest tenth, which corresponds to academic months.

  If Jack, a 4th grader, has a grade-equivalent score of 6.1, he is performing at the 6th grade, first month level.

Grade- and Age-Equivalent Scores

When to Use:
• The American Psychological Association (APA) and the National Council on Measurement in Education (NCME) have recommended against the use of grade- and age-equivalent scores in making any educational decisions

Grade- and Age-Equivalent Scores

Advantages and Disadvantages:
Critical disadvantages outweigh their simple advantages:

(1) Both scores, especially grade scores, are based on the assumption that learning occurs consistently across the year, which has not been proven

(2) We cannot say with accuracy that a 4th grader with a grade score of 6.1 performs like all 6th graders in their first month

(3) Age- and grade-equivalent scores are not measured in equal-interval units; therefore they cannot be added or subtracted to examine improvement

(4) Age- and grade-equivalent scores are often derived by extrapolating and interpolating from normative data

Level 3: Quality of Performance

• Performance on a reference task
• W-difference score (DIFF); instructional range; Relative Performance Index (RPI)

72

Level 4: Standing in a Group

• Deviation from a reference point in a group

• Rank order
• Standard score
• Percentile rank

73

Percentiles

• Percentile: a point in a distribution at or below which a given percent of the observations lie
  – For example, 10% of observations lie at or below the 10th percentile.

Percentiles

• The term percentile refers to the percentage of individuals in a fixed standardization sample with equal or lower scores
• Percentile rank represents the area of the normal curve, expressed as a percentage, that falls below a certain value

  If Sue's raw score of 13 has a percentile rank of 85, then 85% of the population upon which the test is based scored at or below 13; 15% of the standardization sample scored above 13.

• Percentiles range from 1 to 99, never 0 or 100
• The 50th percentile is equal to the median
• For increased accuracy, percentiles may be reported in decimals, so some tests may range from .1 to 99.9.
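A minimal sketch of the "equal or lower scores" definition above, using a hypothetical standardization sample:

```python
# Percentile rank: percentage of the norm sample scoring at or below
# a given raw score. The norm scores here are hypothetical.

def percentile_rank(score, norm_sample):
    at_or_below = sum(1 for s in norm_sample if s <= score)
    return 100 * at_or_below / len(norm_sample)

norms = [5, 7, 8, 9, 10, 11, 11, 12, 13, 14]
print(percentile_rank(13, norms))  # 90.0 -> 90th percentile
```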

Standard Scores

• Standard scores represent a linear transformation of raw scores into standard deviation units.
• A standard score reflects the student's standing relative to others in the distribution on the basis of variation.
• Translating raw scores into a set of equal-interval standard scores means that there will always be a consistent mean and standard deviation.
• Three types:
  (1) Z-Score
  (2) T-Score
  (3) Standard Scaled Score

Standard Scores: Z-Scores

• The Z-score transformation changes raw scores into deviation units, where the group or test mean is equal to 0.0 and the standard deviation is 1.0.
• A Z-score is a measure of the number of standard deviation units away from the test mean.
• Important interpretive indices: (a) the sign (+ or -) and (b) the size of the score (the larger the absolute value, the farther the score falls above or below the mean).

z = (x - X) / SD

where x = raw score, X = mean, SD = standard deviation

Standard Scores: T-Scores

• The T-score transformation takes raw scores and changes them to equal-interval units, where the mean is 50 and the standard deviation is 10.
• Once a raw score is converted to a z-score, multiply the z-score by 10, then add 50.

T = 10z + 50

• Virtually all T-scores are positive, since it would take a z-score of less than -5.0 to produce a T-score below zero.

Standard Scores:Standard Scaled Scores (SS)

• With standard scaled scores (SS), raw scores are transformed to a scale where the mean is 100 and the standard deviation is 15

• The standard scaled score has become popular on recent tests of intelligence and achievement

SS = 15z + 100
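The three transformations just described, applied to one hypothetical raw score:

```python
# z, T, and standard scaled score (SS) from a raw score, given the
# group mean and standard deviation. Values are hypothetical.

def z_score(x, mean, sd):
    return (x - mean) / sd

def t_score(z):
    return 10 * z + 50

def scaled_score(z):
    return 15 * z + 100

z = z_score(x=13, mean=10, sd=2)   # raw score 13 in a mean-10, SD-2 group
print(z, t_score(z), scaled_score(z))   # 1.5 65.0 122.5
```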

[Table: Comparison of raw scores, z-scores, T-scores, and standard scores (Tindal & Marston, 1990, p. 340)]

Standard Scores

When to Use:• Can be used to summarize student performance on a range of different measures because they place all measures on a common scale

• If we want to perform arithmetic operations on the measures (e.g., the pretest performance is to be subtracted from the posttest performance), then we must use standard scores

• If we want to identify a student’s real position in the distribution, then we must use standard scores

Standard Scores

• All three types of standard scores are equal interval units so they may be added, subtracted, and arithmetically transformed.

• Can show at a glance how far a student is from the mean and his or her position on a normal distribution

• When scores from different tests are compared to each other, standard scores are scale-free and, therefore, can be directly compared

• Can be used to examine absolute changes in relative performance; calculate them by subtracting the pretest score from the posttest score

• Can be used to determine growth and can reflect improvements in relative standing (this cannot be done with raw scores)

Profiles and Error Analysis

• Instructional Level Profile
• Percentile Rank Profile
• Diagnostic Profiles
• Word Attack Error Analysis

84

86

Individual-Referenced Evaluation

• Measures of student performance over time and comparison of successive values to previous values

• Progress Monitoring!

87

Components

• Direct measurement
• Repeated measurement
• Time-series graphic displays
• Data utilization and decision rules

88

Interpreting Data: Making Instructional Decisions

• Summarizing Performance
• Decision-Making
  – Goal-oriented decisions
  – Intervention-oriented decisions

Discussion Guide:

• Based on the guidelines just presented, consider what individual-referenced evaluation/progress monitoring measures you currently administer to your students (grades 3-6).
  – What assessments, if any, do you currently use?
    • What works/what doesn't work (e.g., what are some of the challenges with the assessments you use)?
    • How do you use the assessment information? What kinds of instructional decisions do you make?
  – If you don't currently use individual-referenced evaluation/progress monitoring measures, what types of progress monitoring measures would be helpful to your instruction?

89

90

MAZE

• Multiple choice
• Cloze task
• 150-400-word grade-level passage
• First sentence intact
• Thereafter, every 7th word replaced with 3 words inside parentheses
• 2.5 minutes
• Group administered

92
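The probe format above (first sentence intact, then every 7th word replaced with three choices) can be sketched as follows. Drawing distractors at random from the passage is a simplifying assumption; real probes select distractors deliberately, and punctuation handling is simplified here:

```python
import random

# Rough sketch of MAZE probe construction as described above.

def build_maze(passage):
    first, _, rest = passage.partition(". ")
    words = rest.split()
    pool = [w.strip(".,") for w in words]
    for i in range(6, len(words), 7):          # every 7th word
        target = words[i].strip(".,")
        distractors = random.sample([w for w in pool if w != target], 2)
        choices = [target] + distractors
        random.shuffle(choices)
        words[i] = "(" + "/".join(choices) + ")"
    return first + ". " + " ".join(words)

print(build_maze("Once upon a time there was a merchant whose wife died "
                 "leaving him with three daughters. The two older daughters "
                 "were good-looking but very disagreeable."))
```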

Target Levels for Passage-Level Placement

Grade Level   Level           WCPM     Errors
1-2           Frustration     <40      >4
              Instructional   40-60    4 or less
              Mastery         >60      4 or less
3-6           Frustration     <70      >6
              Instructional   70-100   6 or less
              Mastery         >100     6 or less

93
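The placement table reads naturally as a small decision function. This sketch treats an error count above the limit as Frustration regardless of WCPM, which is one plausible reading of the table:

```python
# Passage-level placement from the target-levels table above.

def placement_level(grade, wcpm, errors):
    lo, hi, max_err = (40, 60, 4) if grade <= 2 else (70, 100, 6)
    if wcpm < lo or errors > max_err:
        return "Frustration"
    return "Instructional" if wcpm <= hi else "Mastery"

print(placement_level(grade=4, wcpm=85, errors=3))   # Instructional
print(placement_level(grade=2, wcpm=72, errors=2))   # Mastery
```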

Materials

• 2 sources
  – Pro-Ed (Macintosh platform)
    • Administered, scored, and maintained on Macintosh
  – Aimsweb
    • Graded passages available for purchase
    • Data storage and graphing services also available

94

Aimsweb Sample Passage

• Once upon a time there was a merchant whose wife died leaving him with three daughters. The two older daughters were good-looking (but, stand, then) very disagreeable. They cared only for (until, themselves, himself) and for their appearance; they spent (palace, wicked, most) of the time admiring their reflections (in, of, turned) a looking glass.

95

Materials

• Student copy of MAZE probe
• Examiner copy of MAZE probe
• Stopwatch (time for 3 minutes)
• Pencil/Pen
• Ruler or marking card to help children with line tracking (optional)

96

Directions

• Tell students to read the passage and select the correct word from among the 3 options
• Decide if a practice test is needed
• Time for 3 minutes
• Collect the MAZE probes

97

Accommodations

• Retest (different day, different probe, different conditions)
• Alternate setting
• Check for understanding of directions
• Explicit instruction (comprehension strategies) + retest
• Practice activities, "mini lessons," use MAZE as lesson warm-up

98

Accommodations

• Enlarge print
• Reduce amount of text on page
• Color code and highlight choices
• Ruler

99

Have you ever had nothing to do? Sometimes when I have nothing to (do/can/so), I take a walk. That’s when (to/I/do) kick stones. I look for cans (to/bet/kit) kick.

100

Have you ever had nothing to do? Sometimes when I have nothing to (________), do can so

101

Practice: Administration

• Practice administering the MAZE with a partner. Role play the roles of teacher and student. Follow the administration guidelines as you practice.

• Evaluate fidelity by using the “Checking Out Accuracy in Test Administration” checklist.

102

Scoring

• Total number correct in 3 minutes• Total number of errors in 3 minutes• % Accuracy in 3 minutes

( = total correct / total completed )

• Students receive 1 point for each correctly circled answer. Blanks and no circles are counted as errors. Scoring is discontinued if 3 consecutive errors are made. The number of correct answers within 3 minutes is the student score.

103
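A minimal sketch of the scoring rule above, modeling each response as correct (True), incorrect (False), or blank (None):

```python
# MAZE scoring: 1 point per correct circle; blanks count as errors;
# scoring stops after 3 consecutive errors.

def maze_score(responses):
    score, consecutive_errors = 0, 0
    for r in responses:
        if r is True:
            score += 1
            consecutive_errors = 0
        else:                      # incorrect answer or blank
            consecutive_errors += 1
            if consecutive_errors == 3:
                break              # discontinue scoring
    return score

# 5 correct, then 3 consecutive errors; later items are ignored
print(maze_score([True, True, False, True, True, True,
                  False, False, None, True, True]))   # 5
```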

Kicking Stones

Have you ever had nothing to do? Sometimes when I have nothing to (do/can/so), I take a walk. That's when (to/I/do) kick stones. I look for cans (to/bet/kit) kick. If I can't find any (you/cans/pal) to kick, I just kick stones. (He/I/Be) look for big stones to kick. (Be/I/Me) walk down the road kicking one (kiss/stone/from) after another. This means I have (wishful/nothing/pressed) else I can think of doing.

104

Prorating

• Some students may finish all of the items before the 3 minutes is up. Remember to prorate the score.

• See AIMSWeb Scoring Directions

105
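AIMSweb's exact prorating arithmetic should be taken from its scoring directions; the generic rate-based version scales the score up to the full time limit:

```python
# Prorate an early finish to the full time limit (an assumption about
# the formula, not a quote from the AIMSweb directions).

def prorated_score(correct, seconds_used, time_limit_seconds=180):
    return correct * time_limit_seconds / seconds_used

print(prorated_score(correct=16, seconds_used=150))   # 19.2
```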

Practice: Scoring

• Practice scoring the MAZE. Determine MAZE scores for Rick and Dave.
  – Total number correct
  – Total number of errors
  – % Accuracy ( = total correct / total completed )

106

Practice: Alternate Scoring

• Practice scoring the MAZE. Determine Juan's MAZE score using the 3-consecutive-errors approach.
  – Total correct (with scoring discontinued prior to 3 consecutive errors):

107

Practice: Alternate Scoring

• ANSWER
  – Juan circled 16 correct answers in 2.5 minutes. He circled 7 incorrect answers. However, Juan did make 3 consecutive mistakes, and 5 of his correct answers came after his 3 consecutive mistakes. Juan's score on the MAZE would be 11 (16 minus the 5 correct answers after the discontinue point).

108

MAZE Benchmarks

• Typical goal: 0.4 correct choices per week
• Ambitious goal: 0.7 correct choices per week (~18 words in 25 weeks)
• Analyze data after at least 6 scores and/or at least 3 weeks
• Make an instructional change if the most recent 4 scores are all below the goal line

(Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993)

109
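A sketch of these decision rules, assuming one score per week and a straight-line goal from a baseline score (the goal-line construction here is an illustrative assumption):

```python
# Flag an instructional change when the 4 most recent scores all fall
# below the goal line, after at least 6 scores have been collected.

def needs_instructional_change(scores, baseline, weekly_goal=0.4):
    if len(scores) < 6:
        return False               # not enough data to analyze yet
    goal_line = [baseline + weekly_goal * w for w in range(1, len(scores) + 1)]
    recent = list(zip(scores, goal_line))[-4:]
    return all(score < goal for score, goal in recent)

weekly_scores = [4, 5, 5, 5, 5, 5, 6]   # hypothetical weekly MAZE scores
print(needs_instructional_change(weekly_scores, baseline=4.0))   # True
```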

MAZE Benchmarks

Grade           Spring Benchmark (at or above 85% accuracy)
First Grade     8 correct
Second Grade    8 correct
Third Grade     13 to 15 correct
Fourth Grade    20 correct replacements per 2.5 minutes
Fifth Grade     25 correct replacements per 2.5 minutes
Sixth Grade     30 correct replacements per 2.5 minutes

110

Guidelines and Decision Rules

• Move up to the next instructional level on the MAZE when students receive a score at or above the target benchmark on 3 consecutive probes.
  – NOTE: Use instructional judgment. If students are not getting 85% or above accuracy, continue until errors are reduced.

111

Guidelines and Decision Rules

• Administer 2 x per month if the MAZE is the only comprehension progress monitoring measure used

• Administer 1 x per month if another comprehension monitoring measure is used along with the MAZE (e.g., www.easycbm.com; multiple choice comprehension)

112

Look at the Data!

MAZE - Second Grade

[Chart: MAZE scores for Students 1-3 across 6 assessments]

113

Some Difficulties Revealed Through Assessment

• Decoding
• Word Type/Use
• Vocabulary
• Identifying the main idea
• Distinguishing between relevant & extraneous information
• Following the sequence of information
• Summarizing information presented
• Making inferences/applying concepts presented beyond the scope of a particular passage

114

Linking Assessment and Instruction: Error Analysis

115

Practice: Linking Assessment and Instruction

• Use the MAZE Error Analysis to determine if there are any error patterns with Juan's MAZE.
  – What, if any, error patterns do you note?
  – Based on your analysis, what instructional intervention could be used to help Juan improve his reading comprehension?

116

Practice: Linking Assessment and Instruction

• Look at the MAZE scores of two third-graders, a typically performing student and a low-performing student. What do you observe?
  – What differences did you observe in Emma's and Abby's MAZE performance?
  – What other conclusions can you draw?
  – What interventions would help improve Emma's and Abby's performance? Identify the intervention and the strategy/skill the intervention will address.

117

118

Multiple Choice

Reading Comprehension

http://www.easycbm.com

Materials

• Student copy of probe
  – http://www.easycbm.com/
• Students can also take the test online
• Untimed

120

121

122

Directions

• Please read the story and answer the questions that come after it.

123

Getting Passages

124

Entering a Student Score

125

Entering a Student Score

126

Viewing a Score Report

127

Viewing a Score Report

128

Viewing a Score Report

129

Sample Score Report

130

Sample Score Report

131

Scoring

• Total number correct

132

Multiple Choice Comprehension Benchmarks

Grade           Spring Benchmark
Second Grade    10
Third Grade     15
Fourth Grade    15
Fifth Grade     16
Sixth Grade     16

133

Guidelines and Decision Rules

• Use 1 x per month (MAZE is also used 1 x per month)

• Move up to the next instructional level when students receive a score at or above the target benchmark on 3 consecutive probes.

134

Practice: Linking Assessment and Instruction

• Look at the Multiple Choice Comprehension materials. How could you use the information from the assessment for instructional decision making? What are some ways that you could analyze student errors?

135

Error Analysis

136

137

Depth of Vocabulary Knowledge

Progress Monitoring

• DIBELS: LNF, WIF, or MAZE/other comprehension measure
  – LNF: PreK-K
  – WIF: Grade 1
  – MAZE: Grades 1-3 (or other comprehension measure)
• AND Depth of Knowledge Vocabulary (DOK)

138

Vocabulary Depth of Knowledge (DOK) Assessment

• Purpose:
  – Assess depth of vocabulary knowledge
  – Use diagnostically for instructional decision making
• Definition Knowledge
  – No/Faulty Knowledge
  – Developing Knowledge
  – Accurate Knowledge
• Use Knowledge
  – No/Faulty Use
  – Basic Use
  – Complex Use

139

Importance of Vocabulary

• “Our knowledge of words. . .determines how we understand texts, define ourselves for others, and define the way we see the world.” – Stahl (1999)

• Expressive vocabulary at the end of first grade is a significant predictor of reading comprehension ten years later (Cunningham & Stanovich, 1997)

140

What the research says:

• The relationship between reading comprehension and vocabulary knowledge is strong and unequivocal (Baumann & Kame’enui, 1991; Stanovich, 1986).

• Even weak readers’ vocabulary knowledge is strongly correlated with the amount of reading they do (Cunningham & Stanovich, 1998).

141

Materials

• Student: NO PROBE!
• Examiner copy of DOK directions and probe
• Scoring Rubric
• Probe Definition Sheet
• Pencil/Pen
• Stopwatch/Timer
• Clipboard (optional)
• Administration Partner for Co-Administration (optional)

142

Sample Probe

    Response     Define   Use   Total
1   excite
2   talk
3   head
4   trip
5   peaceful
6   daughter
7   pay
8   white
9   pants

143

When Do You Administer the DOK?

• Depends on your instructional purpose!
• For progress monitoring, 1x per month is recommended (minimum)
  – 2x per month is ideal
• Or, use as a pretest and posttest to assess student knowledge of vocabulary from an instructional unit or series of lessons

144

Example: Variability

[Chart: Test scores (0-100) across successive days, illustrating day-to-day variability]

145

Word Lists: K-1 and 2-3

• Probes (K-1 and 2-3) provided by the PaTTAN D/HH Promising Practices Project are based on words from the DIBELS Word Use Fluency measure.
  – Words for each probe were randomly selected from the pool of DIBELS Word Use Fluency words.
  – Expert consultants with extensive experience working with students who are deaf or hard of hearing identified the final words used on the probes (e.g., considerations were made for signing, multiple use, etc.)

146

Word Lists: 3-4 and 5-6

• Probes (3-4 and 5-6) provided by the PaTTAN D/HH Promising Practices Project are based on words from A. Biemiller's Words Worth Teaching (2010).
  – Words for each probe were randomly selected from the pool of "Easy" words (4 words) and "Words Worth Teaching" words (6 words).
  – Expert consultants with extensive experience working with students who are deaf or hard of hearing identified the final words used on the probes (e.g., considerations were made for signing, multiple use, etc.)

147

Maintain High Expectations When Thinking About Word Selection

• Consider sources like:
  – A. Biemiller (2010). Words Worth Teaching: Closing the Vocabulary Gap. Columbus, OH: McGraw-Hill/SRA.
  – SAS (Pennsylvania's Standards-Aligned System)

– Your reading program, school/district curriculum, or instructional unit

148

Word Selection Guidelines

• Select taught and untaught words.
• Select words that are critical to comprehension.
• Select words that students are likely to encounter in the future and are generally useful.
• Select words that will give students a more precise understanding of something they already know (e.g., plain-ordinary; rested-refreshed; shiny-glisten).
• Use the Goldilocks Approach (Stahl & Stahl, 2004):
  – Not too easy, not too difficult. . . just right.
  – Sometimes "just right" can be hard and challenging!

149

Directions

Sign or say these specific directions to the student:

"I'm going to ask you about some words. I'll ask you to tell me what each word means, then I'll ask you to use the word in a sentence. For example, if I say 'what does sad mean?' you could say, 'Sad is when you are not happy.' If I say 'use the word sad in a sentence' you could say, 'I was sad when my ice cream fell on the floor.'"

"Now it's your turn. What is a chair?"

CORRECT RESPONSE: If the student gives a correct response, say: "Very good."
INCORRECT RESPONSE: If the student does not respond or gives an incorrect definition, say: "A chair is something you sit in."

150

Directions

• “Now use the word ‘chair’.”

• “If you don’t know what a word means, or how to use a word, it is OK to say, ‘I don’t know.’”

• “OK. Here is your first word.” [Time for 15 minutes]• For each item, say “What does ______ mean?” or “What is a _____?”

After the student responds, say “Now use the word _____ .”

CORRECT RESPONSE:If student gives a correct response, say:

INCORRECT RESPONSE:If student does not respond or uses the word incorrectly, say:

“Very good.” “I sat in my chair all day at school.”

151

Directions

3. During the directions, you may also give additional examples to help explain definition and use. Do not take more than 5 minutes to present additional examples.

4. Give the student the first word and start your stopwatch.

5. Minor Prompts/Major Prompts

152

Major Prompts

• Definition: “Remember, tell me what the word _______ means. Describe the word the best that you can.”

• Use: “Now use the word. Use the word in a sentence or try to use the word as you communicate.”

• Use: “Put the word into your sentence, like this [visual]. Use the word, _______, with some of your own words.”* You may use a major prompt for both definition and use 2x during the assessment (e.g., 2 total major prompts for definition and 2 total major prompts for use across all items).

153

Minor Prompts

• Definition: “What does ____ mean?”• Definition: “Now use the word______.”• Use: “Now use the word ______.”

There is no limit on minor prompts. Use as many as needed per item.

154

Sample Probe

    Response     Define   Use   Total
1   excite
2   talk
3   head
4   trip
5   peaceful
6   daughter
7   pay
8   white
9   pants

For each item, record the definition (D) and use (U) responses in the space provided, noting any major prompts in brackets (e.g., [D-mp1], [D-mp2], [U-mp1]).

155

Directions

6. Provide the next word promptly, or when the student hesitates or pauses for 5 seconds.

7. Record the student's response in the space provided. Use "NR" for "no response" and "DK" for "I don't know."

8. For each word, it's okay to elicit (sign) multiple meanings. If the word has several meanings, try to prompt for just two to three alternate meanings.

9. If the student gives a partial or ambiguous definition, follow up by signing or saying, "Tell me more about what _____ means" or "Tell me more about a ______."

156

Follow-up Prompts

Follow-up prompts may be used once to clarify the definition and use for each item.

157

    Response     Define   Use   Total
1   excite
2   talk
3   head
4   trip
5   peaceful
6   daughter
7   pay
8   white
9   pants

Mark each follow-up prompt with an "F" next to the recorded response (e.g., D: student response [D-mp1] F).

Directions

10. Discontinue Rule: If a student has not given any correct response for the first 4 words, discontinue the task. The student scores a 0 for the assessment.

11. Encourage responses with neutral praise (examples: "I like how hard you are working." "Good job with your thinking."). If the student becomes frustrated, tell them it's okay if they don't know all of the words.

12. If a student acts out or dramatizes the word (e.g., acts out a word like "snore" without signing), prompt the student by saying, "Tell me what ______ means using words/sign." (If the student is still not able to provide the definition in words, write "acted out" on the score sheet.)

13. If the student begins to ramble or goes off task, redirect the student back to the task.

158

Practice: Administration

• Practice administering the DOK with a partner. Role play the roles of teacher and student. Follow the administration guidelines as you practice.

159

Scoring

    Response     Define   Use   Total
1   excite
2   talk
3   head
4   trip
5   peaceful
6   daughter
7   pay
8   white
9   pants

160

Scoring

• Definition
  – No/Faulty Knowledge = 0 points
  – Developing Knowledge = 1 point
  – Accurate Knowledge = 2 points
• Use
  – No/Faulty Use = 0 points
  – Basic Use = 1 point
  – Complex Use = 2 points

161

Alternate Definitions and Use

• Definitions
  – If a student provides more than one definition of a word, score each alternate definition.
  – For example, if a student defines "fly" as an insect and as "an aircraft flies," score each of the definitions using the No/Faulty Knowledge-Developing Knowledge-Accurate Knowledge criteria.
  – Add BOTH definition scores for the word's overall Define Score.
• Use
  – If a student provides more than one contextual use of a word, score each alternate use.
  – For example, if a student uses "fly" as an insect and "an aircraft flies" in context (e.g., two different sentences), score each of the uses using the No/Faulty Use-Basic Use-Complex Use criteria.
  – Add BOTH use scores for the word's overall Use Score.

162
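A minimal sketch of the bookkeeping above. The 0-2 ratings come from the examiner's rubric judgments; the example ratings are hypothetical:

```python
# DOK word score: Define and Use are each rated 0-2 per response,
# alternate definitions/uses are summed, and total = Define + Use.

def dok_word_score(define_ratings, use_ratings):
    define = sum(define_ratings)   # sum across alternate definitions
    use = sum(use_ratings)         # sum across alternate uses
    return {"define": define, "use": use, "total": define + use}

# e.g., two alternate definitions rated 2 and 1, one use rated 1
print(dok_word_score(define_ratings=[2, 1], use_ratings=[1]))
# {'define': 3, 'use': 1, 'total': 4}
```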

Examples

• Target word: “coat”– (1) A piece of clothing with long sleeves which

you wear over your clothes when you go outside (2) An outer covering of an animal (3) A thin layer of a substance (coat of paint)

• Student response:– “A coat is a jacket. I have a really thick coat that

I only wear when it’s really cold.”• Definition Score:• Use Score:• Total Score:

1

23 163

Examples

• Target Word: “proud”– (1) If you feel proud, you feel pleased

about something good you have done, or about something good that a person close to you has done (2) Someone who is proud who ahs respect for themselves (3) Someone who is proud feels that they are better or more important than other people

164

Examples

• Student response:
  – "Proud is when you feel good. I am proud."
• Definition Score: 1
• Use Score: 1
• Total Score: 2

165

Examples

• Student response:
  – "Proud is when you feel really good about something you did. I felt proud when I won the race."
• Definition Score: 2
• Use Score: 2
• Total Score: 4

166

Scoring Tips

Word: Coat

[Rubric grid: Define and Use rows, each with columns for 0-point, 1-point, and 2-point responses]

167

Practice: Scoring

• Practice scoring the DOK.

168

Guidelines and Decision Rules

• Purpose:
  – Diagnostic instructional decision making, progress monitoring
• Content:
  – Depth of vocabulary and word knowledge
• Implemented:
  – 1x per month for progress monitoring
  – Also okay to use 2x per month for progress monitoring and/or as a pretest-posttest for instructional units.

169

Look at the Data!

[Chart: DOK total scores by student, kindergarten (2009), Nov to Feb, Students 1-7 (shown on slides 170-172)]

Linking Assessment and Instruction

• Defining Knowledge
  – None
  – Developing
  – Accurate
• Use (Contextual Knowledge)
  – None
  – Basic
  – Complete
• Define vs. Use
• Alternate Understanding and Use (Multiple Meanings)
• Fluency
  – Timed vs. Untimed
• Word Types
  – Nouns, verbs, etc.
  – Levels

173

Practice: Linking Assessment and Instruction

• Look at the sample DOK probes for the two students in Grades 2-3. The probe has the words reader, afraid, lady, remember, nut, etc.

• What instructional decisions can you make based on the students’ DOK responses? What will you teach?

• Do students seem to have a more challenging time with word use, definitions, both? How will this inform your instruction?

174

Conclusions

• Assessment tells you what to teach
• Assessment tells you where you're going
• Assessment tells you how to teach and what instructional adjustments to make so teaching is more effective
• Assessment tells you when you get there
• Assessment tells you what to do next
• Assessment is instruction

175

Discussion Guide:

• What are the most common reading difficulties that you observe with your students? What are some of the biggest challenges you have with reading instruction?

• What questions, if any, do you have about instructional interventions for students in grades 3-6? What areas of instruction would be most helpful to you?

• Do you currently use any instructional strategies or interventions that work well? If yes, what are some of these strategies/interventions?

177

Assignment

• Use the assessments discussed today with your students. Assessment should include:
  – Diagnostic Assessment (your choice)
    • Administer 1x in the fall
  – Reading Progress Monitoring Assessment
    • Administer 2x per month
  – Depth of Vocabulary Knowledge Assessment
    • Administer 1x per month
• Bring your student data to the next training.
• See you on November 18th for our Instructional Interventions training!

178

Any Questions?

• Lana Edwards Santoro, Ph.D.
  Research Associate and Project Research Consultant
  University of Oregon, Center for Teaching and Learning
  Alexandria, VA 22310
  (703) 971-0310
  lana.santoro@earthlink.net
  lsantoro@pacificir.org
  (NOTE: email will be changing to "uoregon" account)

179

Contact Information
www.pattan.net

Marlene Schechter
mschechter@pattanpgh.net
1-800-446-5607, X6862

Sue Ann Houser
shouser@pattan.net
1-800-441-3215, X7243

Jane Watts
jwatts@pattan.net
1-800-360-7282, X3106

Commonwealth of Pennsylvania
Edward G. Rendell, Governor

Pennsylvania Department of Education
Gerald L. Zahorchak, D.Ed., Secretary

Diane Castelbuono, Deputy Secretary
Office of Elementary and Secondary Education

John J. Tommasini, Director
Bureau of Special Education

Patricia Hozella, Assistant Director
Bureau of Special Education

180

Bureau of Special Education
Pennsylvania Training and Technical Assistance Network