
Putting the “R” in RtI: Assessing Student Responsiveness through Norming, Screening and Progress Monitoring

Summer RtI Institute, July 30-31, 2007

Amanda Albertson, M.A., Courtney LeClair, M.A., and Stephanie Schmitz, Ed.S.


Agenda

Assessment: Curriculum-Based Measurement

Norming: uses, strengths and limitations, procedures and tips

Screening: choosing a measure, procedures and tips, decisions

Progress monitoring: procedures, data examples, decisions

RtI and Special Education Placement


Direct Assessment of Academic Skills

Curriculum-Based Measurement (CBM)

Contents of the assessment are based on the instructional curriculum.

Measures are presented in a standardized format.

Material for assessment is controlled for difficulty by grade levels.

Measures are generally brief.

Shapiro, E. S. (2004). Academic skills problems: Direct assessment and intervention (3rd ed.). New York: The Guilford Press.


Curriculum-Based Measurement (cont.): Advantages

Can be used efficiently by teachers

Produces accurate, meaningful information to index growth

Answers questions about the effectiveness of programs in producing academic growth

Provides information to help teachers plan better instructional programs

Fuchs, L. S., & Fuchs, D. (1997). Use of curriculum-based measurement in identifying students with disabilities. Focus on Exceptional Children, 30(3), 1-15.


Norming (a.k.a. Obtaining Normative Data)


Normative Data

“Provide information on student levels and range of performance at different grades, by indexing achievement cross-sectionally”

Provide “appropriate standards for weekly rates of academic growth”

Fuchs, L. S., & Fuchs, D. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22(1), 1-30.


Uses of Local Normative Data

Make decisions about referred students

Report individual and/or group scores to teachers, parents, or other agencies

Identify students proactively who aren’t keeping up with peers or benchmarks

Detect academic and behavioral trends over time

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


Strengths of Local Normative Data

Decrease the likelihood of bias in decision making

Provide a meaningful comparison group

Promote identification of educational needs within a systematic problem-solving orientation

Follow changing patterns of local performance

Clarify what is expected and the range of performance

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


Limitations of Local Normative Data

Threat of misinterpretation: the sample and measurement tasks must be clearly defined, and a small sample can make the norms unstable

Local performance is not necessarily acceptable performance; empirically derived benchmark rates may be used to determine whether students’ performance is acceptable

Local norms do not necessarily endorse the use of a particular curriculum; they show level of performance and rate of growth within the curriculum in use

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


Steps in Developing Local Norms

1. Identify norm sample

2. Choose materials

3. Decide who and how many students will be assessed

4. Collect the data

5. Organize the data for use

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


1. Identify the Norm Sample

Three basic levels: classroom, school building, school district

Consider: the decisions for which the data will be used, the amount of curriculum chaos in the district, the political and economic structure of the area, the characteristics of the population, and the economic and other resources available

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


2. Choosing Norming Measurement Tools

Tools should…

Be reliable

Be accurate

Have relatively normal distributions

Be sensitive to change

Provide enough opportunities to respond (to limit ceiling effects)

Have standardized administration and scoring

Reliably differentiate student level of skill

Be time efficient

Be affordable

Provide data important to general education expectations

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


Examples of Norming Measurement Tools

Dynamic Indicators of Basic Early Literacy Skills (DIBELS; http://dibels.uoregon.edu/): reading, grades K-6, Spanish and English

Aimsweb (www.aimsweb.com): reading (Spanish and English), math, and written expression, grades K-8


3. Implement a Sampling Plan

Balance the resources available, the representativeness of the sample, and the information desired.

Some questions can be answered without testing every child every year.

Your questions should drive the sampling plan!

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


Implement a Sampling Plan

Classroom norms: a minimum of 7-10 students, selected randomly (every nth student on the class list, or randomly from a pool of “typical” students; see the sketch after the citation below)

Building norms: a minimum of 15-20% of students in each grade and at least 20 students per grade, selected randomly; to compute percentile ranks, a minimum of 100 students per grade is needed

District norms: a random sample of 100 students per grade

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.
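A minimal sketch of the “every nth student” selection mentioned above, written in Python with a made-up class roster (the roster, sample size, and function name are illustrative assumptions, not part of the original presentation):

import random

def systematic_sample(roster, n_needed):
    # Pick roughly every nth student, starting from a random offset.
    step = max(1, len(roster) // n_needed)
    start = random.randrange(step)
    return [roster[i] for i in range(start, len(roster), step)][:n_needed]

# Hypothetical class list; a real roster would come from the school's records.
roster = ["Student %d" % i for i in range(1, 26)]
print(systematic_sample(roster, 8))   # e.g., 8 of 25 students for classroom norms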


National vs. Local Norms

National norms require less time and effort; you don’t have to collect normative data yourself.

National norms are readily accessible.

Local norms are more representative of your population.

Local norms are more sensitive.

Local norms allow you to choose the materials that are most appropriate to your building or district.


4. Collect the Data

Trimester norming (Fall, Winter, Spring).

Use equivalent but not identical materials each time.

Prepare student and examiner materials ahead of time.

Examiners should be trained to administer and score the measures.

Determine suitable locations for testing.

Determine appropriate dates for testing.

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


5. Organize Data for Use

Data can be summarized at four levels:

Individual student raw scores

Classroom: ranges of scores, medians, and rank orderings

Building: ranges of scores, medians, rank orderings, and percentile ranks

District: ranges of scores, descriptive statistics, within-grade frequency distributions, percentile ranks, and across-grade comparisons

Bollman, K. & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.


Computing Percentile Ranks

1. Construct a frequency distribution of the raw scores

2. For a given raw score, determine the cumulative frequency for all scores lower than the score of interest

3. Add half the frequency for the score of interest to the cumulative frequency value determined in Step 2

4. Divide the total by N, the number of examinees in the norm group, and multiply by 100 (a short sketch follows).

Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: Holt, Rinehart and Winston.
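The four steps above amount to a short calculation. A minimal Python sketch, using made-up scores (any spreadsheet can do the same with a frequency table):

from collections import Counter

def percentile_rank(scores, score_of_interest):
    # Steps 1-4 from the slide above.
    freq = Counter(scores)                                              # 1. frequency distribution
    below = sum(c for s, c in freq.items() if s < score_of_interest)    # 2. cumulative frequency below
    half_at = freq[score_of_interest] / 2.0                             # 3. half the frequency at the score
    return (below + half_at) / len(scores) * 100                        # 4. divide by N, multiply by 100

# Hypothetical median ORF scores for one grade
scores = [3, 5, 6, 8, 8, 9, 11, 11, 13, 14, 15, 16, 17, 18, 19]
print(round(percentile_rank(scores, 11)))   # about the 47th percentile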


Organizing Data for Use


Universal Screening


Universal Screening

A classroom-wide, school-wide, or district-wide assessment which involves assessing all students to identify students who are at risk for academic failure or behavioral difficulties and could potentially benefit from specific instruction or intervention.

National Association of State Directors of Special Education, Inc. (2005). Response to Intervention: Policy considerations and implementation. New York, NY: The Guilford Press.


Choosing a Screening Measure

Compatibility with local service delivery needs

Alignment with constructs of interest

Theoretical and empirical support

Population fit

Practical to administer

Glover, T. A., & Albers, C. A. (2007). Considerations for evaluating universal screening assessments. Journal of School Psychology, 45, 117-135.


Choosing a Screening Measure

Appropriately standardized for use with the target population

Consistent in measurement

Accurate in its identification of individuals at risk


Examples of Screening Measures

CBM: Dynamic Indicators of Basic Early Literacy Skills (DIBELS; http://dibels.uoregon.edu/), Aimsweb (www.aimsweb.com)

Teacher recommendations

Classroom assessments

National assessments (e.g., MAT)

Report card rubrics


Pre-Screening Procedures with CBM

1. Decide who will conduct the screening.

2. Ensure that the individuals who are administering the screening have been trained in using the chosen CBM materials.

3. Organize CBM materials (e.g., make sure there are enough, write student names on them, etc.).

4. Decide whether to use local or national (published) norms to determine which students need additional academic assistance.

5. Ensure that you give the type of probe recommended for that specific grade level and time of year.


Possible DIBELS Probes

[Slide shows an example DIBELS chart listing the recommended probes by grade level and time of year.]


CBM Screening Tips

Reading measures need to be administered individually. It is best to have several administrators and to bring entire classrooms into a central location at one time.

Math and writing can be administered to students as a group, so administer these probes to entire classrooms.

It is also helpful to prepare materials in advance so that each student has their own copy with their name on it.


Post-Screening Procedures

1. Enter student scores into a computer program (e.g., Excel) that can easily sort the data.

2. Sort the data so that students are rank-ordered.

3. Determine which students fell below the previously specified cut-off (a minimal sketch follows).
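A minimal sketch of these three steps in Python, using hypothetical scores and an assumed cutoff (a spreadsheet sort works just as well):

# Hypothetical screening results: median oral reading fluency (ORF) per student
scores = {"Student A": 3, "Student B": 5, "Student G": 11, "Student O": 19}
cutoff = 10   # assumed cutoff, e.g., a local percentile or a published benchmark

ranked = sorted(scores.items(), key=lambda item: item[1])      # Step 2: rank order
at_risk = [name for name, orf in ranked if orf < cutoff]       # Step 3: below cutoff
print(at_risk)   # students flagged for additional academic assistance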


Example Spreadsheet

Student      Median ORF
Student A    3
Student B    5
Student C    6
Student D    8
Student E    8
Student F    9
Student G    11
Student H    11
Student I    13
Student J    14
Student K    15
Student L    16
Student M    17
Student N    18
Student O    19


Screening Results Example

[Bar chart: median ORF score (y-axis, 0-20) for Students A through O (x-axis).]


Screening Decisions

For students who fall below the pre-specified cutoff:

Based on scores, supporting documentation, and prior knowledge of student abilities, determine the necessary educational intervention.

Decide who is going to implement the intervention(s).

Decide who is going to monitor student progress over time.


Progress Monitoring


Progress Monitoring

The practice of assessing students to determine if academic or behavioral interventions are producing desired effects.

Provides critical information about student progress that is used to ensure the use of effective educational practices and to verify that students are progressing at an adequate rate.

National Association of State Directors of Special Education, Inc. (2005). Response to Intervention: Policy considerations and implementation. New York, NY: The Guilford Press.


Progress Monitoring

Those students who did not make the screening cutoff will be monitored on a frequent (generally once per week) basis.

It is recommended that the same form of CBM be used for screening and progress monitoring.

Use the recommended form for the student’s grade and time of year.


Progress Monitoring

Typically occurs at least once per week

Provides ongoing information regarding student progress

Can be used to determine whether interventions need to be strengthened or modified


Progress Monitoring Procedures

1. Based upon the norms you have decided to use and each student’s screening results, set a goal for each student.

This goal should reflect an average gain per week as determined by the norms that you are using (see the worked example after step 2 below).

2. Once the student’s intervention has begun, monitor the student’s progress once per week.
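As a worked example of step 1, the goal can be the screening score plus the expected weekly gain multiplied by the number of intervention weeks. All numbers below are hypothetical, not values from the presentation:

baseline_orf = 11     # median words correct per minute at screening (hypothetical)
weekly_gain = 1.5     # expected gain per week from the norms in use (hypothetical)
weeks = 10            # planned length of the intervention period (hypothetical)

goal = baseline_orf + weekly_gain * weeks
print(goal)           # 26.0 words correct per minute by the end of the period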


Progress Monitoring Procedures (cont.)

3. Graph the student’s scores (e.g., correctly read words per minute, correct writing sequences, digits correct) on a chart.

4. Periodically review the chart to determine whether progress is being made.

5. After the student has been in an intervention for a specified amount of time, hold a meeting with your decision-making team. Look at the level and rate of progress, and determine whether the goal was attained and/or the exit criteria were met (a minimal rate calculation is sketched below).
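One common way to quantify the rate of progress in step 5 is the slope of the weekly scores. A minimal sketch using an ordinary least-squares slope over hypothetical data (comparing recent data points to an aimline is an equally valid rule):

# Hypothetical weekly progress-monitoring scores (words correct per minute)
scores = [12, 13, 15, 14, 17, 18, 20, 21]
weeks = list(range(1, len(scores) + 1))

# Least-squares slope = average gain per week
n = len(scores)
mean_w = sum(weeks) / float(n)
mean_s = sum(scores) / float(n)
slope = sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores)) / \
        sum((w - mean_w) ** 2 for w in weeks)
print(round(slope, 2))   # about 1.31 words correct per minute per week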


Progress Monitoring: Example 1

[Line graph: CBM score (y-axis, 0-35) by session (x-axis, 1-11), with baseline and intervention phases marked; scores rise steadily during the intervention phase.]


Progress Monitoring Decisions (Example 1)

What you can do in this situation:

Continue with the intervention and monitoring.

Continue with the intervention and monitor less frequently.

Discontinue intervention but monitor to ensure that progress doesn’t cease/reverse.


Progress Monitoring: Example 2

[Line graph: CBM score (y-axis, 0-30) by session (x-axis, 1-11), with baseline and intervention phases marked; scores show little improvement during the intervention phase.]


Progress Monitoring Decisions (Example 2)

Decision that needs to be made in this situation:

1. Modify the current intervention, or

2. Implement a different intervention in place of the current intervention.


Progress Monitoring Examples

In Example 1, adequate rate and level were being achieved.

The team will decide whether or not to continue to monitor student progress.

The student will still be involved in universal screenings.


Progress Monitoring Examples

In Example 2, neither adequate rate nor adequate level was being achieved.

It is necessary to modify the current intervention or introduce a new intervention.

Progress monitoring is still necessary.


Progress Monitoring: Example 2

Establish a new goal based on the last three data points obtained by the student.

After the intervention is modified or a new intervention is implemented, progress monitoring continues until the next evaluation period.
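A minimal sketch of re-setting the goal from the last three data points. The use of the median of those points as the new baseline, and all numbers, are assumptions for illustration only:

import statistics

recent = [9, 11, 10]        # last three progress-monitoring scores (hypothetical)
new_baseline = statistics.median(recent)    # 10
weekly_gain = 1.5           # expected gain per week from the norms in use (hypothetical)
weeks_remaining = 8         # weeks until the next evaluation period (hypothetical)

new_goal = new_baseline + weekly_gain * weeks_remaining
print(new_goal)             # 22.0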


Progress Monitoring: Example 2a

[Line graph: score (y-axis, 0-40) by session (x-axis, 1-19), with baseline, Intervention 1, and Intervention 2 phases marked; scores improve after Intervention 2 is introduced.]


Progress Monitoring: Example 2a

What you can do in this situation:

Continue with the intervention and monitoring

Continue with the intervention and monitor less frequently

Discontinue intervention but monitor to ensure that progress doesn’t decrease


Progress Monitoring: Example 2b

[Line graph: score (y-axis, 0-30) by session (x-axis, 1-19), with baseline and two intervention phases marked; scores remain low across both interventions.]


Progress Monitoring: Example 2b

After two periods of intensive, empirically based intervention in which the student has not achieved the level and rate goal established from baseline data, the team should consider special education placement.


RtI and Special Education Placement


RtI Is Not a Special Education Initiative!

Assessment is conducted within an RtI framework first and foremost to improve instruction and enhance student growth.

RtI is NOT a stand-alone special education initiative, a means for increasing or decreasing special education numbers, or a process focused primarily on disability determination and documented through a checklist.

RtI is about determining the intensity of support needed to help students succeed!

Nebraska Department of Education. (2006). Technical Assistance Document


Special Education Placement

Before placing a student in special education using the RtI model, several factors need to be considered:

1. Was the measurement of progress accurate?

2. Was the intervention appropriate for the child?

3. Were high rates of treatment integrity observed?

4. Did the student attend sessions regularly?

5. Does the student’s ELL status or other cultural/language factors need to be considered?

6. Is there evidence that the student could benefit from special education?


What About IQ Tests?

The regulations implementing the Individuals with Disabilities Education Act (IDEA) 2004 became effective October 13, 2006.

It states that the severe discrepancy approach “shall not be required” to identify students with specific learning disabilities

“When determining whether a child has a specific learning disability as defined under this Act, the local education agency shall not be required to take into consideration whether a child has a severe discrepancy between achievement and intellectual ability…”


IDEA 2004 Continues…

“In determining whether a child has a specific learning disability, a local educational agency may use a process which determines if a child responds to scientific, research-based intervention.”

Thus, IQ tests are an option, but not necessary, for LD verification.

IQ tests are still necessary for MH verification.

For more information, see: http://idea.ed.gov/explore/home


Conclusions

Norming, universal screening and progress monitoring are important components of the RtI process.

Each process is used to ensure that students receive the services that they need to increase performance.


Additional Resources/References

Bollman, K., & Johnson, C. Used with permission from FSDS.org. Based on Stewart, L. H., & Kaminski, R. (2002). Best practices in developing local norms for academic problem-solving. In A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology IV (pp. 737-752). Bethesda, MD: NASP.

Crocker, L., & Algina, J. (1986). Introduction to classical and modern test theory. New York: Holt, Rinehart and Winston.

Edformation. (2004). AIMSweb. Retrieved from www.edformation.com/

Glover, T. A., & Albers, C. A. (2007). Considerations for evaluating universal screening assessments. Journal of School Psychology, 45, 117-135.

Good, R. H., & Kaminski, R. A. (Eds.). (2002). Dynamic Indicators of Basic Early Literacy Skills (6th ed.). Eugene, OR: Institute for the Development of Educational Achievement. Retrieved from dibels.uoregon.edu/

Fuchs, L. S., & Fuchs, D. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22(1), 1-30.

Fuchs, L. S., & Fuchs, D. (1997). Use of curriculum-based measurement in identifying students with disabilities. Focus on Exceptional Children, 30(3), 1-15.

National Association of State Directors of Special Education, Inc. (2005). Response to Intervention: Policy considerations and implementation. New York, NY: The Guilford Press.

Nebraska Department of Education. (2006). Technical assistance document.

Shapiro, E. S. (2004). Academic skills problems: Direct assessment and intervention (3rd ed.). New York: The Guilford Press.