Systematic Evaluation Model to Ensure the Integrity of MTSS Implementation in Florida

2011 FASP Conference
November 4th, 2011

Kevin Stockslager, Kelly Justice, Beth Hardcastle

Advanced Organizer

• Accountability and Evaluation

• MTSS and Program Evaluation in the Schools

• Example of an MTSS Evaluation Model

• Review of Potential Data Sources
  • Surveys
  • Self-Assessments
  • Permanent Product Reviews

PS/RtI vs. MTSS

Accountability and Evaluation

What does…

• Accountability mean to you?

• Evaluation mean to you?

Accountability in Florida

• Increasing focus on accountability over the last decade

• Examples include:
  • School grading
  • AYP
  • Special education rules
  • DA
  • FEAPs & teacher evaluation systems

Impact of Accountability

Criticisms

• Lack of educator involvement

• Controversy

• Consequence driven

• Compliance driven

• Conflicting requirements

• Duck and cover approach

Positives

• Establishes and maintains standards for performance

• Reinforces use of data to monitor student outcomes

• Reinforces need to examine resource use

• Student outcome rather than process focus

• Success stories

(Hall & Hord)

Accountability & Evaluation Issues

• Compliance-driven versus informative evaluation
  • Evaluation often done to meet accountability requirements
  • Evaluation can serve to help integrate and improve school and district services

• Evaluation is fundamental to MTSS

• MTSS has the potential to:
  • Be viewed as one more thing we have to do OR
  • Help address accountability & evaluation demands through the multi-tier framework

MTSS and Program Evaluation in the Schools

Important MTSS Evaluation Issues

• Stakeholders should be involved in all aspects of planning and carrying out the evaluation process as well as in decision-making

• Goals established through planning should drive the process

• Information obtained to:
  • Determine where you currently are (needs)
  • Take ongoing looks at how things are working
  • Make decisions about what to keep doing and what to change or eliminate

MTSS Evaluation Issues cont.

• The data you collect should be driven by the evaluation questions you want to answer*
  • Are students meeting expectations? Academically? Behaviorally? Social-emotionally?
  • Are we implementing MTSS with fidelity?
  • Do we have the capacity to implement successfully?
  • Do staff buy into implementing MTSS?

*Example questions

Table Top Activity

• Brainstorm and discuss some additional evaluation questions that you might want to answer at your schools
• (2-3 minutes, then report out)

How Are Students Performing?

Examples of data sources

• Academics
  • FCAT
  • FAIR
  • Core K-12
  • End of Course Exams

• Behavior
  • Attendance
  • Tardies
  • Suspensions
  • Discipline referrals

• Global Outcomes
  • Graduation Rates

Are Schools Implementing MTSS with Fidelity?

Examples of data sources

• Curriculum and Instruction/Intervention
  • Principal walkthroughs
  • Lesson plans
  • Intervention Documentation Worksheets

• Components of MTSS and Data-Based Problem-Solving*
  • BOQ, PIC, BAT
  • SAPSI, Tier I & II CCCs, Tier III CCCs

* See http://flpbs.fmhi.usf.edu/ and http://floridarti.usf.edu for more information

[Figure: Tiers I & II Observation Checklist, Percent Present. Percentage of roles/components present by roles present and problem-solving steps: Roles Represented, 0.68; Problem Identification, 0.76; Problem Analysis, 0.54; Intervention Development, 0.28; Program Evaluation/..., 0.64.]

Do We Have the Capacity to Implement MTSS with Fidelity?

Examples of data sources

• Leadership Team structure and functioning
  • Organizational charts
  • Minutes/meeting summaries
  • SAPSI, BOQ, PIC

• Staff knowledge and skills
  • FEAPs & teacher evaluation system
  • Staff development evaluations
  • Work samples

• Resources allocated to match needs
  • SIP, DIP
  • Master calendar/schedule
  • School rosters
  • Resource maps

Do Staff Buy Into Implementing MTSS?

Examples of data sources

• Leadership vision and commitment
  • SAPSI, BOQ, PIC
  • Required and non-required plans

• Staff buy-in
  • SAPSI, BOQ, PIC
  • District/school staff and climate surveys
  • Dialogue
  • Brief interviews with key personnel

[Figure: Sunshine Elementary: Self-Assessment of Problem Solving Implementation (SAPSI) Data, Consensus Building. BOY and EOY indicator status (0 = Not Started, 1 = In Progress, 2 = Achieved, 3 = Maintaining) for five items: 1. District commitment, 2. SBLT support, 3. Faculty involvement, 4. SBLT present, 5. Data to assess commitment.]

Example of an MTSS Evaluation Model

Table Top Activity

• Mock Small-Group Planning and Problem-Solving Process

Small-Group Planning and Problem-Solving Process

1. What is our desired goal?

2. Brainstorm the resources and barriers to achieving our goal

3. Select a barrier (or group of related barriers) to address first

4. Brainstorm strategies to reduce or eliminate our selected barrier

5. Develop an action plan to reduce or eliminate our selected barrier
   • Include who, what, when (Be specific!)

6. Develop a follow-up plan for each action
   • Include who, what, when

7. Develop a plan to evaluate the reduction or elimination of our chosen barrier

8. Develop a plan to evaluate progress towards achieving our goal from Step 1

Mock Small-Group Planning and Problem-Solving

1. Goal: Develop and implement a data-based evaluation system in my school and/or district

2. Brainstorm the resources and barriers to achieving our goal

3. Select a barrier (or group of related barriers) to address first

4. Brainstorm strategies to reduce or eliminate our selected barrier

Potential Data Sources

Perceptions of RtI Skills Survey

Assessing Perceptions of Skills Integral to PS/RtI Practices

Briefly…

• Role of survey data

• Beliefs Survey

• Perceptions of Practices Survey

Perceptions of Skills

The likelihood of embracing new practices increases when:

1) Educators understand the need for the practice

2) Educators perceive they either have the skills to implement the practice or will be supported in developing required skills

(Showers, Joyce, & Bennett, 1987)

Description and Purpose

Perceptions of RtI Skills Survey

Perceptions of Skills—Description and Purpose

• Theoretical background:
  • Assess educators’ perceptions of the skills they possess to implement PS/RtI
  • Understand perceptions of skills and how perceptions change as a function of professional development to facilitate PS/RtI implementation

Description of Survey

• Assesses skills/amount of support needed for:
  • Applying PS/RtI practices to academic content
  • Applying PS/RtI practices to behavior content
  • Data manipulation and technology use

• 20 items; 5-point Likert scale

• 1 = I do not have the skill at all (NS) … 5 = I am highly skilled in this area and could teach others (VHS)

Purpose of Instrument

Purpose of the Perceptions of RtI Skills Survey:

1) Assess impact of professional development

2) Identify “comfort level” with PS/RtI practices to inform PD and allocate resources

Administration Procedures & Scoring

Perceptions of RtI Skills Survey

Administration Procedures: Intended Audience

• Who should complete?
  • SBLT members
  • Instructional staff

• Who should use results?
  • SBLTs
  • DBLTs

Directions for Administration

• Methods for administration/dissemination
  • Completed individually
  • Anonymity
  • Opportunity for questions

• Role of school principal—explain the “why”

• Role of RtI coach/coordinator/SBLT member

• Frequency of use: resources, rationale, recommendations

Scoring

Two techniques to analyze survey responses:

1) Mean rating for each item calculated to determine average perceived skill level

2) Frequency of each response option selected calculated for each item
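The two analysis techniques above can be sketched in a few lines of code. This is an illustrative sketch for 5-point Likert responses; the data and helper names are hypothetical, not the Project's actual tooling.

```python
# Illustrative sketch of the two scoring techniques for 5-point Likert
# survey responses; the data and function names are hypothetical.
from collections import Counter

def item_mean(ratings):
    """Technique 1: mean rating for an item (average perceived skill level)."""
    return sum(ratings) / len(ratings)

def option_frequencies(ratings, options=range(1, 6)):
    """Technique 2: percent of respondents selecting each response option."""
    counts = Counter(ratings)  # Counter returns 0 for options never selected
    return {opt: 100 * counts[opt] / len(ratings) for opt in options}

# Hypothetical responses (1 = NS ... 5 = VHS) for one survey item
ratings = [5, 4, 4, 3, 2, 5]
print(round(item_mean(ratings), 2))  # 3.83
print(option_frequencies(ratings))
```

The same two summaries can be rolled up to the domain level by pooling the ratings for all items in a domain before applying the functions.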

Calculating Item Mean

• Overall assessment of perceived skills of educators within a school/district

• Can be done at the domain (factor) and/or individual item level

• Domain level: examine patterns in perceived skills re: academic content, behavior content, data manipulation/technology use

• Item level: identify specific skills staff perceive they possess vs. skills in need of support

Calculating Frequency of Response Options

• Provides information on range of perceived skill levels

• Can be used to determine what percentage of staff may require little, some, or high levels of support to implement PS/RtI

• Informs professional development decisions

Answering Evaluation Questions

• Use data to inform evaluation questions

• Use data to answer broad/specific questions

• Align analysis and data display with evaluation questions

• Consider available technology resources to facilitate analyses of data—online administration, automatic analysis, knowledge and skill of personnel

Technical Adequacy

Perceptions of RtI Skills Survey

Technical Adequacy

Content validity:

• Item set developed to represent perceived skills important to implementing PS/RtI

• Reviewed by Educator Expert Validation Panel (EEVP)

Construct validity:

• Factor analysis conducted using sample of 2,184 educators

• Three resultant factors

Technical Adequacy (cont.)

Internal Consistency Reliability:

• Factor 1 (Perceptions of RtI skills applied to academic content): α = .97

• Factor 2 (Perceptions of RtI skills applied to behavior content): α = .97

• Factor 3 (Perceptions of Data Manipulation and Technology Use Skills): α = .94

Interpretation and Use of Data

Perceptions of RtI Skills Survey

Interpretation & Use of Data

• Three domains:
  • Perceptions of skills applied to academic content
  • Perceptions of skills applied to behavior content
  • Perceptions of data manipulation and technology use skills

• Three methodologies:
  • Calculate mean at domain level
  • Calculate mean at item level
  • Frequency/percentage of who selected each response option

• Identify specific skills/skill sets for PS/support

[Figure: Perceptions of RtI Skills Survey: Item Response Data, Factor Three (Data manipulation skills). Percentage of total responses at each skill level (Very Highly Skilled, Highly Skilled, Some Support Necessary, Minimal Skills, No Skill at All) for each constituent item and the overall factor. Items: 14a. Graph target student data; 14b. Graph benchmark data; 14c. Graph peer data; 14d. Draw an aimline; 14e. Draw a trendline; 15. Interpret graphed PM data to determine student RtI; 19. Disaggregate data by various demographic factors; 20a. Access intervention resources via the Internet; 20b. Use PDAs to collect data; 20d. Use the SWIS for PBS; 20e. Graph and display student and school data; 21. Facilitate a Problem Solving Team meeting.]

Interpretation & Use of Data (cont.)

• Sharing data with stakeholders:
  • DBLTs, SBLTs, instructional staff

• Use data to:
  • Develop/adjust PD goals
  • Design training/coaching activities
  • Facilitate consensus-building discussions re: rationale for PD, patterns, barriers

Facilitating Discussions

Sample guiding questions…

• To what extent do you believe your school possesses the skills to use school-based data to evaluate core instruction (Tier 1)? Supplemental instruction (Tier 2)?

• Based on what staff has learned about data-based decision-making, how consistent are those skills with PS/RtI practices (i.e., to what degree do teams evaluate the effectiveness of core and supplemental instruction)?

Table Top Activity

With a partner, examine the data graph for Alligator Elementary:

1) After the BOY administration of the Skills survey, what conclusions could be made about the level of support needed by staff to apply PS/RtI practices to behavior content?

2) Compare changes in skills levels from BOY to EOY. Then compare items suggesting substantial growth to items suggesting little or no growth. What decisions could be made re: professional development and needed support?

Implementation Integrity

What is “Integrity” and why is it important?

• Integrity is the degree to which something was done the way it was intended to be done.

• When a process or procedure lacks “integrity,” few if any assumptions can be made about the outcome or impact of that process or procedure.

Tools to Measure Implementation Integrity

• SAPSI

• Tier I and II Critical Components Checklist

• Tier III Critical Components Checklist

• Tier I and II Observation Checklist

• Problem-Solving Team Meeting Checklists (Initial and Follow-up)

Monitoring implementation to guide systemic decision-making

SAPSI (SELF-ASSESSMENT OF PROBLEM SOLVING IMPLEMENTATION)

Description and Purpose

SAPSI

SAPSI: Theoretical Background

• Assesses extent to which schools are making progress toward full implementation

• Implementation is a gradual process

• Many reform efforts fail due to:

LACK OF IMPLEMENTATION

(Sarason, 1990)

• Implementation integrity must be examined

SAPSI: Description

• Self-report measure

• Organized around systems change model: consensus, infrastructure, implementation

• Collaboratively completed by SBLTs (School-based Leadership Teams)

• Response options: (N) Not Started, (I) In Progress, (A) Achieved, (M) Maintaining

Florida’s Change Model

Consensus

Infrastructure

Implementation

Change Model

• Consensus
  • Belief is shared
  • Vision is agreed upon
  • Implementation requirements understood

• Infrastructure Development
  • Regulations
  • Training/Technical Assistance
  • Tier I and II intervention systems (e.g., K-3 Academic Support Plan)
  • Data Management
    • Technology support
    • Decision-making criteria established

• Implementation

SAPSI: Purpose

Two-fold:

1. Assess current level of implementation

In what areas do we need to take action in order to facilitate implementation?

2. Progress monitor implementation

How successful have our actions been? What systemic needs still exist?

Administration Procedures & Scoring

SAPSI

SAPSI: Intended audience

• School-based leadership team (SBLT)

• 6-8 member multi-disciplinary team

• Leadership role in facilitating implementation

• Trained on PS/RtI and Systems Change

• Roles and responsibilities (i.e., facilitator, time-keeper, data coach, recorder, content-area expertise)

SAPSI: Directions for Administration

STEP 1: Ensure understanding

- Facilitator ensures content and format understood

- SBLT receives info re: purpose, what is measured, how data are used, completion procedures

SAPSI: Directions for Administration

(cont.)

STEP 2: Individual preview

- Approximately one week prior to completion distribute copy to each SBLT member

- Members complete individually

- Members record their perspective and prepare to contribute to discussion

SAPSI: Directions for Administration (cont.)

STEP 3: Group completion

- Facilitator guides discussion and records group responses

- Consensus reached on each item

- Completion takes 30 min. – 2 hours

SAPSI: Directions for Administration (cont.)

“N” – Not started = occurs approx. 0-24% of time

“I” – In progress = occurs approx. 25-74% of time

“A” – Achieved = occurs approx. 75-100% of time

“M” – Maintaining = rated “achieved” last time and continues to occur 75-100% of time
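The rating rules above can be read as a small decision function. This is a hypothetical helper, not a Project tool; the 25% and 75% cut points are assumptions drawn from the approximate bands on the slide.

```python
# Hypothetical encoding of the SAPSI rating rules above.
def sapsi_rating(percent_of_time, achieved_last_time=False):
    """Map how often an activity occurs to a SAPSI response option."""
    if percent_of_time < 25:
        return "N"  # Not started
    if percent_of_time < 75:
        return "I"  # In progress
    # Occurs 75-100% of the time: "Maintaining" only if rated achieved last time
    return "M" if achieved_last_time else "A"

print(sapsi_rating(50))                           # I
print(sapsi_rating(90, achieved_last_time=True))  # M
```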

SAPSI: Scoring

OPTIONS FOR ANALYSIS:

1. Avg. activity level across domains or by item

-What are the general patterns of change?

-To what extent are staff engaging in specific activities?

2. Frequency of each response option

-What percentage of schools are engaged in specific activities?

SAPSI: Scoring (cont.)

AVERAGE ACTIVITY LEVEL by domain

• Examine general patterns in consensus, infrastructure, implementation

• A domain score is calculated for each of the three domains

SAPSI: Scoring (cont.)

Calculating Average Activity Level (domain level)

Sum of ratings of items in domain (domain score) ÷ Total number of items = AVG. ACTIVITY LEVEL FOR DOMAIN

SAPSI: Scoring (cont.)

Items That Comprise Each Domain

• Domain 1 (Consensus): Items 1-5

• Domain 2 (Infrastructure): Items 6-20

• Domain 3 (Implementation): Items 21a-27

SAPSI: Scoring (cont.)

RESPONSE OPTION VALUES:

“N” – Not started = 0

“I” – In progress = 1

“A” – Achieved = 2

“M” – Maintaining = 3

SAPSI: Scoring Example

Domain 1

Item ratings:

1. N = 0
2. I = 1
3. I = 1
4. A = 2
5. A = 2

Sum = 6; 6 ÷ 5 = 1.2 (Avg. Activity Level for Domain 1)

On average, the school is “in progress” with consensus building.
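The worked example above can be reproduced with a short script. The response-option values follow the scoring slide (N = 0, I = 1, A = 2, M = 3); the function name is illustrative.

```python
# Sketch of the domain-level SAPSI calculation shown above.
RATING_VALUES = {"N": 0, "I": 1, "A": 2, "M": 3}

def avg_activity_level(domain_ratings):
    """Sum the item ratings in a domain and divide by the number of items."""
    values = [RATING_VALUES[r] for r in domain_ratings]
    return sum(values) / len(values)

# Domain 1 (Consensus) example: items 1-5 rated N, I, I, A, A
print(avg_activity_level(["N", "I", "I", "A", "A"]))  # 1.2
```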

SAPSI: Scoring (cont.)

AVERAGE ACTIVITY LEVEL by item

• Identify extent to which educators are engaging in specific activities

• Identify activities that need to be addressed systemically

• Does NOT provide information re: variability among schools for each activity

SAPSI: Scoring (cont.)

FREQUENCY OF EACH RESPONSE OPTION

• Range of activity levels across schools

• Determine percentage of schools engaged in specific activities

• Gauge the magnitude of the problem (All schools? Some? Few?)

QUESTIONS DRIVE YOUR ANALYSES

What are the general trends in change stages across the district over time?

(avg. activity level by domain)

Which activities should be systematically addressed through PD or district policy?

(avg. activity level by item)

How can we most efficiently deploy resources to address school needs?

(frequency of response option)

Technical Adequacy

SAPSI

SAPSI: Content Validity

• Evidenced by careful identification and definition of measured content
  A. As reflected in systems change literature
  B. Based on review of instruments that purport measurement of identified domains

• Adapted from IL-ASPIRE SAPSI v. 1.6
  • Matched to Florida systems change model
  • Modified to align with Florida’s PS/RtI Model

SAPSI: Internal Consistency Reliability

• Computed separately for each of the three domains

• Utilized SAPSIs administered to 34 pilot schools in Winter 2010

• Cronbach’s alpha coefficients:
  • Consensus: α = .64
  • Infrastructure: α = .89
  • Implementation: α = .91
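For readers curious how coefficients like these are derived, here is a minimal sketch of the standard Cronbach's alpha formula: α = k/(k-1) × (1 - Σ item variances / variance of total scores). The data and function name are hypothetical, and the Project's actual computation may differ.

```python
# Minimal Cronbach's alpha from item-level responses (hypothetical data).
def cronbach_alpha(items):
    """items: one list per survey item, each holding one rating per respondent."""
    k = len(items)       # number of items
    n = len(items[0])    # number of respondents

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(var(it) for it in items) / var(totals))

# Two perfectly consistent items yield alpha = 1.0
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))  # 1.0
```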

Interpretation and use of data

SAPSI

Examining the Broad Domains

• Examine the three broad domains first (i.e., consensus, infrastructure, implementation)

• Graphs used to examine levels

• Previously mentioned scoring methods used

• Frequency of response option often used by the Project to examine aggregate pilot school data

[Figure: Aggregate Florida PS/RtI Project Pilot Schools: Self-Assessment of Problem Solving Implementation (SAPSI), Consensus. Percent of schools at each status (Not Started, In Progress, Achieved, Maintaining) from Year 1 BOY through Year 3 EOY for five items: 1. District commitment, 2. SBLT support, 3. Faculty involvement, 4. SBLT present, 5. Data to assess commitment.]

Page 83:

[Figure: School-level data, Self-Assessment of Problem Solving Implementation (SAPSI), Consensus. Item status (0 = Not Started, 1 = In Progress, 2 = Achieved, 3 = Maintaining) for five consensus items (1. District commitment; 2. SBLT support; 3. Faculty involvement; 4. SBLT present; 5. Data to assess commitment) across administrations: Year 1 BOY, Year 1 EOY, Year 2 EOY, Year 3 EOY, Year 4 EOY.]

Page 84:

Identification of Specific Needs

• Graph items to identify trends and which activities are engaged in more/less frequently

• Consider various factors when examining levels of activity engagement (e.g., training, length of implementation, SEA/LEA policies)

• Self-report data are valuable but positively biased; compare with other implementation data

Page 85:

SAPSI: Sharing Data with Stakeholders

• Scale-up practices should include a plan for dissemination, analysis and discussion

• Identify key stakeholders [e.g., instructional staff, SBLT, DBLT (District-based Leadership Team)]

• Share data quickly and frequently

Page 86:

SAPSI: Sharing Data with Stakeholders

(cont.)

• SBLTs – use data to strategize, develop/alter goals, update instructional staff

• DBLTs – use data to inform district level support and policy

• Stakeholder support/input is critical for effective action planning

Page 87:

SAPSI: Using Guiding Questions

• What are the patterns?
• What patterns are evident among each of the individual items on the checklist and across all data sources?
• What steps of the problem-solving process are occurring more frequently? Less frequently?
• Are there any current indicators that show a zero or low level of implementation? Why?
• Have these been targeted in the past?
• Do barriers exist with consensus or infrastructure?
• Other priorities?
• Meetings not happening or not focusing on implementation?

Page 88:

SAPSI: Using Guiding Questions (cont.)

• How have you progressed in implementing the Problem-Solving Model with fidelity?
• Looking across all fidelity measures (CCC, SAPSI, and Observations), what are the general levels of implementation? What are the general trends?
• Do the data from the Critical Component Checklist and Observations support what is evident in SAPSI items 22a-22i?
• Are there discrepancies among the different sources of data with using the Problem-Solving model?
• How might these discrepancies be interpreted?

Page 89:

School-wide Data Example

Page 90:

Table Top Activity

• Think about one school that you are currently serving

• Take a few minutes to complete the “Consensus” section of the SAPSI

• Share out

Page 91:

Tiers I and II Critical Components

Checklist

Page 92:
Page 93:
Page 94:

Description and Purpose

Tiers I and II Critical Components Checklist

Page 95:

Theoretical Background

• Implementation of new practices is a gradual process that occurs in stages, not a one-time event (Fixsen, Naoom, Blase, & Wallace, 2005)

• Because many educational reform initiatives fail due to lack of implementation (Sarason, 1990), it is critical to examine implementation integrity

• Several methods for examining implementation integrity exist (Noell & Gansle, 2006):
• Self-report
• Permanent product reviews
• Observations

Page 96:

Description

• Permanent product review

• Measures the extent to which components of the PS/RtI process are evident in permanent products from data meetings addressing Tier I and/or Tier II content

• 11 items organized around the 4-step problem-solving process:

1. Problem identification

2. Problem analysis

3. Intervention development and implementation

4. Program evaluation/RtI

• Response options: 0=Absent, 1=Partially present, 2=Present (N/A for some items)

Page 97:

Problem-Solving Process

The four-step problem-solving cycle:

1. Define the Problem: What do we want the student(s) to know and be able to do?

2. Problem Analysis: validate the problem; identify variables that contribute to the problem; develop a plan

3. Implement Plan: implement as intended; progress monitor; modify as necessary

4. Evaluate Response to Intervention (RtI)

Page 98:

Tiered Model of School Supports & the Problem-Solving Process

ACADEMIC and BEHAVIOR SYSTEMS

Tier 3: Intensive, Individualized Interventions. Individual or small group intervention.

Tier 2: Targeted, Strategic Interventions & Supports. More targeted interventions and supplemental support in addition to the core curriculum and school-wide positive behavior program.

Tier 1: Core, Universal Instruction & Supports. General instruction and support provided to all students in all settings.

Revised 10.07.09

Page 99:

Purpose

• To provide stakeholders with a practical methodology for evaluating the extent to which educators implement PS/RtI practices in data meetings addressing Tier I and/or Tier II content

• Permanent product reviews typically more reliable than self-report, but more resource-intensive

Page 100:

Administration Procedures and Scoring

Tiers I and II Critical Components Checklist

Page 101:

Who should complete the checklist?

• The person completing the Tiers I and II CCC should have expertise in the PS/RtI model and in conducting permanent product reviews
• Specifically, the 4 steps of the problem-solving process
• PS/RtI Coaches, school psychologists, literacy specialists, etc.

Page 102:

Who should use the results for decision-making?

• School-Based Leadership Team (SBLT)
• SBLT should take a leadership role in implementing PS/RtI in their school
• SBLT should have representation across staff
• SBLT members should receive training in PS/RtI

• District-Based Leadership Team (DBLT)

• District-Based Leadership Team (DBLT)

Page 103:

Directions for Administration

• Step 1
• Identify the content areas and grade levels being targeted by the school(s) for which the Tiers I and II CCC is being completed
• It is recommended that the checklists be completed from products derived from Tier I and II data meetings related to the goals of the school

Page 104:

Directions for Administration (cont.)

• Step 2
• Identify when Tier I and II data meetings occur and who is involved in the meetings
• Examples of common meetings include leadership team meetings, grade-level meetings involving teachers, team meetings, and meetings during which small-group interventions are planned
• Meetings focused on Tier I instruction typically occur 3-4 times per year, more frequently for Tier II instruction
• The Tiers I and II CCC is not completed for meetings in which individual student problem-solving occurred

Page 105:

Directions for Administration (cont.)

• Step 3
• Find out who to contact for permanent products that come from identified meetings and what products will likely be available

Page 106:

Directions for Administration (cont.)

• Step 4
• Gather any relevant documents for the period of time for which the checklists are being completed
• Reviewers may choose to complete the Tiers I and II CCC to align with universal screening windows
• Example: if universal screening data are collected 3 times per year, the Tiers I and II CCC could be completed from the products derived from each data meeting
• Once the time frame is identified, permanent products from Tier I and II data meetings can be reviewed

Page 107:

Directions for Administration (cont.)

• Step 5
• Complete the checklists using the Tiers I and II CCC Standard Scoring Rubric
• The rubric provides criteria for how to score each item
• It is recommended to complete a checklist for each content area and grade level the school is targeting
• It is important that those completing the checklist have knowledge of the problem-solving process

Page 108:

Directions for Administration (cont.)

• Step 6
• Complete inter-rater procedures when applicable
• Ensuring that permanent product reviews are completed accurately is critical to data collection
• Periodically, have two reviewers complete a Tiers I and II CCC using products from the same meeting and compare results
• Allow reviewers to compare notes and discuss differences
• The frequency of inter-rater procedures depends on the time and resources of reviewers

Page 109:

Frequency of Use

• Consider resources available, including:
• The time needed to complete the instrument
• The time needed to enter, analyze, graph, and disseminate data
• Personnel available to support data collection
• Additional data collection activities SBLT members and school staff participate in

• General recommendations:
• Align data collection with the school's target content areas and grade levels
• Align with the frequency of universal screening and progress monitoring data

Page 110:

Scoring

• Examples of two data analysis techniques:

1. Calculate the mean rating for each item

2. Frequency distribution of each response option selected (i.e., Absent, Partially present, and Present)

• Four domains:

1. Problem Identification (Items 1-3)

2. Problem Analysis (Items 4-5)

3. Intervention Development and Implementation (Items 6a-7c)

4. Program Evaluation/RtI (Items 8-11)

Page 111:

Scoring (cont.)

1. Calculating the mean rating for each item
• Provides an overall impression of implementation
• Allows for examination of general patterns of implementation

2. Frequency distribution of each response option
• Provides information on the range of implementation
• Can be used to determine what percentage of schools or grade levels implemented, partially implemented, or did not implement components of PS/RtI
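Both techniques can be computed from the same raw checklist ratings. A minimal sketch with hypothetical item scores; the helper name is illustrative, not a Project tool:

```python
from collections import Counter
from statistics import mean

OPTIONS = (0, 1, 2)  # Absent, Partially present, Present

def summarize_item(scores):
    """Mean rating and response-option distribution for one CCC item.

    scores: one 0/1/2 rating per reviewed checklist (N/A responses excluded).
    """
    counts = Counter(scores)
    return {
        "mean": mean(scores),
        "percent": {opt: 100 * counts.get(opt, 0) / len(scores) for opt in OPTIONS},
    }

# Hypothetical: one item scored across four schools' checklists
print(summarize_item([2, 2, 1, 0]))
```

The mean supports the item-level trend graphs, while the percent distribution shows how many schools fully, partially, or never implemented the component.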

Page 112:

[Figure: Tier I/II Critical Components Checklist: Mean Item Response Data. Average level of implementation (0 = Absent, 1 = Partially Present, 2 = Present) for each item, plotted for 2005-2006 through 2008-2009. Items, grouped by domain:
Problem Identification: 1. Data to determine effectiveness of core; 2. Decisions made to modify core or develop interventions; 3. Universal screening used to id groups needing intervention.
Problem Analysis: 4. Team uses hypotheses to identify reasons for not making benchmark; 5. Data used to determine hypotheses for not making benchmark.
Intervention Development and Implementation: 6a. Modifications made to core instruction - Plan documented; 6b. Modifications made to core instruction - Support documented; 6c. Modifications made to core instruction - Implementation documented; 7a. Supp. instruction developed or modified - Plan documented; 7b. Supp. instruction developed or modified - Support documented; 7c. Supp. instruction developed or modified - Implementation doc.
Program Evaluation/RtI: 8. Criteria for positive RtI were defined; 9. Progress monitoring data scheduled/collected; 10. Decision regarding student RtI was documented; 11. Plan to continue, modify, or terminate interventions provided.]

Page 113:

Technical Adequacy

Tiers I and II Critical Components Checklist

Page 114:

Content Validity

• Review of relevant literature, presentations, instruments, and previous program evaluation projects to develop an item set representative of the critical components of PS/RtI implementation

Page 115:

Inter-Rater Agreement

• The ability of reviewers to provide reliable data has been supported by inter-rater agreement among PS/RtI Project Coaches completing the instrument
• Inter-rater agreement = # agreements / total # items
• Average inter-rater agreement = 91.16%
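The agreement index above is simple enough to script for local use. A minimal sketch with hypothetical coach ratings; the 91.16% figure is the Project's reported average, not this example's output:

```python
def interrater_agreement(rater_a, rater_b):
    """Percent agreement: number of items scored identically / total items."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both reviewers must score the same set of items")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

# Hypothetical: two coaches independently score the same 11-item CCC
coach_1 = [2, 2, 1, 0, 1, 2, 2, 1, 2, 0, 1]
coach_2 = [2, 2, 1, 1, 1, 2, 2, 1, 2, 0, 1]
print(round(interrater_agreement(coach_1, coach_2), 2))  # 10 of 11 items agree
```

Disagreements flagged this way give the two reviewers a concrete list of items to compare notes on and discuss, as described in Step 6.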

Page 116:

Interpretation and Use of the Data

Tiers I and II Critical Components Checklist

Page 117:

Examination of Broad Domains

• Start by examining broad domains to evaluate the extent to which permanent products indicate PS/RtI practices are being implemented

• Examining the data graphically allows educators to determine the extent to which the major steps of problem-solving are occurring

• Examine implementation levels at each time point, as well as trends over time

Page 118:

Identification of Specific Needs

• Tiers I and II CCC can be used to identify which components of problem-solving are more vs. less evident

• Consider what training educators have received and how long implementation efforts have been occurring

• Stakeholders can use these data to identify components of the problem-solving process that require additional support to be implemented
• Professional development
• Policies and procedures

• Important to consider all aspects of the school/district system that might contribute to implementation

Page 119:

Dissemination to Stakeholders

• Important to disseminate implementation data to key school and district stakeholders as quickly and frequently as possible

• Allow for stakeholders to discuss implementation levels, develop/alter implementation goals, and design strategies to increase implementation

Page 120:

Dissemination to Stakeholders (cont.)

• Guiding questions:
• What are the patterns?
• What patterns are evident among each of the items?
• What steps of the PS process are occurring more/less frequently?
• Are there indicators that show zero implementation? Why?
• Have these been targeted in the past?
• Do barriers exist with consensus or infrastructure?
• Other priorities?
• Meetings not happening or not focusing on implementation?

Page 121:

Dissemination to Stakeholders (cont.)

• Guiding questions (cont.):
• How have you progressed in implementing the PS model with fidelity?
• Looking across all fidelity measures (CCC, SAPSI, Observations), what are the general levels of implementation? What are the general trends?
• Do the data from the CCC and Observations support what is evident in SAPSI items 22a-22i?
• Are there discrepancies among different data sources with using the PS model?
• How might these discrepancies be interpreted?

Page 122:

Table Top Activity

• Discuss the critical pieces of information that you would want to communicate to your SBLT and administrators related to evaluation • (2-3 minutes then report out)

Page 123:

Discussion

• What are you currently doing to examine these areas in your district or school?
• What are the critical questions you ask?
• What data sources do you have to answer them?
• What questions do you already have that you cannot answer with available data?
• How do you use the data you collect to inform decisions?

• What areas need to be addressed as you return to your districts to plan? What are the priorities?
• What critical questions do you need to start asking?
• What data sources do you need?
• How can you better use the data to inform decisions?

Page 124:

Additional Resources

Page 125:

Floridarti.usf.edu

Page 126:

Flpbs.fmhi.usf.edu

Page 127:

Thank you!!!

• Kevin Stockslager• [email protected]

• Beth Hardcastle• [email protected]

• Kelly Justice• [email protected]