External Evaluation of the 2011 – 2014 Demonstration Project Presented at October 2013 Replication Forum

Page 1.

External Evaluation of the 2011 – 2014 Demonstration Project

Presented at October 2013 Replication Forum

Page 2.

External Evaluators

• The Meadows Center for Preventing Educational Risk
  • PI: Dr. Saro Mohammed
  • Researchers: Myriam Lopez, Deborah Van Kummer
• Concordia University
  • Site visits: students in the Educational Administration Master’s program

Page 3.

Logic Model

Page 4.

Theory of Change – Regional Level

[Funders, partners, Region 13 ESC, and participating schools] [train and support PDers and coaches] to [change the number of regional PDers, conferences, and schools], leading to [regional collaboration between PDers, districts, and schools] and eventually [embedding, awareness, and use of SIM regionally]

Page 5.

Theory of Change – School Level

[Teachers of struggling students – highly mobile, economically disadvantaged, with limited English proficiency, experiencing achievement gaps in reading] [receive training, feedback, and support, and implement SIM] to [change the number of classes and students using SIM], leading to [teacher collaboration, student engagement, academic achievement, and accurate SLD referrals] and eventually [multidisciplinary student use of SIM, positive student behaviors, and high school outcomes]

Page 6.

Evaluation Methods

• Based on detailed evaluation logic model
• Schools evaluated on:
  • Outputs:
    • Process metrics (teachers trained, reviews, etc.)
    • Implementation fidelity (practices observed in walk-throughs, student feedback, etc.)
  • Outcomes:
    • Change over the year for struggling learners
    • TAKS/STAAR scale score comparison for all students, and raw scores for struggling students, versus comparison schools matched on size, demographic make-up, and previous results (see the adjustment sketch below)
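The scale score comparison described above rests on posttest means adjusted for pretest. A minimal sketch of that kind of covariate adjustment, assuming standard ANCOVA-style adjusted means; the data, group labels, and scale values are hypothetical, not the evaluators' actual code:

```python
import numpy as np

def adjusted_posttest_means(pre, post, group):
    """ANCOVA-style adjustment: shift each group's raw posttest mean by the
    pooled within-group regression of posttest on pretest, evaluated at the
    grand pretest mean."""
    pre, post, group = map(np.asarray, (pre, post, group))
    num = den = 0.0
    for g in np.unique(group):
        m = group == g
        pre_c = pre[m] - pre[m].mean()
        num += float(np.sum(pre_c * (post[m] - post[m].mean())))
        den += float(np.sum(pre_c ** 2))
    b = num / den  # pooled within-group slope of posttest on pretest
    grand_pre = pre.mean()
    return {str(g): float(post[group == g].mean() - b * (pre[group == g].mean() - grand_pre))
            for g in np.unique(group)}

# Hypothetical scale scores for one project school and its matched comparison school
rng = np.random.default_rng(0)
pre = rng.normal(1500, 100, 200)
post = pre + rng.normal(20, 50, 200)
group = np.repeat(["project", "comparison"], 100)
print(adjusted_posttest_means(pre, post, group))
```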

Page 7.

Data Sources

• All data collected by program staff EXCEPT:
  • Site visits: classroom walkthroughs, device checklists, LLT meeting observations
  • State assessments: TAKS 2011, STAAR 2012, STAAR 2013

Page 8.

Outputs – Fidelity of Implementation (School Level)

• Implementation has improved over 2 years

Page 9.

Outputs – Fidelity of Implementation

• Implementation is widespread

[Chart: Observed Routines – counts of each SIM routine observed: Unit Organizer, Framing, Course Organizer, Concept Comparison, Question Exploration, Concept Mastery, Lesson Organizer, Vocabulary LINCing, Clarifying]

[Chart: Observed Content Areas – counts of observations by subject: Math, Language Arts, Science, Social Studies, Fine Arts, PE/Health, World Languages, Career/Tech Education, Other]

Page 10.

Outcomes

• Populations (defined in Fall 2011):
  • Struggling learners (project schools only)
    • Gates standard score of 85 or less (see the selection sketch below)
    • Pre- and post-test scores (typically beginning and end of year)
  • All students
    • Took regular TAKS & STAAR (not modified versions of the tests)
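As a concrete illustration of the population definition above, a minimal sketch that flags struggling learners from a roster. The cutoff of 85 on the Gates standard score is from the slide; the data structure and field names are assumptions:

```python
from dataclasses import dataclass

GATES_CUTOFF = 85  # standard score at or below which a student is "struggling" (from the slide)

@dataclass
class Student:
    student_id: str
    grade: int
    gates_pre: int   # Gates standard score, beginning of year
    gates_post: int  # Gates standard score, end of year

def struggling_learners(roster):
    """Students whose fall Gates standard score is 85 or less."""
    return [s for s in roster if s.gates_pre <= GATES_CUTOFF]

roster = [Student("A1", 6, 82, 90), Student("A2", 6, 101, 104), Student("A3", 7, 79, 85)]
print([s.student_id for s in struggling_learners(roster)])  # ['A1', 'A3']
```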

Page 11.

Findings – Reading (Struggling students, Gates)

Page 12.

Findings – Reading (Struggling students, Gates, Year 1)

• For students identified as struggling in Year 1, percentile changes in Gates from pre-test to post-test were notable:
  • 6th grade: growth from the 3rd to the 6th percentile; n = 138
  • 7th grade: growth from the 4th to the 10th percentile; n = 124
  • 8th grade: growth from the 5th to the 9th percentile; n = 104

Page 13.

Findings – Reading (Struggling students, Gates, Year 2)

• Also, for students identified as struggling in Year 2, percentile changes in Gates from pre-test to post-test were notable (a summary sketch follows the list):
  • 6th grade: growth from the 6th to the 12th percentile; n = 118
  • 7th grade: growth from the 2nd to the 7th percentile; n = 209
  • 8th grade: growth from the 5th to the 6th percentile; n = 154
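A minimal sketch of how a per-grade pre/post percentile summary like the two findings slides above could be produced. The records are hypothetical, and the deck does not say which summary statistic was used; the median is assumed here:

```python
from collections import defaultdict
from statistics import median

# Hypothetical (grade, pre_percentile, post_percentile) records for struggling learners
records = [(6, 3, 6), (6, 2, 7), (7, 4, 10), (7, 5, 9), (8, 5, 9), (8, 4, 8)]

by_grade = defaultdict(list)
for grade, pre, post in records:
    by_grade[grade].append((pre, post))

for grade in sorted(by_grade):
    pres, posts = zip(*by_grade[grade])
    print(f"Grade {grade}: median pre {median(pres)} -> median post {median(posts)} percentile; n={len(pres)}")
```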

Page 14.

Comparison Schools

• Created a “focal, local, comparison group”
• Schools were matched on (in order): number of students, Eco Dis percentage, bilingual/LEP percentage, mobility percentage, ethnic makeup of student population, and historical TAKS scores (see the matching sketch below)
• Match schools were kept within the district where possible
• All match schools were within Region 13
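A minimal sketch of one way such ordered matching could work: nearest-neighbor matching that weights the criteria in the slide's priority order and prefers same-district candidates. The deck does not describe the actual procedure; the weights, field names, and data here are all assumptions:

```python
# Match fields in the slide's priority order; earlier fields get larger weights (assumed).
FIELDS = ["n_students", "pct_eco_dis", "pct_lep", "pct_mobility", "pct_minority", "hist_taks"]
WEIGHTS = [32, 16, 8, 4, 2, 1]

def distance(a, b):
    """Weighted absolute difference across the ordered match criteria,
    with each field scaled by the project school's value so units are comparable."""
    return sum(w * abs(a[f] - b[f]) / max(abs(a[f]), 1) for w, f in zip(WEIGHTS, FIELDS))

def best_match(project_school, candidates):
    # Prefer candidates in the same district, as the slide notes; fall back to the full pool.
    same_district = [c for c in candidates if c["district"] == project_school["district"]]
    pool = same_district or candidates
    return min(pool, key=lambda c: distance(project_school, c))

project = {"district": "D1", "n_students": 640, "pct_eco_dis": 71.0, "pct_lep": 18.0,
           "pct_mobility": 22.0, "pct_minority": 68.0, "hist_taks": 2100}
candidates = [
    {"district": "D1", "n_students": 655, "pct_eco_dis": 69.0, "pct_lep": 20.0,
     "pct_mobility": 24.0, "pct_minority": 66.0, "hist_taks": 2085},
    {"district": "D2", "n_students": 610, "pct_eco_dis": 75.0, "pct_lep": 15.0,
     "pct_mobility": 19.0, "pct_minority": 72.0, "hist_taks": 2120},
]
print(best_match(project, candidates)["district"])  # D1: same-district candidate wins
```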

Page 15.

Findings – Reading (All students)

• No significant effects on reading yet for schools overall
• Posttest (STAAR 2012 & STAAR 2013) means adjusted for pretest (TAKS 2011 & STAAR 2012, respectively) for 7th and 8th graders
• Pooled standard deviations and posttest adjusted means were used where available
• Within-grade effect sizes ranged from -0.09 to 0.05 (see the effect-size sketch below)
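Given the pooled standard deviations and adjusted means mentioned above, the effect size here is presumably a standardized mean difference: the adjusted posttest mean difference divided by the pooled standard deviation. A minimal sketch with hypothetical numbers on the STAAR scale:

```python
import math

def pooled_sd(sd1, n1, sd2, n2):
    """Pooled standard deviation of two groups."""
    return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def effect_size(adj_mean_project, adj_mean_comparison, sd_pooled):
    """Standardized mean difference on adjusted posttest means."""
    return (adj_mean_project - adj_mean_comparison) / sd_pooled

# Hypothetical 7th-grade STAAR reading values, not the study's actual numbers
sd_p = pooled_sd(sd1=120.0, n1=480, sd2=118.0, n2=495)
print(round(effect_size(1558.0, 1552.0, sd_p), 2))  # ~0.05, i.e. within the reported range
```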

Page 16.

Evaluation Summary

• Implementation process: project schools are being trained and supported in their implementation as intended
• Output metrics (PD goals, practice usage) are generally being achieved and are acceptably consistent across schools
• For struggling students, trends are positive and noteworthy:
  • On proximal measures of reading, students who continue to struggle from year to year outpace expected annual growth (as determined by national norms)
• In the first and second years of implementation, as expected, no statistical difference in distal outcomes (state assessments) between project and match schools
• Student academic growth was much greater in most RAISEup schools than in comparison schools