
Value Added for Teacher Evaluation in the District of Columbia

Robin Chait, Office of the State Superintendent of Education

Anna Gregory, District of Columbia Public Schools

Eric Isenberg, Mathematica Policy Research

Association for Education Finance and Policy 37th Annual Conference

March 16, 2012

Value Added

Statistical model predicts student achievement

Accounts for pretests and student characteristics

Ranks teachers relative to an average teacher

Teacher value added = students' actual end-of-year test scores – students' predicted end-of-year test scores
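As a rough illustration of this definition (a simplified sketch, not the actual DCPS specification, which is more elaborate): regress end-of-year scores on pretests and student characteristics, then average each teacher's students' actual-minus-predicted gaps. The file and column names below are hypothetical.

    import pandas as pd
    from sklearn.linear_model import LinearRegression

    # Hypothetical student-level file; these column names are illustrative, not the DCPS layout:
    # student_id, teacher_id, posttest, pretest_math, pretest_read, frl, ell, sped
    df = pd.read_csv("students.csv")

    X = df[["pretest_math", "pretest_read", "frl", "ell", "sped"]]
    y = df["posttest"]

    # Predict each student's end-of-year score from pretests and characteristics
    model = LinearRegression().fit(X, y)
    df["predicted"] = model.predict(X)

    # Teacher value added: average actual-minus-predicted gap across the teacher's
    # students, centered so the average teacher is zero
    df["gap"] = df["posttest"] - df["predicted"]
    va = df.groupby("teacher_id")["gap"].mean()
    va = va - va.mean()
    print(va.sort_values(ascending=False).head())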

Key Points

Value added in DCPS evaluation system

Implementation requires sufficient capacity

Communication strategy is vital

Value added is worth the investment

Where We Were in 2007

8th grade reading proficiency (2007 NAEP): 12%

Teachers meeting or exceeding expectations: 95%

Why Value Added for DCPS?

Fairest way to evaluate teachers

Objective, data-based measure

Focused on student achievement


Value Added in DCPS Evaluation System

Individual value-added measures: 50 percent of eligible teachers’ IMPACT scores

IVA: Individual value added

TLF: Teaching and learning framework (classroom observations)

CSC: Commitment to school community

SVA: School value added

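A minimal sketch of how such a composite might be computed for a teacher with an individual value-added score. Only the 50 percent IVA weight comes from the slide; the remaining weights, and the 1.0-4.0 component scale that maps to a 100-400 total, are illustrative assumptions.

    # Assumed weights: only the 0.50 for IVA is stated in the slide;
    # the TLF/CSC/SVA split is illustrative.
    WEIGHTS = {"IVA": 0.50, "TLF": 0.35, "CSC": 0.10, "SVA": 0.05}

    def impact_score(components):
        """Weighted average of 1.0-4.0 component scores, rescaled to a 100-400 total."""
        return 100 * sum(WEIGHTS[name] * score for name, score in components.items())

    print(impact_score({"IVA": 3.2, "TLF": 3.5, "CSC": 4.0, "SVA": 2.8}))  # 336.5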

IMPACT Is High Stakes

Highly effective: performance pay

Ineffective (one year): subject to separation

Minimally effective (consecutive years): subject to separation

[Chart: Overall Performance Distribution, PPEP vs. IMPACT; score axis marked at 100, 175, 250, 350, 400; n = 3,469]
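The axis breaks in that chart suggest rating bands. A minimal sketch of such a mapping, assuming (an assumption from the tick marks, not a cut point quoted in the slides) that 175, 250, and 350 separate the four ratings:

    def impact_rating(score):
        """Map a 100-400 IMPACT score to a rating band.
        Cut points are assumed from the chart's axis breaks, not quoted from policy."""
        if score < 175:
            return "Ineffective"
        if score < 250:
            return "Minimally Effective"
        if score < 350:
            return "Effective"
        return "Highly Effective"

    print(impact_rating(336.5))  # "Effective" under these assumed cut points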

Value Added in DC

Date                      Value added
2009                      DCPS (trial run)
2009-2010                 First year of IMPACT in DCPS
2010-2011                 Second year of IMPACT in DCPS
October 2011 - present    Third year of IMPACT in DCPS; first year of Race to the Top for DCPS and DC charter schools

Help for DC Public Schools

Mathematica Policy Research

Technical Advisory Board [2012]
– Steve Cantrell, Gates Foundation
– Laura Hamilton, RAND Corporation
– Rick Hanushek, Stanford University
– Kati Haycock, Education Trust
– David Heistad, Minneapolis Public Schools
– Jonah Rockoff, Columbia Business School
– Tim Sass, Georgia State University
– Jim Wyckoff, University of Virginia

Mathematica’s Work with DC Schools

Challenges

Consider face validity, incentive effects

Teacher-student link data can be challenging

All data decisions shared with district

Timeline must allow DCPS to transition out poor performers, hire new teachers


No One-Size-Fits-All Value Added Model

Choosing student characteristics: communications challenge for race/ethnicity

Multiple years of data: bias/precision trade-off

Joint responsibility for co-teaching
– Cannot estimate a model of separate teacher effects
– Can estimate a “teams” model, but should team estimates count? (see the sketch after this list)

Comparing teachers of different grades
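One way to make the “teams” idea concrete, as a minimal sketch with hypothetical data: each co-taught student is attributed to a joint team identifier, and the regression step shown earlier then estimates an effect for the team rather than for either teacher alone.

    import pandas as pd

    # Hypothetical roster: each row is one student-teacher link for the same class
    links = pd.DataFrame({
        "student_id": [1, 1, 2, 3, 3],
        "teacher_id": ["T_A", "T_B", "T_A", "T_A", "T_B"],
    })

    # Attribute each student to a "team": the sorted set of linked teachers.
    # Solo-taught students form one-teacher teams; co-taught students form joint teams.
    teams = (links.groupby("student_id")["teacher_id"]
                  .apply(lambda t: "+".join(sorted(t)))
                  .rename("team_id"))

    # team_id then replaces teacher_id in the earlier regression step, so the
    # estimate belongs to the team (e.g. "T_A+T_B"), not to either teacher alone.
    print(teams)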

Roster Confirmation

Teacher-student links critical for value added

Administrative data can be challenging
– Specialized elementary school teachers
– Co-teaching
– Pull-out and push-in programs
– Midyear student transfers

Teachers surveyed to confirm administrative roster data (Battelle for Kids); see the sketch below
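As a rough sketch of the confirmation step (hypothetical files and column names, not the Battelle for Kids format): keep only the administrative teacher-student links that teachers verified.

    import pandas as pd

    # Hypothetical inputs: administrative links and teacher survey responses
    admin = pd.read_csv("admin_links.csv")     # student_id, teacher_id, course
    survey = pd.read_csv("roster_survey.csv")  # student_id, teacher_id, confirmed, claimed_share

    # Keep only links the teacher confirmed; survey-only additions would need separate review
    merged = admin.merge(survey, on=["student_id", "teacher_id"], how="left")
    confirmed = merged[merged["confirmed"].fillna(False).astype(bool)]

    # claimed_share (0-1) could later weight each student's contribution to the teacher's estimate
    print(confirmed[["student_id", "teacher_id", "claimed_share"]].head())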

Business Rules: Documenting Data Decisions

Every data decision defined, discussed, documented beforehand

Let OSSE, DCPS review all decisions

Document entire process

Make quick progress when final data arrive


Production: Meeting Timelines, Ensuring Accuracy

October data: formulate business rules

February data
– Establish data-cleaning programs
– Begin trial runs from analysis file to final output

April data: final student data in trial runs

June (test score) data: produce final results

Perspective of State Education Agency

Race to the Top

Federal competition between states

Required student achievement to contribute 50% of teacher evaluation score

Decision to use DCPS value-added model for all eligible DC teachers

Brought DCPS and charter schools together

Each charter school LEA has its own evaluation system, used to inform personnel decisions

Common Decision-Making

Need to make decisions on value added
– Quickly, to meet the production schedule
– Informed by the best available data
– With buy-in from charter schools and DCPS

Technical Support Committee (TSC)
– Six members: five charter, one DCPS
– Meets periodically
– Consensus decisions sought

Data Infrastructure

Most data elements for value added exist . . . but not necessarily collected on the right schedule

Student background characteristics
– Collected twice a year for AYP purposes
– Value added needs three collections a year, on an earlier schedule

Need Capacity Within District

Do not just hire a contractor

Need dedicated staff to answer questions
– Data team
– Technical Support Committee

Communicating Results to DC Teachers

Communication Strategy

Value added hard to understand
– Requires a strong statistical background
– Final information is hard to connect to familiar test scores
– Different from other student achievement measures teachers commonly use

Communication tools
– Guidebooks
– Information sessions

What Factors Affect a Student’s Achievement?

Teacher’s level of expectations
Teacher’s pedagogical expertise
Teacher’s ability to motivate
Teacher’s content knowledge

Student’s prior learning
Student’s disability (if any)
Student’s English proficiency
Student’s resources at home

All contribute to student achievement, as measured by the DC CAS

Value added isolates the teacher’s impact on student achievement.
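One way to write this idea as an equation (a simplified sketch, not the exact DCPS specification):

    y_i = \beta_0 + \beta_1 \, y_i^{pre} + \gamma' X_i + \theta_{j(i)} + \varepsilon_i

where y_i is student i's DC CAS score, y_i^{pre} the pretest, X_i the student characteristics above, \theta_{j(i)} the effect of the student's teacher j, and \varepsilon_i everything else. The estimated \theta_j, centered on the average teacher, is the teacher's value-added score.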

Initiatives Under Development

Student-level output for DC teachers
– Would show pretest, predicted posttest, and actual posttest score for each student (see the sketch after this list)
– May be in graphical format

Intermediate value-added scores
– Individual value-added scores based on intermediate tests
– Could be given to teachers midyear
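Continuing the earlier sketch (hypothetical column names and teacher ID, not the planned DCPS report format), such student-level output could be assembled from the same data:

    # Assumes `df` from the earlier sketch, with student_id, teacher_id,
    # pretest_math, posttest, and predicted columns
    report = (df.rename(columns={"pretest_math": "pretest",
                                 "posttest": "actual_posttest",
                                 "predicted": "predicted_posttest"})
                [["teacher_id", "student_id", "pretest",
                  "predicted_posttest", "actual_posttest"]])

    # One table per teacher, shown here for a hypothetical teacher ID
    print(report[report["teacher_id"] == "T_A"].to_string(index=False))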

Conclusions

Implementing value added requires . . .
– Availability and accessibility of current data
– Confirmation of teacher-student links
– Careful planning of the production process
– Sufficient capacity within the local and/or state education agency to interact with the value-added contractor

Teacher buy-in is not a given; a communication strategy is vital

Properly implemented, value added is worth the investment
– Fairest measure of teacher effectiveness
– Provides data for answering research questions