Page 1

Teacher Evaluation in New York State . . .

. . . and what it means for teachers of non-tested subjects

Johanna J. Siebert, Ph.D.
NAfME Symposium on Assessment

June 24-25, 2012

Page 2

NYS Initiatives

Common Core Standards – What do we want our students to know and be able to do?

Data Driven Instruction – How will we know?

APPR (Annual Professional Performance Review) – Ensuring high quality instruction in every classroom

Page 3

Annual Professional Performance Review

Inspired by Race to the Top Legislation
New APPR a condition of the award
Some portions of the evaluation process negotiated between the district and its teacher union
Some portions state-mandated
Evaluation process results in teacher “HEDI” score:
◦ Highly Effective
◦ Effective
◦ Developing
◦ Ineffective
Can lead to expedited 3020-A process for teacher termination

APPR – A 100-Point System

Page 4

60 Points
• NYS, National, and/or District Teaching Standards
• Multiple Supervision Models, including performance rubrics
• Observations, surveys, evidence of Student Learning

20 Points: Student Growth
• Growth on State Assessments – State-provided score for grades 4-8 ELA, Math, OR
• Growth Using Comparable Measures – Student Learning Objectives (SLOs)

20 Points: Local Assessment
• Student Achievement – locally determined measures across grade levels, teams, building
• Can use third-party State-approved assessments – can also measure growth
• District- or BOCES-developed assessment (rigorous, comparable)

Multiple Measures of Effectiveness for Teacher Evaluation
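The 60/20/20 breakdown above adds up to a 100-point composite. A minimal sketch of how the three parts could combine and map to a HEDI label, assuming hypothetical point values and illustrative cut scores (the actual bands come from regulation and local negotiation, not from this presentation):

# Hypothetical sketch: combine the 60/20/20 APPR components and map the
# composite to a HEDI label. Cut scores below are illustrative assumptions only.

def composite_score(teacher_practice_60, student_growth_20, local_assessment_20):
    # Sum the three APPR components (60 + 20 + 20 = 100 possible points).
    return teacher_practice_60 + student_growth_20 + local_assessment_20

def hedi_rating(total):
    # Map a composite score to a HEDI label using made-up cut scores.
    if total >= 91:
        return "Highly Effective"
    if total >= 75:
        return "Effective"
    if total >= 65:
        return "Developing"
    return "Ineffective"

print(hedi_rating(composite_score(52, 16, 15)))  # 83 points -> "Effective"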

Page 5

NY – districts can make individual decisions regarding:
◦ Specific supervision model to be used
◦ Priorities and academic need
◦ Which subjects/teachers will use state-provided ELA/Math scores and which will have SLOs
◦ In-house processes for SLO assessing, scoring, implementation

Other States: similar conditions; entire state interprets uniformly

District-wide Decision Making

Page 6

The First 60%

Page 7

Select a teacher practice rubric from the State-approved list or apply for a variance:
Danielson’s Framework for Teaching
Marzano’s Causal Teacher Evaluation Model
NYSTCE Framework for the Observation of Effective Teaching
NYSUT Teacher Practice Rubric

Collective Bargaining considerations

Collective Bargaining considerations

Step 1

Page 8

Agree on the definition of “classroom observation” and any additional measures in the 60-point category (40 pts must come from multiple observations)

Choose one or more of the following other measures of teacher practice:

A portfolio or evidence binder (student work or teacher artifacts)

Feedback from students, parents, and/or other teachers using a survey

Professional growth goals using self-reflection (maximum of 5 points)

Step 2

Page 9

Observation = 2 learning walks (15-minute informal walk-through, follow-up conversation)

OR A formal class-length observation

Multiple “observations” needed (2)

Could be:
2 class-length observations
1 class-length observation + 2 learning walks
4 learning walks

We negotiated . . .

Page 10

A portfolio or evidence binder (student work and/or teacher artifacts)

Professional growth goals using self-reflection (Professional Learning Plan, PLP)

WCSD selects 9 components from the 4 Domains

Teachers select an additional 5 components

And . . .

Page 11

“The governing body of each school district and BOCES is responsible for ensuring that evaluators have appropriate training—including training on the application and use of the rubrics—before conducting an evaluation.  The governing body is also responsible for certifying a lead evaluator as qualified before that lead evaluator conducts or completes a teacher’s or principal’s evaluation. ”

NYS Commissioner’s Regulations

Who evaluates whom?

Page 12

Responsible for carrying out observations, summative evaluations

Must be trained and calibrated across each school district in selected model

Knowledge of model
Walk-through, observation protocols
Evidence-based reports
Use, knowledge of specific rubrics
Forms, feedback for teachers
Professional Learning Plans

Lead Evaluators

Page 13

Districts design a plan for:
Training for all evaluators
Certification for lead evaluators
Role clarification
Subcomponent and overall scoring
Improvement plans
Knowledge of appeals procedures (i.e., NYSED model appeals procedure in guidance)

How do districts calibrate supervision?

Page 14

Develop Professional Learning Plan (PLP)

Attention to multiple professional areas (4 Domains):
Preparation
Classroom Environment
Instruction
Reflection, Professional Responsibilities

Student-Centered Aspects:
Individual SLOs/Student Growth
Student achievement
Data-driven instruction

Select observation protocol:
◦ Traditional observation (class length)
◦ Walk-through/Learning Walk

Teachers

Page 15

More frequent interactions between teacher/supervising administrator

Mid-October – describe and set PLP, content-area SLO(s)

November-December – observation and follow-up

January – midterm check-in on PLP, SLO progress

February-March – observation and follow-up

May-June – summative evaluation conference

Proposed Calendar of Supervision

Page 16

Districts are choosing specific models, scheduling and implementing administrator training

Administrators, teachers at various stages:
Learning new protocols
Scheduling workshops
Goal-setting
Setting district calendars

To begin in September, 2012

Summer Work

Page 17

Student Learning Objectives

The First 20%

Page 18

A student learning objective:
Is an academic goal for a teacher’s students that is set at the start of a course
Represents the most important learning for the year (or semester, where applicable)
Is specific and measurable, based upon available prior student learning data
Is aligned with Common Core AND State or National Standards, as well as any other school and district priorities
Represents growth from beginning to end of the course

Teachers’ scores are based upon the degree to which their goals are attained.


What IS a SLO?

Page 19

More on SLOs
Need common assessments for individual growth across grade levels, content
50% rule, applied to total student load
Teacher sets individual growth targets per student
Cross-scoring of summative assessments needed, to ensure equity in HEDI scoring (need for inter-rater reliability)

Page 20

Any teacher who does not use a state growth measure (ELA/Math assessments, gr 4-8)
“Non-tested” subjects (70%)
50%+ of student load
Full-credit courses carry more weight than part-credit or semester courses
Teacher will likely have multiple SLOs
Teacher tracks, monitors progress of each student in SLO classes to impact growth

Who Needs a SLO, and How Many?
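A minimal sketch of one common reading of the 50% rule above: start from the largest course enrollments and add SLOs until at least half of the teacher’s total student load is covered. The selection rule and the enrollment figures are illustrative assumptions, not prescriptions from this presentation:

# Hypothetical sketch: pick which courses need SLOs so that covered enrollment
# reaches at least 50% of the teacher's total student load.
def courses_needing_slos(course_enrollments):
    # course_enrollments: dict of course name -> number of students
    total_load = sum(course_enrollments.values())
    covered, selected = 0, []
    # Take the largest courses first until half the load is covered.
    for course, n in sorted(course_enrollments.items(), key=lambda kv: -kv[1]):
        if covered * 2 >= total_load:
            break
        selected.append(course)
        covered += n
    return selected

# A music teacher's load: the band SLO alone covers 120 of 215 students (> 50%).
print(courses_needing_slos({"Concert Band": 120, "Music Theory": 25,
                            "Guitar": 40, "Jazz Ensemble": 30}))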

Page 21

For Growth, Start with EVIDENCE

Teacher sets individual student baseline using:
Historical data (e.g., prior year’s grades)
Pre-assessment performance

Teacher predicts individual student growth in his/her course
Sets individual growth targets for students
Post-assessment given at end of course (can be a state assessment)
Data analysis yields success rate of students, and the teacher’s score on this section

HOW do you design the SLO?
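A minimal sketch of the evidence-to-score sequence above, assuming hypothetical student data and a simple baseline-plus-expected-gain target rule (the presentation leaves the target-setting rule to the teacher):

# Hypothetical sketch: baseline -> individual target -> post-assessment ->
# percent of students meeting targets, which then feeds the HEDI scoring band.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    baseline: float      # historical data or pre-assessment score
    post_score: float    # end-of-course (post-assessment) score
    target: float = 0.0  # individual growth target set by the teacher

def set_targets(students, expected_gain):
    # Illustrative rule only: every student's target is baseline + expected gain.
    for s in students:
        s.target = s.baseline + expected_gain

def percent_meeting_targets(students):
    # Share of students whose post-assessment met or exceeded their target.
    met = sum(1 for s in students if s.post_score >= s.target)
    return 100 * met / len(students)

roster = [StudentRecord(60, 78), StudentRecord(72, 85),
          StudentRecord(55, 58), StudentRecord(80, 92)]
set_targets(roster, expected_gain=10)
print(percent_meeting_targets(roster))  # 3 of 4 met targets -> 75.0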

Page 22

New York State Assessments
There are NO state assessments in the arts
NO common opportunity-to-learn standards

Regional BOCES are sponsoring writing sessions to design SLOs and assessments in the arts

Local districts design, implement their own

Page 23

… are using ELA and/or Math state test scores IN PLACE OF assessment data in non-tested subjects (the district-based SLO model)

Due to:
Lack of common assessments
Lack of inter-rater reliability
Lack of content oversight by content specialist
Lack of effective data system for monitoring and tabulating results
Ambitious timeline for implementation

Some districts…

Page 24

SLOs must include . . .

NYS Learning Objective per grade selected
Specific population/grade level
Learning content
Interval of instructional time (full year, usually)
Evidence to be used/collected (three forms):
◦ Historical
◦ Pre-assessment
◦ Post-assessment
Individual students’ baseline

Page 25

continued . . .
Individual student targets (set by teacher)
Teacher goal set
Teacher scoring range, by HEDI ratings
Rationale for the SLO and targets

Eventually:
Final individual student growth score
% of students meeting individual targets
Student % aligned with specific scoring band for HEDI rating

Page 26

Local Achievement Assessment

The Other 20%

Page 27

Must be common across districts for grade level and content areas

Should represent summative measure of the course

Not to be applied to the SLO course(s)
Teacher sets target for students
Can NOT be scored by the teacher of record

Achievement Measures – the Last 20%

Page 28

. . . Between SLO and Achievement measures?

SLO involves setting a target for students based upon previous performance data, i.e., measuring students’ growth; applied to 50% of student load

Achievement does not measure “growth” over the length of the course, but teacher needs to set group target; applied to one other course

What is the difference . . .

Page 29

Addition of Value-Added Growth Model

Inclusion of other data in targeting growth:
Demographic
Graduation
Attendance

Planned for 2013-2014 school year

Still to come

Page 30

Page 31

Plans to release individual teacher evaluation ratings to the public (HEDI)

Highly Effective
Effective
Developing
Ineffective

“Teachers evaluations can be viewed as the equivalent of a Carfax report, empowering parents to attempt to avoid the ‘lemons.’ “

B. Jason Brooks, Foundation for Education Reform and Accountability

Release of Teacher Evaluation Information

Page 32

June
Determine next year’s SLO courses, populations
Design pre-, summative assessments

July and August
Summer workshops, planning with like SLO teachers
Calibrate scorers
Design post-assessments, local common measures
Administrators’ training in APPR forms, protocols

September
Meet students, get historical achievement data
Administer, grade pre-assessments
Set goal targets for students and self
Meet with administration to review goals, etc.

October
Set SLOs, student targets
Start applying strategies to gain student growth!

Timeline

Page 33

Wish Us Luck!