DPAS-II PowerPoint Presentation for the Delaware State Board of Education, June 12, 2014

June 2014

Continuous Improvement: Proposed Revisions to DPAS-II for Teachers, Specialists & Administrators

~June 2014

|

Overview

• The Department’s History of Collaboration with Educators

• 2013-2014: Continued Track Record of Engagement

• Stakeholder Engagement Process:
  • Data Review (Summer/Fall)
  • Original broad scope of proposed changes (Winter)
  • Refined Proposed Regulation 108A
  • Refined Proposed Regulations 106A, 107A

• Impact of Changes to 106A:
  • Ratings for Teachers
  • Impact on Students
  • Comparison to Proposed Regulation for Administrators

• Examples From the Field: Impact of Strong Evaluation Systems

2

|

DOE History of Collaboration with Educators

We fundamentally believe in the power of educators informing state policy decisions. Nowhere has this been more evident than in the realm of educator evaluation.

In addition to the many ways the DDOE has supported and improved the efforts focused on Components I-IV, the following are TWENTY significant examples of where the Department has listened to the feedback of DSEA and educators and been responsive regarding Component V:

3

|

Collaboration and Responsiveness to Educators: Examples

1. Originally (2010), the Department intended to begin the implementation of Component V with English & Mathematics only, seeking an incremental implementation for all other educators. DSEA advocated for the inclusion of all educators from the beginning of full implementation. The Department changed course and included all educators.

2. By responding to the feedback of including all educators, and the feedback that everyone should have multiple measures available, the Department (2011-2012) allocated resources to have the state’s educators build hundreds of assessments and growth goals. The state secured contractual help to aid in this process.

3. The entire DPAS-II system with multiple measures needed to be implemented in the 2011-2012 school year, as the state had promised in its grant. DSEA and educators involved in building assessments cited implementation delays (multiple measures were not ready for everyone) as a reason not to implement as promised in 2011-2012. The Department responded by delaying implementation for a full year via federal amendment.

4. The Department listened to feedback and brought together over 600 educators (2011-2012) to build the assessments in every grade and subject area. The result: nearly 300 assessments built for educators, by educators.

5. Since then, the Department has been asked to refine and improve the 300 measures, and the Department has responded by doing so (2012-2013), with plans to continue refining all assessments in the future.

  4

|

Collaboration and Responsiveness: Cont’d (2)

6. The Department received feedback that additional supports were needed during initial implementation (2012). The Department created hotlines so that educators could call DDOE during the week, built online platforms to support implementation, created focus groups around the state, hosted Mid-Year feedback sessions with Secretary Murphy, and launched a revised annual evaluation that surveys every educator and asks for their feedback and opinions (over 4,000 educators responded).

7. Initially, the inclusion of a “school-wide measure” was the path chosen as part of Component V (2011-2012). Initial input and feedback from educators and the state’s technical advisory group requested the inclusion of a school-wide measure to help foster collaboration within buildings. Several months later, opinion shifted around the benefits of the “school-wide measure”. The Department listened and removed the school-wide measure despite many educators still believing it was important.

8. After this, DDOE considered allowing schools/districts to use a school-wide measure. But DSEA stated that no school could ever use the total student body of a school because not every teacher had control of how all students perform. We listened and acted immediately, stripping the policy of the school-wide measure. (2012)

9. With the removal of the school-wide measure, the Department refined its policy and created a Roster Verification System (which was field-tested by educators and had their feedback incorporated) that has served as the model for other states. (2012-2013)

 5

|

Collaboration and Responsiveness: Cont’d (3)

10. As a result, educators and administrators now have the authority to determine for which students they are held accountable. The Department listened when allowing educators full autonomy in choosing which students would “count.” DDOE was criticized for not providing more guidance on how to verify student attendance; in response, the Department has provided manuals with guidance. (2013) The Department has also been asked to not provide specific guidance on educator attendance.

11. In the summer of 2012, an important decision needed to be made about what percentage of students needed to meet their individualized student growth targets for an educator to be considered “Exceeds” or “Satisfactory”. The Department listened when setting the target at just 50% for “Satisfactory”, and further responded to concerns by creating a policy allowing for “administrator discretion” when 35%-49% of students met targets. (2012)

12. Even with discretion provided within one measure (DCAS), the Department listened to feedback and went a step further to account for any perceptions of unfairness. Based upon DSEA feedback, a rating model was created whereby an educator can be rated “Unsatisfactory” in one measure and “Satisfactory” in the other, and still be “Satisfactory” overall. (2012)

 

6

|

Collaboration and Responsiveness: Cont’d (4)

13. After year one of full implementation, the Department received feedback that targets for special education students were “too high”. The result: the Department allocated resources in the Summer of 2013 to build a new model for special education students, resulting in a new set of targets for 2013-2014. (2013)

14. Based upon similar feedback over two years, the Department created separate targets for English Language Learners as well. The Department also implemented a set of targets for students who were classified as both ELL & SPED. (2013)

15. The Department received feedback that assessment schedules should be extended as late as possible in the school year. We responded by extending the assessment opportunity and creating a technology solution to immediately provide student growth data to teachers and administrators so they could engage in the summative conference at the end of the school year. (2012-2013)

16. Along the way, school leaders have also noted that more training and coaching may be necessary. Originally, the Development Coach initiative was slated for two years. After listening to feedback, the Department has extended and funded the Development Coach initiative for a third and fourth year (2011-2015). We also integrated feedback from school leaders that the state’s training should be grounded in videos; all trainings now are. (2012-Present)

 

7

|

Collaboration and Responsiveness: Cont’d (5)

17. Along the way, DSEA and other educators expressed that certain schools were not implementing appropriately. The result: DDOE listened and launched an ambitious DPAS-II Monitoring process (2012), monitoring 75 schools and district offices over the past eighteen months and providing feedback and directives. If a district’s rating is low, DDOE returns within six weeks.

18. In the early part of the 2012-2013 school year, many educators began to cite assessment quality (namely, of those built by teachers) as a problem with the system. DDOE listened, and over the past year has allowed districts/schools/teachers to work collaboratively and submit their own assessments. (2013)

19. Educators provided feedback to the Department that they were struggling with goal-setting. The Department had granted all goal-setting prerogatives to schools and educators based upon educator feedback (2013), and responded by providing ongoing training focused on goal-setting.

20. In the latter part of the 2012-2013 school year, some educators began to say that there must be a better way to construct the educator evaluation system. The Department listened, has supported legislation based upon this feedback, and has opened an application process that allows districts and charters to submit different evaluation systems. (2013-2014)

8

|

Continuous Improvement: Engagement With Educators, 2013-2014

July
Delaware Association of School Administrators - TLEU Meeting

August
Delaware State Education Association - TLEU Meeting

September
Chiefs Meeting
Delaware Principals Advisory Group
Delaware State Education Association - TLEU Meeting

October
Chiefs Meeting
Delaware State Education Association - TLEU Meeting
Delaware Teachers of the Year Statewide Advisory Board to the Secretary of Education
DPAS-II Review Committee - Teachers and Specialists
Teaching and Learning Cadre Meeting

November
Delaware Principals Advisory Group
Delaware Association of School Personnel Administrators

9

|

Engagement – 2013-2014 cont’d (2)

December
Delaware Association of School Personnel Administrators
Delaware Principals Academy
Delaware Principals Advisory Group
Delaware State Education Association - TLEU Meeting
Delaware Technical Advisory Group (DE-TAG)
DPAS-II Review Committee - Administrators
DPAS-II Review Committee - Teachers and Specialists

January
Delaware Principals Advisory Group
Delaware State Education Association - TLEU Meeting
Delaware Teachers of the Year Statewide Advisory Board to the Secretary of Education
DPAS-II Review Committee - Teachers and Specialists
DPAS-II Review Committee - Administrators
Mid-Year Conversations with Secretary Murphy

10

|

Engagement – 2013-2014 cont’d (3)

February
Chiefs Meeting
Community of Practice: Administrators
Delaware Association of School Personnel Administrators
Delaware Principals Advisory Group
Delaware State Education Association - TLEU Meeting
DPAS-II Review Committee - Administrators
DPAS-II Review Committee - Teachers and Specialists
Mid-Year Conversations with Secretary Murphy
Teaching and Learning Cadre Meeting

March
Community of Practice: Administrators
DACTE Summit on Teacher Education
Delaware Association of School Personnel Administrators
Delaware Principals Advisory Group
Delaware State Education Association - TLEU Meeting
Delaware Teachers of the Year Statewide Advisory Board to the Secretary of Education
DPAS-II Review Committee - Administrators
DPAS-II Review Committee - Teachers and Specialists

11

|

Engagement – 2013-2014 cont’d (4)

April
Community of Practice: Administrators
Delaware Principals Academy - Retreat
Delaware Association of School Administrators - TLEU Meeting
Delaware Principals Advisory Group
Delaware State Education Association - TLEU Meeting
DPAS-II Review Committee - Administrators
DPAS-II Review Committee - Teachers and Specialists

May
Delaware Principals Advisory Group
Delaware Technical Advisory Group (DE-TAG)
DPAS-II Advisory Committee
DPAS-II Review Committee - Administrators

June
Delaware Principals Advisory Group (Scheduled: June 16)
DPAS-II Review Committee (Scheduled: June 24)

12

|

Overview: Process To Reach Proposed Regulations

The Department’s original proposal to stakeholders encompassed a broad range of possible revisions to Regulations 106A, 107A and 108A. These changes were discussed in the DPAS-II Review Committees for Teachers & Specialists and Administrators.

After many discussions and collaborative work group sessions, and based upon years of feedback from the field, the Department proposed significant revisions to Regulation 108A, making broad changes to the evaluation of administrators.

After many discussions and work group sessions, the Department significantly narrowed the proposed revisions to Regulations 106A and 107A, focusing on a few smaller changes.


13

|

Review and Feedback Process for 108A

Feb 18
• Review by the DPAS-II for Administrators Committee of ideas to address in the revision of 108A/system-wide
• Initial results/conversation regarding the pilot rubric that was launched in August 2013 with PM Trainings

Feb 19 - Mar 25
• Additional feedback from Committee members via email
• Feedback from the Community of Practice (CoP) based on rubric review during the pilot year

March 26
• Review by the Committee of the first draft of the regulation
• Community of Practice continues to engage on the Principal Practice Rubric
• Chiefs informed of changes

Mar 27 - Apr 6
• Additional feedback from Committee members via email
• USED informed of potential changes

Apr 7 Meeting
• Review by the Committee of the revised draft of the regulation
• Agreement by the Committee on the changes outlined in the regulation

Apr 8-14
• Final revisions based on Apr 7 Committee feedback
• DDOE staff and legal review
• Submission to the Registrar on 4/15

May 1-31
• Regulation published for public comment

May 2
• DPAS-II Advisory Committee meets to review proposed regulatory changes and has the opportunity to comment to the State Board

May 15 / June 19
• Discussion by the State Board of Education on the regulation
• Action by the State Board of Education on the regulation

|

108A Final Proposed Revisions

15

|

1.0 Effective Date

The proposed amended regulation will be in effect beginning with the 2014-15 school year.

References 1270(f) of the Delaware Code regarding alternative evaluation systems.

|

2.0 Definitions

Student Achievement - Modified to clarify the use of data from new state assessments in administrator evaluations

DPAS II for Administrators Revised Guides - Modified to allow for the creation of up to four guides tailored to different roles; guides for (1) principals, (2) assistant principals, (3) central office administrators, and (4) superintendents are envisioned

Goal-Setting and Mid-Year Conferences - Terms modified to update language and clarify the purpose of these required steps in the appraisal process

|

3.0 Appraisal Process

Annual evaluation for all administrators
- No more automatic differentiation based on administrator experience
- Communicates the importance of continuous improvement for all administrators

Simplification of the appraisal process
- Elimination of the “formative” assessment as a discrete step; modified milestones within the system
- Three requirements: (1) goal-setting, (2) mid-year conference, (3) summative appraisal

|

4.0 DPAS II for Administrator “Guides”

Broader set of administrators to be included in the process of revising the Guides

Names the required content of the Guides:
- Details about appraisal criteria (under each of the major components for evaluation)
- Details about required and recommended steps in the appraisal process
- Guidance related to evidence collection (to be further discussed in trainings)

|

5.0 Appraisal Components and Appraisal Criteria

“Appraisal Criteria” moved to the Guides, not detailed in regulation

Titles and descriptions modified to allow for usage across multiple types of administrators:
- Vision and Goals
- Teaching and Learning
- People, Systems, and Operations
- Professional Responsibilities

|

6.0 Summative Evaluation Ratings

Increased rigor for achieving a “Highly Effective” rating

Four levels for overall ratings: Highly Effective, Effective, Needs Improvement, and Ineffective

- Mirrors the Appraisal Component and Criteria rating system (unlike the Teacher/Specialist system)

- Replaces the current binary system of Satisfactory & Unsatisfactory (at the Component level)

|

8.0 Improvement Plan / 9.0 Challenge Process

Descriptions of the improvement plan process and challenge process simplified

The Improvement Plan process now mirrors the Improvement Plan process for 106A/107A

|

DOE’s Original Scope of Proposed Revisions to 106A/107A

• Allow the “Observation Form” to be utilized upon request
• Specify how C-IV performance data can be collected (e.g., at any time based on observation)
• Create LEA flexibility for a “substitute C-IV” (i.e., surveys)
• Address/revise language in Component I
• Address/revise language in Component III
• Allow walk-through data to be included in the process
• “Needs Improvement” = “Unsatisfactory” Year
• “Highly Effective” = 5/5, including “Exceeds”
• “Effective” = 4/5, including “Satisfactory/Exceeds”
• Proficient ratings required on the rubric to earn Satisfactory

DDOE has attached the document originally utilized for this discussion.

23

|

Review/Feedback Process for 106A/107A

Summer 2013
• Progress, Inc. provides a report and presentation of the DPAS-II Annual Evaluation to the State Board
• Report published on the DDOE website

August & October
• Initial meetings (8/15, 10/1) with the DPAS-II Review Committee to review survey & system data
• Initial ideas for system improvement discussed with stakeholder groups

November 2013
• “Continuous Improvement” report released publicly
• DDOE meets with stakeholder groups that request time to discuss the report’s findings and gather feedback

January 16, 2014
• Secretary Murphy announces Delaware’s intent to not utilize “Smarter” as part of C-V, based upon inquiries from the field
• DDOE submits proposal to USED for approval

February 18, 2014
• Review by the Committee of an extensive list of proposed changes to the Regulations/DPAS-II System
• Committee provides feedback

Mar 24 & Apr 7
• Additional feedback from two forums further narrows the list of proposed changes
• Initial draft of regulations shared
• Near-final versions of the Regulations shared with DSEA/DASA

May 1-31 (submitted 4/15)
• Regulation published for public comment

May 2
• DPAS-II Advisory Committee meets to review all proposed regulatory changes and has the opportunity to comment to the State Board

May 15 / June 19
• Discussion by the State Board of Education on the regulation
• Action by the State Board of Education on the regulation

|

106A and 107A: Final Proposed Revisions After Feedback

25

|

1.0 Effective Date

The proposed amended regulation will be in effect beginning with the 2014-15 school year.

References 1270(f) of the Delaware Code regarding alternative evaluation systems.

|

2.0 Definitions

Student Achievement - Modified to clarify the use of data from state assessments in educator evaluations (notably during the first year of implementation of “Smarter”)

“DCAS Teacher” becomes “Group 1 Teacher”

Announced/Unannounced Observation
- Changes the name of the “form” that can be utilized
- Additional work needed in the Guide to explain the change

Short Observation (added)
- A Summative Evaluation can include a “Short Observation”

Satisfactory/Unsatisfactory Evaluation
- “Needs Improvement” is no longer a Satisfactory rating for licensure

|

5.0 Appraisal Components and Appraisal Criteria

Appraisal criteria modified to allow for “substitutions” to the Professional Responsibilities Component (5.1.4)

“A school district or charter school may substitute a locally determined alternative Appraisal Component, which must be approved by the Department no later than the last day of July of each year.”

|

Proposed Revisions to 106A – Impact on Ratings for Licensure

How do these revisions align with current expectations and expectations moving forward?

In the proposed revisions to 106A, “Needs Improvement” is no longer a Satisfactory rating for the purpose of receiving a continuing license.

This revision is in alignment with the current Pattern of Ineffective Teaching for an Experienced Teacher. It means a teacher must earn a Satisfactory summative rating in two of his or her first three years to be eligible for a continuing license.

|

106A Revisions: Alignment

In the current state, rows 2, 3, and 5 would all allow a novice teacher to receive a continuing license. In the future, these will not allow a teacher to receive a continuing license.

• In the future, as proposed, a Novice must receive Effective summative ratings in 2 out of 3 years to receive a continuing license.

• The minimum performance required to meet an Effective rating is Satisfactory on 2 out of 4 of Components I-IV and Satisfactory on Component V.

• Component V: Not earning Satisfactory in Component V (and therefore not being Effective) means that BOTH measures of student growth were Unsatisfactory, or that at least one was Unsatisfactory and the administrator did not feel it appropriate to use his or her discretion to raise the Component V rating to Satisfactory.

|

The Impact of C-V “Unsatisfactory” On Our Students

31

Students’ average DCAS Math scale score growth, Fall 2012 to Spring 2013, by Teacher of Record’s Measure A rating:
• Unsatisfactory: 44.6
• Satisfactory: 55.4
• Exceeds: 74.3

On average, students on the rosters of a math teacher rated “Exceeds” or “Highly Effective” grew by nearly 30 more points than those in classes with a teacher rated “Unsatisfactory” between Fall 2012 and Spring 2013.

|

DPAS-II: COMPONENT RATING ROLL-UP

Components I-IV: Observation
1. Planning and Preparation
2. Classroom Environment
3. Instruction
4. Professional Responsibilities
(Each rated Satisfactory or Unsatisfactory)

Component V: Student Improvement
• First measure of student growth (50%)
• Second measure of student growth (50%)
(Each measure rated Exceeds, Satisfactory, or Unsatisfactory)

|

COMPONENT V: GROUPS, MEASURES & DISCRETION

MEASURE A
Growth targets are based on DCAS instructional scale scores and student growth targets, which are provided by the DDOE.
• Exceeds: 65% or more of a teacher’s DCAS student growth targets are met.
• Satisfactory: 50-64% of a teacher’s DCAS student growth targets are met.
• Unsatisfactory (discretion): 35-49% of a teacher’s DCAS student growth targets are met; the administrator could upgrade this to a “Satisfactory” rating.
• Unsatisfactory: Less than 35% of a teacher’s DCAS student growth targets are met.

MEASURE B
Growth targets are based on internal assessments developed by educators or external measures approved by DDOE. Targets are set at a conference with the administrator in the fall.

MEASURE C
Growth goals are educator-developed and DDOE-approved; specific to content areas and job assignments.
• Exceeds: The agreed-upon “exceeds” target is met or surpassed.
• Satisfactory: The agreed-upon “satisfactory” target is met or surpassed, but the “exceeds” target is not met.
• Unsatisfactory: The agreed-upon “satisfactory” target is not met.

GROUPS
• Group 1: Instructors of >9 students teaching reading or math in grades 3-10. Measures: A (50%), B (50%).
• Group 2: Instructors of >9 students in grades and subjects other than DCAS reading/math for whom a Measure B is available. Measures: B (50%), C (50%).
• Group 3: Any educator who does not meet the criteria for Group 1 or Group 2. Measure: C (100%).
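Read as a rule, the Measure A bands above are a simple threshold function on the percentage of growth targets met. The following is a minimal Python sketch, offered only as an illustration (the function name and the admin_discretion flag are ours, not part of the regulation), of how those bands map to a rating:

```python
def measure_a_rating(pct_targets_met: float, admin_discretion: bool = False) -> str:
    """Map the percentage of a teacher's DCAS student growth targets met
    to a Measure A rating, using the bands from this slide."""
    if pct_targets_met >= 65:
        return "Exceeds"
    if pct_targets_met >= 50:
        return "Satisfactory"
    if pct_targets_met >= 35:
        # 35-49% band: the administrator may upgrade to "Satisfactory".
        return "Satisfactory" if admin_discretion else "Unsatisfactory"
    return "Unsatisfactory"

# Example: 48% of targets met, with and without administrator discretion.
print(measure_a_rating(48))                         # Unsatisfactory
print(measure_a_rating(48, admin_discretion=True))  # Satisfactory
```

Consistent with item 12 earlier in this deck, a teacher rated “Unsatisfactory” on one measure and “Satisfactory” on the other can still be “Satisfactory” in Component V overall.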

|

Summative Ratings Chart

Overall Summative Ratings: Highly Effective, Effective, Needs Improvement, Ineffective

|

Pattern of Ineffective Teaching: Current State

|

Proposed Revisions to 108A – An Improved Process…

How do these revisions align expectations with administrator ratings?

In contrast to 106A, proposed revisions to 108A create alignment of rigor and terminology for all levels of administrators and their evaluations.

In proposed changes to 108A, there are four ratings at every level (criterion, component, summative): Highly Effective, Effective, Needs Improvement, Ineffective. The ratings are the same for Novice and Experienced Administrators.

|

Four-level System: Components I-IV (Leadership Practice) combine with Student Improvement as follows:

• Highly Effective: E or HE on all four components; Student Improvement of Exceeds
• Effective: E or HE on three components, with no I’s; Student Improvement of Satisfactory (or higher)
• Needs Improvement: E or HE on one or two components, with fewer than 3 I’s; Student Improvement of Satisfactory (or higher)
• Needs Improvement: E or HE on three components; Student Improvement of Unsatisfactory
• Ineffective: E or HE on zero, one, or two components; Student Improvement of Unsatisfactory
• Ineffective: E or HE on zero components; Student Improvement of Satisfactory (or higher)
• Ineffective: 3 or more I’s on Components I-IV; any Student Improvement rating
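Read as a decision rule, the four-level table above fits in a few lines of code. The sketch below is illustrative only, not an official implementation; it assumes that rows such as “E or HE on three” mean “at least three,” which the slide leaves implicit, and the function name is ours:

```python
def administrator_summative(components: list[str], student_improvement: str) -> str:
    """Roll up four Leadership Practice component ratings ("HE", "E", "NI", "I")
    and a Student Improvement rating ("Exceeds", "Satisfactory", "Unsatisfactory")
    into an overall rating, following the rows of the table above."""
    e_or_he = sum(r in ("HE", "E") for r in components)
    i_count = components.count("I")
    sat_or_higher = student_improvement in ("Exceeds", "Satisfactory")

    if i_count >= 3:
        return "Ineffective"              # 3 or more I's, any Student Improvement rating
    if e_or_he == 4 and student_improvement == "Exceeds":
        return "Highly Effective"
    if e_or_he >= 3:
        if sat_or_higher and i_count == 0:
            return "Effective"
        if not sat_or_higher:
            return "Needs Improvement"    # E/HE on three, Unsatisfactory Student Improvement
    if e_or_he >= 1 and sat_or_higher:
        return "Needs Improvement"        # one or two E/HE, fewer than 3 I's
    return "Ineffective"                  # zero E/HE, or Unsatisfactory Student Improvement

# The two worked examples on the next slide:
print(administrator_summative(["HE", "E", "E", "NI"], "Satisfactory"))  # Effective
print(administrator_summative(["E", "NI", "NI", "I"], "Satisfactory"))  # Needs Improvement
```

Run against the two principals on the “Examples” slide that follows, the sketch returns Effective for the first and Needs Improvement for the second, matching the slide’s conclusions.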

|

Examples

Example of Effective Administrator

A third-year principal of a struggling elementary school is Highly Effective in her Teaching and Learning practice, showing strength across the board in this area of her practice and effectively building the capacity of others. She is also Effective in Vision and Goals and in Professional Responsibilities, but she Needs Improvement in her Management of People and Operations.

This principal’s school improved enough to give her a rating of Satisfactory on Student Improvement.

She will be Highly Effective next year if…
1. She maintains all of her current Effective and Highly Effective ratings; AND
2. She moves her Management of People and Operations up to Effective; AND
3. She Exceeds her Student Improvement targets.

Her overall rating is… Effective.

Example of Administrator Who Needs Improvement

A third-year principal of another struggling elementary school is Effective in his Management of People and Operations, with a smooth-running and orderly building. However, he Needs Improvement in his Teaching and Learning practice and his Professional Responsibilities. Further, he lacks a clear vision for improving the school, so he is Ineffective on Vision and Goals.

This principal’s school improved enough to give him a rating of Satisfactory on Student Improvement.

He will be deemed Ineffective next year if…
1. He has Unsatisfactory Student Improvement; OR
2. His Management drops to Needs Improvement; OR
3. His Teaching and Learning and Professional Responsibilities drop to Ineffective.

His overall rating is… Needs Improvement.

|

Why Strong Evaluation Matters

39

|

From Monitoring: “Improved Process = Improved Progress”

40

Bayard Middle School serves a high-needs population in Wilmington. In 2012-2013, only 38% of children were proficient in ELA. After a bumpy Year 1 of DPAS-II implementation, Bayard started Year 2 with a renewed focus on using the DPAS-II process to drive student success. As teachers told DDOE during its visit:

“We knew the administrators would be grading with a more detailed rubric, so we really focused on what the rubric said was good practice.”

“Coming into this year, the process is streamlined so we could make our implementation more advanced, can add more reflection, go more in depth with data, and we understand exactly what we have to do.”

“We met as a team and looked at each person’s student data to see what kids are doing and why, and we saw trends across our classes on areas where kids needed help learning.”

“We could focus our goals on what the kids needed.”

Because of the DPAS-II process, teachers are working together more closely, and children at Bayard are getting special focus on targeted areas of reading. In preliminary results from Spring 2014, proficiency in ELA rose to 42%.

|

A Strong Evaluation System, Implemented Well, Matters: “A Tale of Two High-Need Schools”

41

School A
• School leadership did not articulate goals set for educators in Groups 1, 2, and 3. A teacher stated that she did not know what her targets were for the 2012-2013 school year.
• Administrator choices on teacher ratings did not appear to be grounded in specific or credible rationale.
• The building leader stated that they really didn’t know what they were doing.
• It was unclear how PLCs drive improvement.
• Student Achievement Data: Reading – targets met 43%; Math – targets met 37%

School B
• School leadership clearly articulates rigorous goals, and it is clear that student achievement is at the forefront of goal-setting.
• Comments within evaluations consistently drive student achievement and give specific examples.
• School leadership stated repeatedly that the focus is always “what’s best for kids.” Rigor is the highest priority for the building.
• It was clear that student achievement and professional development are the focus of PLCs.
• Student Achievement Data: Reading – targets met 57%; Math – targets met 44%