Letter to Commissioner ELA 2013


    June 20, 2013

    Dr. John B. King, Jr.

    New York State Education Department

    89 Washington Avenue

    Albany, New York 12234

    Dear Commissioner King:

    I am an assistant principal at the Bronx Center for Science and Mathematics (BCSM), one of the highest performing non-screened

    high schools in the Bronx and in New York City. I have also taught in this school and served as the chair of the Department of English

    since 2008. In my years at BCSM, the passing rate on the Comprehensive English Regents Exam has hovered close to 90%. In the past two years, however, I have noticed a downward trend and have been examining reasons for this so that I can help teachers better

    prepare students.

    While looking at teaching and the test, I have been particularly struck by two points. First, some of the questions have become more

    rigorous and the focus has begun to shift, which makes sense given the new demands of the Common Core Standards and college

    readiness. Second, while the rubrics over the last several years have remained identical, the conversion chart has shifted making it

    more difficult for students to pass. In the last couple of years I have backtested our scores using the June 2011 chart for comparison;

    while the exact amount differs from top to bottom, there is approximately a four-point difference in the conversion charts between

    June 2011 and June 2013. For example, a student who scored a 23 in the multiple-choice and a 7 in the writing received a 79 in 2011

    but a 75 in 2013; a student who scored a 21 on the multiple-choice and a 5 on the writing passed with a 65 in 2011 but failed with a

    60 in 2013. A significant problem is that these changes in the conversion of scores happened with no corresponding change in the

    rubrics. When the standard has not changed but the bar has been raised without prior notice, how are we to demonstrate to teachers and

    students alike the areas for improvement?

    A specific impact of this is the case of one of our students who, when he first took the exam in June 2011, scored a 66. We worked

    with this student to become college ready and reach the benchmark of 75. He sat again in January 2013 and received a 74. This

    student was determined to make it, was prepped, and sat again this June. Despite having a higher raw score this time, a score that

    would have converted to a 76 in January, this student once again fell short and actually received a lower converted score than he

    previously received. To reiterate, this is with a higher raw score. How do we explain to the student that the bar shifted without our

    knowledge even though the standard, as explained in the rubrics, had not?

    I understand the test will soon look different. However, for the sake of the students, the teachers, and the schools, all of whom are

    rated based on these exams, the rubrics and the conversion charts must be aligned and consistent, and both should be made

    available when teachers are preparing students, not at the time of the exam. Without this, the confusion, frustration, and skepticism

    over these standardized tests will grow, and justifiably so.

    Sincerely,

    Stephen Seltzer

    Assistant Principal