A306 Session 8: Digging into Data PART II, Mar. 11, 2008 (transcript).

Page 1:

A306
Session 8: Digging into Data
PART II
Mar. 11, 2008

Page 2:

Plan for Today

4:10-4:20 Data Wise in Action

4:20-5:05 The Emerson School in Action

5:05-5:40 Standards in Practice Protocol (Sarah Fiarman)

5:40-5:50 Break

5:50-6:30 Standards in Practice, continued

6:30-7:00 Team Time, Meet with Teaching Fellow

Page 3:

Today’s Objectives

1. Dig deeper into digging into data and appreciate how central this work is to using data wisely.

2. Get ideas for how to collect and analyze formative data.

3. Experience a protocol for exploring the quality of student assignments.

Page 4:

Spring Plan Challenges

Page 5:

East Boston Early Education Center

Creating individual action plans for children scoring below 10 on the DRA assessment.

Bringing in doctoral students from BC to work 4 hours/week supporting advanced reading in 1st grade, thus allowing teachers to concentrate on emergent readers.

Looking at student work once a month for writing and math.

Page 6:

Emerson

Evaluating the performance of ELL students enrolled in SEI and regular classrooms.

Categorizing the academic performance of individual students to develop interventions to meet the needs of students performing at a variety of levels.

Creating data collection templates and PowerPoint presentation templates to help teachers collect and evaluate student data and provide the data team with a common format for presenting school-wide data to the faculty.

Page 7:

Hennigan

Working with literacy specialists on strategies to improve vocabulary across disciplines.

Creating a knowledge management system to capture the work that has led up to using student data to drive classroom instruction and to benchmark the school's academic performance.

Focusing on culturally sensitive teaching throughout the school.

Page 8:

Sumner

Building capacity of the MLT as a pilot for the rest of the school.

Analyzing MCAS and the 2007 BPS Math Assessment; found students are using increasingly efficient strategies, but also that 1/3 to 1/2 “lost track of their work.”

Looking at BPS Math OR questions to identify what it specifically meant to “lose track of the work”; found that students did not understand the meaning of the numbers in the procedures they used and/or made simple addition mistakes.

Analyzing why this is happening to our students.

Page 9:

Data Wise at the R. W. Emerson

C. Sura O’Mard-Gentle, Principal

Created by Emerson Data Wise Group: Maria Fenwick, Johanna Schaefer, and Marcia Russell

Presented by Emerson Data Team, March 11, 2008

Page 10:

History

March 2007: Data Overview Presentation
In-school PD focused on MCAS, including: overall performance by grade level; content and format of the tests; student performance by item type and content area; deeper analysis of selected test items, including sample student work.

April 2007: Data Analysis Workshop
In-school PD for teachers grades 3-5: how to use MyBPS and DOE websites to access data, test items, and sample student work.

August 2007: MCAS Overview
Summer staff meeting focused on preliminary scores on a grade and classroom level.

Page 11:

Example from March 2007: Grade 4

[Pie chart: percentage of MCAS ELA items by standard. Std 4: 10%; Std 5: 8%; Std 8: 17%; Std 10: 5%; Std 12: 24%; Std 13: 17%; Std 14: 3%; Std 15: 8%; Std 16: 5%; Std 18: 3%.]

Which standards are emphasized the most? The least?

Standard 4: Vocabulary & Concept Development
Standard 5: Structure & Origins of Modern English
Standard 8: Understanding a Text
Standard 12: Fiction
Standard 13: Nonfiction
Standard 15: Style & Language

Page 12:

Example from March 2007: Grade 4

How did students perform by question type?

[Bar chart: standardized average score (y-axis, 0 to 1) by question type (x-axis), comparing multiple choice (MC) and open response (OR).]

Page 13: (repeat of the “History” slide from page 10)

Page 14:

Example from April 2007: Where do I go if I want to…

Find out which of my students were in each category (W, NI, P, A)? myBPS
Find students from my class who were close to passing on last year’s test? myBPS
See how the students I taught last year performed on last year’s test? myBPS
See a graph that shows how my students did on each question? myBPS
Look at my students’ short answer and open response scores? myBPS
Compare how my students scored on individual questions to the district or state? myBPS
Compare Emerson students’ averages on individual questions to the district or state? DOE
Look at an entire Emerson grade’s averages on multiple choice questions? DOE
Look at an entire Emerson grade’s averages on short answer or open response questions? DOE
Find actual questions so I can print them? myBPS or DOE
Make a packet of old questions grouped by a certain subject/category? DOE
Look at scoring rubrics for Open Response questions? DOE
See sample student work in each category for Open Response questions? DOE

Page 15: (repeat of the “History” slide from page 10)

Page 16:

Example from Summer 2007: 2006 and 2007 Comparison

[Bar chart: % of students at each MCAS score level (W, NI, P, A), 2006 vs. 2007. 2006: W 27, NI 57, P 17, A 0; 2007: W 59, NI 26, P 15, A 0.]

Look closely at the W category and the NI category. What do you notice?

Page 17:

This Year

Data Team is a consistent focus at our school: meet twice per month; incorporate Data Wise protocols; focus on the Data Wise process.

Fall 2007 Workshops - Building Assessment Literacy: “ABCs of AYP”; Data Use and Misuse Scenarios.

December 2007 WSIP: established specific targets for improvement; collaborated with ILT, LAT, and MLT to complete goals.

January 2008 Data Calendar: created a 12-month calendar to serve as a timeline for future years.

Page 18:

Example from Fall 2007

Each school’s individual student scores are assigned a point value. The points are then averaged to find the CPI, or Composite Performance Index.

Advanced: 100 points
Proficient: 100 points
High Needs Improvement: 75 points
Low Needs Improvement: 50 points
High Warning: 25 points
Warning: 0 points
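The point mapping above can be turned into a small worked example. The sketch below is only an illustration of the averaging described on this slide, not DESE's official CPI computation, and the four-student roster is invented:

```python
# Illustrative sketch of the CPI arithmetic: each student's MCAS performance
# level maps to a point value, and the CPI is the average of those points.
CPI_POINTS = {
    "Advanced": 100,
    "Proficient": 100,
    "High Needs Improvement": 75,
    "Low Needs Improvement": 50,
    "High Warning": 25,
    "Warning": 0,
}

def cpi(levels):
    """Average the point values for a list of student performance levels."""
    points = [CPI_POINTS[level] for level in levels]
    return sum(points) / len(points)

# Hypothetical four-student roster: two Proficient, one Low NI, one Warning.
print(cpi(["Proficient", "Proficient", "Low Needs Improvement", "Warning"]))  # 62.5
```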

Page 19:

Example from Fall 2007

[Student tracking template: “MCAS Low Needs Improvement.” Columns for Student Name, Grade, Retention?, Subject, Scores, Benchmark Success?, and Name Tests, plus checkboxes for supports: Family Involvement, Before/After School, Small Group Literacy, 10 Boys, Native Language Literacy/ESL, Girls Support Group, MCAS Tutoring, SES Tutoring, Generations/Boston Partners, Individual Student Support, Resource Room.]

Page 20: (repeat of the “This Year” slide from page 17)

Page 21:

R. W. Emerson Data Calendar (DRAFT)

August: *MCAS data available *Conduct MCAS preliminary analysis

September: *GRADE administered *SAT-9 administered *Fall writing prompt *Present preliminary MCAS analysis to staff

October: *Upload MCAS data to TestWiz.net *GRADE data available *Analyze GRADE data *Identify at-risk students *MEPA/MELA-O

November: *Progress Reports for at-risk students *ISSPs *Teachers conduct analysis of individual students using MyBPS *Begin MCAS item analysis *MEPA results from previous year available

December: *Report Cards *Present MCAS item analysis to ILT/staff *ILT determines instructional strategies based on available data

January: *SAT-9 results available *Mid-Year assessments: math, writing *MCAS tutoring begins *ILT monitors and supports implementation of instructional strategies

February: *Student Learning Contracts *Progress Reports *FLEPs must be identified

March: *MCAS ELA administered *Report Cards *MEPA administered

April: *Data Team chooses one template to create or improve and/or a targeted question to analyze using available data

May: *MCAS math, science, social studies administered

June: *End-of-year assessments: math, writing *Report Cards *Teachers complete cumulative folders

July: *Mentally prepare for upcoming eleven months of never-ending work

Page 22:

Currently

Focusing on ELL student performance
Our school is roughly 50% SEI (Cape Verdean Creole) classrooms, 50% monolingual regular ed.
ELL students without designations are also found in monolingual classrooms.
At least 20 SIFE (Students with Interrupted Formal Schooling) students.
Transient classrooms (over 20% transience rate).

Created templates for teachers to track and analyze their data
Teachers have had Data Binders since 2006.
Needed a way to formalize school-based assessment data.
Templates include question packets that prompt teachers to analyze their own data.

Page 23:

Challenges

Time: the perpetual obstacle
Using Professional Leadership Project funds, 3 teachers work together on data collection and analysis for 2.5 hours per week.
Those teachers can visit other staff members during planning blocks.
A larger group of teachers dedicates two hours per month to Data Team meetings before school.
Creative use of student teachers and substitutes has allowed us to host in-school workshops.

Technology
All templates are currently available in both electronic and paper-and-pencil format.
Hope to inspire people to use the electronic versions, with conditional formatting and new laptops.

Accountability
Teachers are supported by the Data Team and are asked to turn their data in to the Data Team.

Page 24:

ELL/SEI Focus

Identified achievement gap between SEI and monolingual regular ed classrooms.

Understandings
When newcomers from a non-English-speaking background and/or SIFE students take the MCAS, we expect a lag in scores, at least initially.
Current district assessments for newcomers have not been adequate.

Actions
Restructured our SEI program.
Reallocating resources to meet the needs of our students.
The classification of “Warning” is not helpful on its own - identified high- and low-scoring students within Warning and Needs Improvement.
Look beyond just MCAS data to find progress - use mid-year assessments as a more meaningful measure.

Page 25:

Data Templates

Reading Open Response Assessment
Based on a Harcourt Trophies story.
Scored using the Emerson Open Response Rubric (4-point scale).

Writing Prompt
Based on genre, depending on grade level.
Scored using the 6 Traits Rubric adapted by Emerson teachers.

Math
BPS district-wide Mid-Year Assessment.

Science Mid-Year Assessment
Created by Emerson science teachers.
Based on old MCAS items and teacher-created items.
Format and scoring similar to the math assessment.

Page 26:

Mid-Year Open Response Template

Mid-Year Reading Open Response Analysis
Name: Maria Fenwick   Grade: 4   Date: 2/21/08   Number of Students: 20

Student  Score  Weaknesses                          Strengths
JS       2      Clarity                             Evidence; Organization
JF       2      Quality of evidence                 Organization; Connecting
TA       2      Clarity; Quality of writing         Evidence; some organization
Ddo      3      Clear connection to question        Organization; Quality evidence
DL       3      Quality of evidence                 Clarity; Organization; Connection
VS       2      Organization; Clarity               Used evidence
KE       2      Lack of details                     Evidence; Organization
TB       2      Clarity; Organization               Evidence; attempt to connect
JG       3      Quality of evidence                 Organization; Clarity; Citations
AM       2      Clarity; Quality of evidence        Organization; attempt to connect
Dde      2      Organization; level of explanation  Quality evidence
RL       1      Effort; No explanation              Some evidence
JA       1      Did not answer question             Attempted to summarize
AH       2      Only 2 examples                     Details; Quality evidence
VN       3      Quality of evidence                 Organization; Clarity; Citations
JH       3      Introduction                        Organization; Connecting
EF       3      Clear connection to question        Organization; Evidence; Citations
JB, DT, NB: scores not yet entered

Class Average: 2.2

Assessment Details:
Open Response Essay data is collected based on a standardized response question from a Harcourt Trophies anthology story.
Essays are scored based on an Emerson School rubric with 4 points. “Strength” and “Weakness” boxes are for teacher input.

Conditional Formatting and Formulas:
Automatically highlights by score; finds overall class average.

Page 27:

Mid-Year Writing Prompt Template

Name: Maria Fenwick   Grade: 4   Date: 2/21/08   Number of Students: 20

Student  Ideas  Organization  Voice  Word Choice  Sentence Fluency  Conventions  Overall Score
JS       1      2             2      1            1                 2            1.5
JF       2      2             2      3            2                 2            2.2
TA       (not yet scored)
Ddo      3      2             3      3            3                 3            2.8
DL       2      2             3      3            2                 2            2.3
VS       2      2             2      3            2                 1            2.0
KE       2      2             2      2            2                 2            2.0
TB       2      1             2      2            2                 2            1.8
JG       2      3             3      3            3                 2            2.7
AM       2      2             2      2            2                 2            2.0
Dde      (not yet scored)
RL       1      1             1      1            1                 2            1.2
JA       1      1             1      2            1                 1            1.2
AH       3      3             3      3            3                 3            3.0
VN       2      2             3      3            3                 2            2.5
JH       3      3             3      3            3                 3            3.0
EF       (not yet scored)
JB       2      1             2      2            2                 1            1.7
DT       3      2             3      3            3                 3            2.8
NB       2      2             2      3            1                 2            2.0

Average: Ideas 2.1; Organization 1.9; Voice 2.3; Word Choice 2.5; Sentence Fluency 2.1; Conventions 2.1; Overall 2.2

(Unscored rows display #DIV/0! in the spreadsheet until scores are entered.)

Mid-Year Writing Prompt Analysis (Grade 4)

Assessment Details:
The writing prompt is a standardized prompt given to students three times per year. In fourth grade, students write a personal narrative story.
Students are scored using a version of the 6 Traits rubric that was adapted by Emerson teachers.

Conditional Formatting and Formulas:
Automatically highlights by score; finds student averages and class averages by trait.
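The averaging these spreadsheet formulas automate can be sketched as follows. This is a hypothetical re-implementation, not the Emerson spreadsheet itself: each student's overall score is the mean of the six trait scores, blank rows are skipped instead of producing #DIV/0! errors, and the class average is taken over the scored rows.

```python
# Sketch of the template's formulas: overall score = mean of the six
# 1-4 trait scores; unscored (blank) rows are skipped rather than
# triggering a #DIV/0!-style division by zero.
def overall_score(traits):
    """Mean of the trait scores, rounded to 1 decimal; None for a blank row."""
    scores = [s for s in traits if s is not None]
    if not scores:
        return None  # absent student: the spreadsheet shows #DIV/0! here
    return round(sum(scores) / len(scores), 1)

def class_average(rows):
    """Average of the non-blank overall scores across the class."""
    overalls = [o for o in (overall_score(r) for r in rows) if o is not None]
    return round(sum(overalls) / len(overalls), 1)

# Student JS from the slide: Ideas 1, Organization 2, Voice 2,
# Word Choice 1, Sentence Fluency 1, Conventions 2.
print(overall_score([1, 2, 2, 1, 1, 2]))  # 1.5
```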

Page 28:

Math Mid-Year Assessment Template

[Item-analysis spreadsheet: Name: Maria Fenwick, Grade 4, 2/22/08, 20 students. One row per student with a 0/1 score for each multiple choice (MC) and short answer (SA) item (questions 1-17), scores for the two open response (OR) items (questions 18-19), and Total Points / Total Score columns. Per-item totals correct: 9, 8, 17, 19, 14, 9, 14, 18, 12, 15, 13, 15, 17, 12, 12, 14, 9; per-item % correct: 45%, 40%, 85%, 95%, 70%, 45%, 70%, 90%, 60%, 75%, 65%, 75%, 85%, 60%, 60%, 70%, 45%. Class averages: 15.3 total points, 1.85 total score. Each item is also tagged with a content category (NS, SP, PR, GE, ME). A second, blank copy of the template (14 SA/OR items) sits alongside for another assessment.]

Assessment Details:
Based on the BPS district-wide math mid-year assessment.

Conditional Formatting and Formulas:
Multiple Choice & Short Answer Item Analysis: automatically highlights correct answers; finds total correct and percentage of students answering correctly.
Open Response: automatically highlights by score - Red = 1, Orange = 2, Green = 3 and 4; finds class averages.
Total Points/Total Score: automatically highlights point values in yellow if they are within 2 points of the next score; finds class averages.
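The multiple choice/short answer item analysis the template performs amounts to column sums and percentages. A minimal sketch, with invented toy scores (hypothetical code, not the spreadsheet's actual formulas):

```python
# Sketch of the MC/SA item analysis: each row is one student's 0/1 scores
# per item; the template totals each column and reports the percentage of
# students answering that item correctly.
def item_analysis(score_rows):
    """Return (total correct, % correct) per item for rows of 0/1 scores."""
    n_students = len(score_rows)
    n_items = len(score_rows[0])
    totals = [sum(row[i] for row in score_rows) for i in range(n_items)]
    percents = [round(100 * t / n_students) for t in totals]
    return totals, percents

# Toy class of four students on three items (invented data).
totals, percents = item_analysis([
    [1, 0, 1],
    [1, 1, 1],
    [0, 0, 1],
    [1, 0, 1],
])
print(totals)    # [3, 1, 4]
print(percents)  # [75, 25, 100]
```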

Page 29:

Future Goals

Increase participation in classroom-level data analysis.

School-wide Data Board
Large-scale display of student data similar to the Gardner Pilot presentation.
Use GRADE data, possibly the mid-year Math assessment.

Support SEI teachers to identify student progress
Look at MEPA.
Find a way to track progress throughout the SEI program.
Create a school-based system for assessing newcomers.

Connect Data Team and ILT through Data Summary sheets created from school-wide mid-year data collection.

Stay on target for MCAS prep using the 12-month calendar.

Page 30:

Team Time

Emerson School: TAKE-AWAYS
Standards in Practice: TAKE-AWAYS
Hillsborough (DWIA Ch. 4): TAKE-AWAYS

How will we integrate these ideas in our school?

Page 31:

Assignments for April 8

Continue working on Spring Plan.
Do the Standards in Practice protocol.
Read Data Wise in Action ch. 5.