Using Data to Improve Student Achievement & to Close the Achievement Gap
Tips & Tools for Data Analysis
Spring 2007

Factors that shape each student: attitudes, background knowledge, experiences, beliefs, socioeconomic status, living situation/family structure/family size, mobility, special needs, transportation, before/after-school activities/duties, language fluency, prior success/failure, and physical, mental, & social health.

Looking at the BIG Picture

4-Step DDDM (Data-Driven Decision-Making) Process

Multiple Measures
- Demographics: enrollment, attendance, drop-out rate, ethnicity, gender, grade level
- Perceptions: perceptions of the learning environment, values & beliefs, attitudes, observations
- Student Learning: standardized tests (NRT/CRT), teacher observations of abilities, authentic assessments
- School Processes: descriptions of school programs & processes

Criterion-Referenced Data: What's required?
- Proficiency percentages for the combined population & identifiable subgroups by test year (for the latest 3 years)
- Analysis of the test by:
  - passage type & type of response for literacy
  - writing domain & multiple choice for literacy
  - strand & type of response for math
  in order to identify trends and draw conclusions based on results over the 3-year period

Norm-Referenced Data: What's required?
- National percentile rank & standard score for the combined population & identifiable subgroups by test year
- Analysis of the test by content subskill & skill cluster
  in order to identify trends, measure growth, and draw conclusions based on results over the 2-year period

Disaggregated Data Tools: CRT
- ACSIP Template: number and % of students non-proficient/proficient for combined and subgroup populations
- ACSIP Strand Performance Report: combined and subgroup performance averages by test, passage type/domain/strand, & type of response
- Data Analysis Set: Data Summary Report, a series of blank table templates:
  - Math Benchmark Results, combined population: one row per grade level plus EOC Algebra and EOC Geometry, with columns # (actual number of students), NP (percentage of non-proficient students), and P (percentage of proficient & advanced students)
  - 4th Grade Math Benchmark Results, subgroups: the same #/NP/P columns for African-American, Caucasian, Hispanic, Special Services, Economically Disadvantaged, and ELL students
  - Literacy Benchmark Results for the combined population and for Caucasian, Special Services, and Economically Disadvantaged students, broken out by passage type (Literary, Content, Practical) and type of response (M/C = multiple choice, O/R = open response)

Disaggregated Data Tools: NRT (ITBS)
- ACSIP Report: number & % of students performing above the 50th percentile on each test and content subskill for combined & subgroup populations
- Performance Profile: standard score & NPR on each test and content subskill for the combined population
- School Coded Summary: standard score & NPR on each test for subgroup populations
- Data Analysis Set: NRT Growth & Assessment
- SS (Standard Scores): show relative development over time

SLE Analysis
A worksheet lists each SLE with its item number(s) and the percentage of students answering correctly (sample total: 83.7). To compute the total, sum the percentage-correct column & divide by 300 (in this case).

SLE Analysis: Suggested Actions (Source: Learning 24/7)
- 0-34% meeting standard: Align curriculum & classroom instruction; the curriculum has not been taught or does not exist; an indication that instruction is textbook-driven.
- 35-49%: Coordinate curriculum objectives across grade levels & subject areas, making sure all objectives are taught (horizontal/vertical alignment).
- 50-69%: Implement high-yield instructional strategies in all classrooms; there is probably a high percentage of lecture, whole-group, & direct teaching.
- 70-84%: Spend more quality time on instructional strategies to yield greater results; check learning minutes in the schedule & the nature of the tasks on which students spend their time.
- 85-100%: Provide aligned enrichment; add depth & breadth; review pacing; reteach for mastery; be sure distributed practice is occurring.

Digging Deeper: CRT Item Analysis
Four lenses: Content Standard, Language of the Question, Level of Questioning, Distracters.

Content Standard
- What is it that the student must know or be able to do?
- When is this introduced in the curriculum? How is it paced?
- Is it a "power standard"?
- What instructional strategies are used to help students master this standard?
- Have I given students the tools (e.g., calculator skills, writing tips, test-taking skills) necessary to respond appropriately?
- Can this standard easily be integrated into other curricular areas?

Language of the Question
- How is the question worded on the test?
- Are there vocabulary words used that may hinder comprehension?
- Do I teach and test using the same language?
- Do I have word/learning walls in my content area to support this standard and related vocabulary?

Level of Questioning
- According to Bloom's taxonomy, what is the level of questioning used to measure mastery of the standard?
- Highlight the verb(s) in the question. Do I use those same verbs in my teaching and testing?
- Have I taught key or "clue" words that will help students understand what is being asked of them?
- Is the question multi-layered?

Distracters
- Are there items that distract the student from identifying what is being asked, or items that may confuse the student as he/she makes an answer choice?
  - Labels
  - Additional information
  - Multi-layered tasks
  - Conversions

Sample item. SLE Correlation: NPO 1.3 (prior to 2004 revisions), which states "Apply and master counting, grouping, place value, and estimation."
Item Analysis:
- What must the student know or be able to do? (Content Standard)
- How is the question worded on the test? (Language of the Question)
- According to Bloom's, what is the level of questioning used to measure mastery of the standard? (Level of Questioning)
- Are there items that distract the student from identifying what is being asked, or that may confuse the student as he/she makes an answer choice? (Distracters)

Digging Deeper: NRT Item Analysis
Building Item Analysis:
- Identify items with a negative value of 10 or more, as indicated by the bar falling to the left of the 0 mark.
- Analyze the results of all related items.

Peeling the Data: Levels of Looking at Data
- District
- K-12 feeder patterns
- School levels
- Grade level
- Programs & tracks
- Classroom/teacher
- Student

"Data analysis should not be about just gathering data. It is very easy to get 'analysis paralysis' by spending time pulling data together and not spending time using the data." (Bernhardt, 2004, p. 19)

Peeling the Data: Questions to Ask
- Are there any patterns by racial/ethnic groups? By gender? By other identifiers?
- What groups are doing well? What groups are behind? What groups are on target? Ahead?
- What access and equity issues are raised?
- Do the data surprise you, or do they confirm your perceptions?
- How might some school or classroom practices contribute to successes and failures? For which groups of students?
- How do we continue doing what's working and address what's not working for students?

Peeling the Data: Dialogue to Have
- How is student performance described (by medians, quartiles, levels of proficiency, etc.)?
- How are different groups performing? Which groups are meeting the targeted goals?
- What don't the data tell you? What other data do you need?
- What groups might we need to talk to (students, teachers)?
- What are the implications for:
  - developing or revising policies
  - revising practices and strategies
  - reading literature
  - visiting other schools
  - revising, eliminating, or adding programs
  - dialogues with experts
  - professional development goal setting and monitoring progress
- How do we share and present the data to various audiences?

Sample Questions from a School's Data Team
- Are there patterns of achievement based on Benchmark scores within subgroups?
- Are there patterns of placement in special programs by ethnicity, gender, etc.?
- What trends do we see with students who entered our school early in their education vs. later?
- Is there a relationship between the number of years at our school and our Benchmark scores?
- Is there a relationship between attendance/tardiness and achievement?
- How do students who have been retained do later?
- How do our elementary students do in middle school?
- Do findings in our NRT results support findings in our CRT results?
- Can our findings be directly linked to curriculum? Instruction? Assessment?
- What are our next steps?
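Many of these questions reduce to simple disaggregation and comparison once assessment records are in electronic form. As a minimal sketch only (the record layout and field names subgroup, attendance_rate, and proficient are hypothetical, not part of any ACSIP tool), a data team could tabulate proficiency rates by subgroup and compare average attendance for proficient vs. non-proficient students:

```python
# Minimal sketch of two data-team tabulations on a toy set of Benchmark records.
# The record layout here is hypothetical; a real extract would come from the
# district's student information system or assessment vendor.

def proficiency_by_subgroup(records):
    """Return {subgroup: (number of students, % proficient or advanced)}."""
    counts = {}
    for r in records:
        n, p = counts.get(r["subgroup"], (0, 0))
        counts[r["subgroup"]] = (n + 1, p + (1 if r["proficient"] else 0))
    return {g: (n, round(100.0 * p / n, 1)) for g, (n, p) in counts.items()}

def mean_attendance_by_outcome(records):
    """Compare average attendance rates of proficient vs. non-proficient students."""
    groups = {True: [], False: []}
    for r in records:
        groups[r["proficient"]].append(r["attendance_rate"])
    return {("proficient" if k else "non-proficient"): round(sum(v) / len(v), 3)
            for k, v in groups.items() if v}

# Toy records standing in for one grade's results.
records = [
    {"subgroup": "ELL", "attendance_rate": 0.90, "proficient": False},
    {"subgroup": "ELL", "attendance_rate": 0.97, "proficient": True},
    {"subgroup": "Special Services", "attendance_rate": 0.95, "proficient": True},
    {"subgroup": "Special Services", "attendance_rate": 0.88, "proficient": False},
]

print(proficiency_by_subgroup(records))
# e.g. {'ELL': (2, 50.0), 'Special Services': (2, 50.0)}
print(mean_attendance_by_outcome(records))
# e.g. {'non-proficient': 0.89, 'proficient': 0.96}
```

The first tabulation mirrors the #/NP/P columns of the ACSIP template described earlier; the same counting can of course be done in a spreadsheet pivot table.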
Making It Personal for Teachers
Teachers can use their own data to:
- identify the strengths of their own students
- identify the challenges of their own students
- identify common misconceptions & error patterns
- identify their own successful teaching methods
- pinpoint areas needing professional development

Making It Personal for Students
Students can use their own data to:
- reflect on their own knowledge & test-taking strategies
- reflect on their strengths & weaknesses
- set goals for improvement

Necessary Variables for Data-Driven Decision-Making
- Cultural change
- ACSIP planning/funding sources
- Know-how
- Time
- Want-to
- Leadership
Together, these variables lead to SUCCESS.

Candie Watts
Arch Ford Education Service Cooperative