
Improving Student Performance in the U.S.: ISLIP

John W. Wick

President, WickPartners, Inc.
Professor (Emeritus), Northwestern University

ISLIP stands for Individual Student Longitudinal Improvement Plan. Sort of an educationese title, but nevertheless succinctly descriptive. ISLIP is NOT a “same-old, same-old” approach to improving student performance. Over the years of trial-and-error development, each of the seven insights (“perspectives” or “methodologies”) developed below was, most properly stated, “stumbled upon.” The first two were learned during a four-year university leave to serve as the first (and only) Director of Research and Evaluation for the Chicago Public Schools.

First: Testing’s most important use should be individual diagnosis to cause student performance improvement. Start with the individual and work up, not at the mean and work down.

My Chicago Public Schools department was in charge of testing. At that time, the only use of standardized testing was basically to sort, compare, reward, and punish, mostly based on defined-group average scores. Statistics was my training, but psychometrics was my specialty. That psychometric perspective sees individual student growth as the cause of improved group averages. Starting with the mean and working down ignores too many individuals.

Second: Individual student performance improvement’s primary cause is effort, not ability. “High performer” happens to a student who presses for higher achievement every single day.

Unfortunately, many people in our country implicitly, deep in their hearts, believe that smart kids perform well and anyone who performs poorly is therefore dumb. Teachers are just there to hand out the papers. Under this false belief, a student working his or her way upward from percentile rank 40 in grade 3 to percentile rank 85 in grade 8 seems improbable.

Forget “smart” and “dumb.” Think “high performer” and “low performer.” Nothing dramatic; just a relentless, constant push which, little by little, brings the student to his or her highest possible academic performance level.

Third: Getting that “relentless, constant push” needs a much better motivator than just saying, “Please.” As early as third or fourth grade, the student’s projected performance must be connected to long-term decisions which profoundly impact the student’s high school, college, and adult-life success.


Back in August of 1981, the National Commission on Excellence in Education stated a goal which connected “… all children by virtue of their own efforts…” to ”… secure gainful employment.” In a sequence of national efforts to make that happen, a lot of effort and money went into No Child Left Behind. The good news: during that period, many, many students did attain higher performance levels. The bad news: the results from that eighteen-year period do not suggest our nation’s graduates have gained much in international comparisons. Most of the gains seen from 1995 to 2012 were at the lower grades and lower performance levels. Performance at the upper levels and by grade 8 was stagnant or down. Why did that happen? Because a rigorous long-term goal, reaching into adulthood, was not set. States were allowed to set their own standards and, for most states, those were at minimum-competence levels. See “Student Performance: 1995-2012” posted at wickpartners.com for elaboration.

To address this, the ISLIP model begins with an already-available set of widely accepted tests with up-to-date, properly collected national norms. The test used has been empirically documented as an accurate predictor of the measures schools use for high school placement and colleges use for acceptance. In other words, third grade performance must be empirically connected to post-secondary school success. That is what No Child Left Behind ignored.

Fourth: Using a right-answer analysis of test responses to improve student performance will not work. Jettisoning that right-answer approach for a wrong-answer approach focuses directly on the student getting fewer items wrong and thereby earning a higher score and a brighter future.

“Right-answer” and “wrong-answer” sound like double talk, but a very clear and important distinction is made below.

A reporter once asked Mayor Daley of Chicago, “How come you lost that election?” to which the mayor replied, straight-faced, “Because we did not get as many votes.” In testing, a parallel question, “Why did your son Harold get such a low score?” would have the parallel answer, “Because he got too many answers wrong.”

The second answer is not so silly. Most publishers’ and state agencies’ test feedback information concentrates on what the student did not get right instead of what the student did get wrong. Those are not equivalent; here are two examples of the difference.

262 - 158 =

A. 116   B. 104   C. 114   D. 420

A. C03. One digit easily identified as incorrect (the 1’s digit must be a 4).
B. Correct.
C. E02. Error could have been avoided with an estimate (260 - 160 = 100).
D. C04. Wrong operator. The person choosing this added instead of subtracting.


If the student gets this item wrong:

One current right-answer approach codes answers into these four categories: Compute with (1) whole numbers, (2) fractions, or (3) decimals plus (4) Algebraic manipulations. This item fits only “Compute with whole numbers.” The right-answer error feedback to the student: “Of the ten items in this category, six answers (60%) were correct.” How in the world can teacher, student, or parent use that information to help this student improve? The domain “Compute with whole numbers” is huge! Diagnostically, the feedback is almost totally useless if the goal is to guide this student toward reducing errors next time around.

For the wrong-answer analysis, each of the three wrong answers has a distinctly different cause: “one digit could easily be identified as incorrect,” “a simple estimate would have avoided this error,” and “used the wrong operator.” In the ISLIP approach, each mathematics error is coded into one of twenty-six distinct error categories. Each is a well-defined, specific error category begging for direct remediation, well within the reach of student, parent, or teacher. These are not broad and diagnostically useless. They are direct and useable.
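The distinction can be made concrete in code. Below is a minimal sketch of a per-item lookup from each chosen distractor to its error category, using the subtraction item above. The function name and dictionary structure are illustrative assumptions, not the ISLIP implementation.

```python
# Sketch: map each wrong answer for "262 - 158 = ?" to an error category.
# The codes (C03, E02, C04) follow the example above; this is illustrative
# only, not the actual ISLIP coding software.

ERROR_CODE = {
    116: "C03",  # one digit wrong: the 1's digit must be a 4
    114: "E02",  # avoidable with a simple estimate (260 - 160 = 100)
    420: "C04",  # wrong operator: the student added instead of subtracting
}

def code_response(answer: int, correct: int = 104) -> str:
    """Return 'Correct' or the error category for the chosen wrong answer."""
    if answer == correct:
        return "Correct"
    return ERROR_CODE.get(answer, "Uncoded")
```

Coding the chosen distractor, rather than merely marking the item wrong, is what makes the feedback remediable.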

A second example is from a language test.

Choose the best wording:

A. In the Army John enlisted when he was eighteen years old.
B. John enlisted in the Army. Just eighteen years old!
C. When he was eighteen years old, John is enlisting in the Army.
D. When he was eighteen years old, John enlisted in the Army.

A. E06 Sentence expression errors: clumsy wording
B. E11 Error chosen would lead to a sentence fragment or run-on sentence
C. E24 Verb errors: maintaining same or correct tense
D. Correct

The wrong-answer analysis coded this item into one of twenty-seven distinct and specifically defined error categories in the domain of English-Language Arts. The error report defines each student’s top six error categories. Each of the three possible wrong answers has a different cause and would require a specific but different remediation approach.

That same right-answer domain system shown for the computation problem would code any wrong written-expression answer into these four broad domains: Usage and Grammar, Sentence Structure, Planning and Organization, and Appropriate Expression. The output provides a “percent correct nationally” value, indicating that the domain, of those four, was assigned in advance, based on the right answer. The domain would probably have been “Appropriate Expression.” But wrong answer A would be Planning and Organization, wrong answer B would be


Sentence Structure, wrong answer C would be Usage and Grammar. If the student gets this item RIGHT, the coding is RIGHT. If the student gets the item WRONG, the coding is WRONG! The right-answer analysis ignored the reason the student got the question wrong! When the right answer report says, “Your lowest domain was Appropriate Expression,” that, once again, does not provide any useable diagnostic feedback directing student, parent, or teacher to correct the errors.

Fifth: Individual error patterns are personally idiosyncratic. Improvement based on correcting errors needs to be addressed one student at a time.

Why?

The school year is fixed. The number of minutes of instruction is fixed. The curriculum is fixed. Each unit is taught within a fixed time boundary. At the end of that unit, usually after a final unit test, the teacher moves on. At that time, most students will not have mastered all the learning expectations. That which is not learned varies from student to student. The “error baggage” from that unit accumulates with the personal error baggage from former units. Future units will add to that load of unattained learning expectations for that individual.

The table below shows how widely different the error patterns are for students at the same performance level. Just over 250 fifth graders from three widely separated school districts took the same standardized math test. Of those students, eighteen had exactly the same raw score and percentile rank (percentile rank 73) on the Math test.

The six highest error categories for those eighteen students, all at exactly the same performance level, used up 22 of 26 possible error categories available for Math. Students at the same performance level do NOT have the same error patterns.

Error D04 (statistics) is a common error, appearing in the top six for 16 of the 18 students (89%), but it has only the third-highest average rank. C01 and C05, in the top six for five and nine students respectively, have among the highest average ranks. Were a teacher to concentrate on the three highest-ranked categories, the D04 remediation would be unnecessary for 11% of these students, the C01 remediation unnecessary for 72%, and the C05 remediation unnecessary for 50%.

Look at the tremendous variation. The idea that performance level and error patterns are closely related is not correct. Each student has her or his own personal and idiosyncratic pattern.


Error Category                      Frequency (of 18)   Avg. Rank
E02 Avoidable add/subtract error            1              1.0
E03 Avoidable mult./divide error            1              2.0
D04 Statistics-probability-data            16              2.1
C01 Carry-borrow-rename                     4              2.3
E05 Avoidable fraction error               10              2.5
C05 Basic fraction operations               8              2.6
R04 Relative value errors                   2              3.0
C03 1st digit or dec. place wrg.            9              3.3
C06 Error long division process             9              3.3
R01 Measurement-unit conversion             6              3.8
D02 Direct-read from display                3              4.0
R02 Precision of measurement                2              4.0
D01 Error format or directions              6              4.2
C07 Random choice appearance                4              4.3
R03 Alternative represent. nbr.             2              4.5
G02 Geometry real-world appli.              8              4.9
A03 Ratio and proportion pbls.              5              5.0
E01 Number facts 0 to 10                    2              5.0
A02 Words to number sentence                2              5.0
G01 Spatial visualization error             1              5.0
A01 Basic pre-algebra skills                6              5.3
A04 Pre-algebra past basic                  1              6.0

The table below shows the actual error patterns for these eighteen students. At the risk of being repetitious, remember that these eighteen are all at exactly the same performance level.

In the table below, one student’s highest error category was E02, Avoidable add/subtract error. That is the only student at this performance level who even had E02 in his or her top six. One student had E03 as his or her second-highest error category, although no other student at this performance level did.

No two students have the same pattern. In fact, no two students even have the same top-six list! Among the eighteen students, eight different error categories appear as the personal highest error category.

You see, no matter how much the teacher or the principal wants to treat student remediation as a group activity, it will not be effective. Throwing away the individual reports and concentrating only on the summary of top six errors for the group will miss just about one-half of the required error correction.


          -------- Error categories --------   ----- Rank of each category -----
Student   Most   2nd   3rd   4th   5th   6th   Most   2nd   3rd   4th   5th   6th

STU01 D04 E05 C01 C03 C05 C06 1 4 2 11 3 5

STU02 D04 R01 R04 A01 A03 A04 1 19 6 22 20 7

STU03 C06 E03 E05 G02 E01 C03 5 8 4 21 13 11

STU04 D04 E05 C05 C07 D02 G02 1 4 3 18 12 21

STU05 E05 C03 C05 C06 E01 D04 4 11 3 5 13 1

STU06 D04 C01 R04 C03 G02 A01 1 2 6 11 21 22

STU07 E05 A03 C06 C07 D01 G02 4 20 5 18 17 21

STU08 D04 D02 C01 C03 R01 C07 1 12 2 11 19 18

STU09 C05 D04 C06 D01 R01 A01 3 1 5 17 19 22

STU10 C03 D04 R01 A01 A02 A03 11 1 19 22 14 20

STU11 C05 D04 G02 D01 D02 E05 3 1 21 17 12 4

STU12 C06 D04 R02 R03 A02 A03 5 1 15 16 14 20

STU13 E02 C05 D04 R01 E05 D01 9 3 1 19 4 17

STU14 D04 E05 C03 G02 R02 A03 1 4 11 21 15 20

STU15 D04 D01 C03 C06 G01 G02 1 17 11 5 10 21

STU16 E05 D04 C06 R01 R03 A01 4 1 5 19 16 22

STU17 D04 C05 C07 D01 G02 A01 1 3 18 17 21 22

STU18 C01 E05 C03 C05 C06 D04 2 4 11 3 5 1
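The table’s claims can be checked mechanically. Below is a minimal sketch that tallies the eighteen top-six lists; the data literals are copied from the table above, while the variable names are my own illustrative choices.

```python
from collections import Counter

# Each student's top six error categories, in order, copied from the table.
TOP_SIX = [
    ("D04", "E05", "C01", "C03", "C05", "C06"),  # STU01
    ("D04", "R01", "R04", "A01", "A03", "A04"),  # STU02
    ("C06", "E03", "E05", "G02", "E01", "C03"),  # STU03
    ("D04", "E05", "C05", "C07", "D02", "G02"),  # STU04
    ("E05", "C03", "C05", "C06", "E01", "D04"),  # STU05
    ("D04", "C01", "R04", "C03", "G02", "A01"),  # STU06
    ("E05", "A03", "C06", "C07", "D01", "G02"),  # STU07
    ("D04", "D02", "C01", "C03", "R01", "C07"),  # STU08
    ("C05", "D04", "C06", "D01", "R01", "A01"),  # STU09
    ("C03", "D04", "R01", "A01", "A02", "A03"),  # STU10
    ("C05", "D04", "G02", "D01", "D02", "E05"),  # STU11
    ("C06", "D04", "R02", "R03", "A02", "A03"),  # STU12
    ("E02", "C05", "D04", "R01", "E05", "D01"),  # STU13
    ("D04", "E05", "C03", "G02", "R02", "A03"),  # STU14
    ("D04", "D01", "C03", "C06", "G01", "G02"),  # STU15
    ("E05", "D04", "C06", "R01", "R03", "A01"),  # STU16
    ("D04", "C05", "C07", "D01", "G02", "A01"),  # STU17
    ("C01", "E05", "C03", "C05", "C06", "D04"),  # STU18
]

# How many of the 18 students have D04 somewhere in their top six?
d04_students = sum("D04" in six for six in TOP_SIX)

# How many distinct ordered top-six patterns appear?
distinct_patterns = len(set(TOP_SIX))

# How many of the 26 Math error categories are used across all top-six lists?
categories_used = len(Counter(c for six in TOP_SIX for c in six))
```

Running the tally reproduces the text’s figures: D04 appears for 16 of 18 students, all eighteen ordered patterns are distinct, and 22 of the 26 categories are used.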

Sixth: Even when provided with two very specific lists (English-Language Arts and Mathematics) of each student’s errors, most teachers will not willingly and of their own volition accept the challenge of helping each student correct those errors.

After developing the error analysis approach, the student performance improvement task seemed like it had been solved. Wrong.

No teacher ever said, “No. I have no interest in helping my students perform at a higher level.” No. No one ever said that. Instead, the responses were like this:


“Which part of the curriculum should I no longer cover to take up this new expectation?”

“This is putting much too much emphasis on test scores.”

“Constantly pressing students toward higher achievement will lead to a lot of stress.”

“The parents will not want this much pressure put on test scores.”

“High school and college are the places for performance stress. Elementary and middle school should have less performance pressure as social and personal skills are developed.”

“Come on! Let the kids have fun. They can make their own decisions when they get to high school.”

And on and on. Never did anyone directly say, “I don’t want to take the time to help students.” More acceptable-sounding rationales worked just as well.

As an educator, a former high school teacher, a longtime college professor, married to a teacher, and with a significant list of teacher friends, I found that hard to accept. With this background, it took ten years to accept that teacher cooperation was the bottleneck. Finally, this realization hit: before this valuable error information will ever have an opportunity to be used by the student and parent, one of two things must occur:

A method must be developed to go around the teachers, feeding this error information directly to the student or the student’s parents;

or: The teachers must be coerced to participate, because participation is the only way they can receive a positive evaluation.

The ISLIP model has done the first, developing on-line tutorials for students, their parents or a tutor to use. The tutorials are free. Teachers, of course, can use them as well—for free! The student or parent can open the tutorial and find the section on each of her or his top error categories. The tutorial concentrates on why these mistakes are made and provides example tests to ensure the errors were corrected.

The second approach uses “coerce,” a very strong word. The teachers do have a long-standing defense woven into the very fabric of schooling. That fabric includes these four levels:

First, a school needs a curriculum. A curriculum is a school plan to guide the learning in the school, representing the body of knowledge to be transmitted.

Next comes instruction. The curriculum needs to be transmitted to the minds and behaviors of the students. This is where the teacher comes in. The curriculum is sequential. The receiving teacher expects the sending teacher to finish the entire curriculum for that year. In all my years of working with schools and school systems, that one teacher evaluation issue is rarely spoken of but very real: finish the curriculum for your grade!


The third level is learning. Teacher lesson plans usually call for units. Students learn and are evaluated one unit at a time. Students receive results written on tests, posted as lists, or in report cards.

The fourth level of schooling is “Show ’em you know!” tests. A high percent of teachers, parents, and administrators do not include this fourth (and perhaps most important) level in the schooling process. By not doing so, they do the students a serious disservice. The “Show ’em you know!” tests are the connection between the learning level and the student’s future. They are the predictors of high school placement and college acceptability.

The teacher push-back regarding helping students correct errors happens because “finish teaching the entire curriculum this year” trumps “raise individual student performance levels” in the minds of teachers. Why? Remember, the number of days in the school year is fixed. The curriculum is fixed. If “use the error analysis to raise student performance levels” needs attention, which one, the teacher asks, curriculum or school year, is going to give?

The only way that will be overcome is to make “raise individual student performance levels” a more important part of the teacher evaluation process than “finish teaching the entire curriculum this year.”

For a while, the Common Core State Standards seemed headed toward solidly connecting individual student growth and teacher evaluation. But, like the bumbling response states made to No Child Left Behind, which consumed huge amounts of money and time with seriously mixed and flawed results, one can predict a lot of busy work, a lot of committees, and a lot of expense, with the usual absence of any clearly defined, long-term, measurable outcomes and of a reward system that stresses improvement and excellence rather than the status quo. Once again, the response will probably punish schools with the audacity to have been built in low-income districts and reward those built amid the homes of the wealthy.

Coercion is needed because school rewards should be based on the number of individuals whose scores are the same or up from the prior year. That way, each student sets his or her own baseline. One year later, “success” is defined by the percent staying the same or gaining, with special rewards for increased numbers attaining a defined excellence level. But that probably will not happen.

This long student performance improvement process ends with one final insight.

Before that, here is the information process guiding each student to an ever higher performance level. These reports are the heart of the ISLIP approach.

Output student receives as a guide to improving performance

The first report the student receives is the Today-Tomorrow report. This report’s purpose is to say to the student, as clearly as possible:


1. Your performance today is at this level.
2. If your performance stays at this level, when the time comes for high school placement or college entrance tests, you will be at this level.
3. You could choose to improve on those predictions.
4. Here is how to improve on those predictions.

For each of the three EXCEL output reports, the student first receives a page describing and explaining the output, with her or his own results used as examples.


Aaberg Aaron   Grade 06   No. Mankato School

This is the Today-Tomorrow Report, which predicts future performance. Maximizing performance by 8th grade is preparation for two tests. The highest possible performance on those two tests is important to your future.

In 8th grade, a high school placement test like ACT’s EXPLORE is usually given. Why is that score so important? Because high performers are assigned to the more challenging high school classes, where they rub shoulders with other high performers intent on learning. That provides a comfortable place to learn.

The report tells where Aaberg Aaron is headed. However, by focusing on the errors identified by the tests taken, those 8th grade scores could increase. Your plan is provided here.

The second important test is a college entrance test such as the ACT. Reliable empirical evidence shows that 8th grade performance is a very strong predictor of ACT score. The intervening high school years are not an effective time to make up for low elementary school performance. High schools do very little remediation. Elementary school is the time to start trying to attain the ACT score desired.

Here are some highlights from the Today-Tomorrow report:

The composite percentile rank of Aaberg Aaron, who is now in grade 06, is 91.

A series of predictions is made based on percentile ranks for the last two years. These appear in the next table:

Grade 8 ITBS Composite percentile rank predicted to be: 88
EXPLORE (8th grade placement; scale 1 to 25): 20
PLAN (10th grade test, part of the ACT series; scale 1-32): 23
ACT (college entrance; scale 1-36): 27
College Placement (see table second from bottom): 4.5

The next table shows how one college level higher could be reached. The table shows that the expected ACT score of 27 could be raised to 30 and that the college level would move from 4.5 to 5.1.

Those are hard-to-grasp numbers. Thus the necessary change is translated into raw score units. Each year from fall of grade 6 to the end of grade 8, scores must gain this many raw score points:

Reading must grow by 1.9 raw score points per year.
Language must grow by 2.5 raw score points per year.
Math must grow by 2.3 raw score points per year.
Composite must grow by 6.7 raw score points per year.
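The per-year targets above are simple arithmetic: the total raw-score gain needed by the end of grade 8 (for example, 3.8 points in Reading over two years) is divided across the remaining years. A minimal sketch, assuming an even split per year (the function name and even-split assumption are mine, not ISLIP’s):

```python
# Sketch: split each total required raw-score gain evenly across the
# remaining school years. The even split is an assumption; the actual
# report may weight years differently.

def per_year_gains(total_gains: dict, years: int = 2) -> dict:
    """Return the raw-score gain needed per year for each test."""
    return {test: round(total / years, 1) for test, total in total_gains.items()}

# Total gains needed from fall of grade 6 to the end of grade 8.
needed = {"Reading": 3.8, "Language": 5.0, "Math": 4.6, "Composite": 13.4}
targets = per_year_gains(needed)
```

With the totals above, the per-year targets match the figures in the text: 1.9, 2.5, 2.3, and 6.7 raw score points.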

* Show 'em you know!


Aaberg Aaron   Form: Today-Tomorrow   N14xxx   No. Mankato School

Percentile ranks as of Today
(Below grade 6: see Predict 6. Grade 6 or above: see Predict 8.)

                  Actual   Actual   Actual   Actual            Predict
Measure           Grade 3  Grade 4  Grade 5  Grade 6  Grade 7  Grade 8
Reading Total       89       91       83       91                 87
Language Total      87       73       90       83                 87
Math Total          86       79       83       95                 90
Composite           87       82       86       91                 88

Given your 'today' performance, what do high school/college scores look like?

             Future: ITBS grade 8        In grade 8  In grd. 10  Gr. 11-12  College
             Scaled Score  Perc. Rank    EXPLORE     PLAN        ACT        Level
Reading          281          87            18          21          25        4.2
English          296          87            22          26          30        5.1
Math             292          90            19          22          26        4.4
Composite        290          88            20          23          27        4.5

Here is what the College Level column means:

                    ACT score   Nat. ACT     Level
                    range       perc. rank   Code
Highly Selective    30 - 36     95 - 99      5.1 - 5.9
Selective           24 - 29     74 - 94      4.1 - 4.9
Traditional         20 - 23     42 - 73      3.1 - 3.9
Liberal             16 - 19     21 - 42      2.1 - 2.9
Open                Up to 15    Up to 20     1.1 - 1.9
(Level codes based on most recent ACT norms.)

Predicted scores assume in the future you will change nothing!

But if you decide to improve, here is the PLAN! The next level is Highly Selective.
These ITBS scores are needed to attain that level:

                  Grade 6  Grade 6  Grade 7  Grade 8   New   New
Required SS/NPR     SS       NPR      SS       SS      ACT   Level
Reading Total      252       87      282      311
Language Total     265       87      281      296
Math Total         260       90      284      307
Composite          259       88      281      302      30    5.1

These raw score gains are needed:
                  RS Gain   RS Gain   Total RS
Test              6 to 7    7 to 8    Gain
Reading Total       1.9       1.9       3.8
Language Total      2.5       2.5       5.0
Math Total          2.3       2.3       4.6
Composite           6.7       6.7      13.4


The second report is the Past and Present report which summarizes the scores from current and prior tests taken by this student. The student sees:

1. How performance is reported for Common Core State Standards categories.

2. How frequently her or his scores have gone up between grades. (Remember, the push for improvement is the central ISLIP theme.)

3. Her or his strongest and weakest performance levels.

The third report, the Fix Errors Report, is the one any student can use to actually cause performance gains. These are the gains that can cause the predictions shown in the Today-Tomorrow Report to be too low. Parents are allowed to pitch in and help in this effort.

1. The student sees her or his top six error categories in English-Language Arts and Math. These are specific, identified, fixable errors.

2. The student sees what percent of all her or his errors these six categories account for in English-Language Arts and Math.

3. The student sees how much percentile rank gain in the next year’s testing can be achieved by correcting just ¾ of the errors in these highest error categories.

Each of these has a page of instructions connected to the actual report. The instruction page and the actual table of values follow.


Aaberg Aaron   Grade 06   No. Mankato School   N14XXX

The eleven test categories in the tables conform to the Common Core State Standards, which have now been accepted by most U.S. states. Here is how the eight basic subtests are used in the totals:

Conventions of Writing uses Spelling, Capitalization, and Punctuation, equally weighted.

English Language Arts Total uses Reading (33%) + Written Expression (33%) + Vocabulary (16%) + Conventions of Writing (18%).

Math Total uses Math (67%) + Math Computation (33%).

The first big table shows those eleven categories as scaled scores. Scaled scores measure performance capacity. If your scaled score goes up from grade 3 to grade 4, then your performance capacity increased, which means you have more knowledge. Your scores went up from grade to grade 85% of the time.

A percentile rank compares this student's performance capacity to that of others at the same grade level. A percentile rank of 81 means the student has the capacity to perform at or above 81% of students in his or her grade group in this country. A student who stays at the same level across the school years would have percentile ranks going up 50% of the time and going down 50% of the time.

Your percentile ranks were up 55% of the time grade to grade.

Some tests are 'rules-driven' or 'usage' tests, where capacity to perform depends on how well the student knows the rules. Examples are Conventions of Writing (Spelling-Capitalization-Punctuation) and Math Computation.

The other tests use specific new information that must be understood to answer the questions. Reading uses new information. Written Expression addresses paragraph and sentence order and clarity. Math applies prior learning to problems and data expression. Three comparisons are made. For you:

English Language Arts to Math: About the same.
Conventions to English Lang. Arts: Conventions is higher.
Math to Math Computation: Math is higher.

If you have scores for at least four years, your steepest gains and losses will be shown.

Steepest gains: Math (Significant)
Steepest losses: Capitalization (Significant)

Show 'em you know!


Aaberg Aaron   Grade 06   Your scores: Past and present   No. Mankato School   N14xxx

Score Card:                               Number  Possible  As a %
Scaled scores up                            28       33       85%
Scaled scores same or down                   5       33       15%
Percentile rank scores up                   18       33       55%
Perc. rank scores same or down              15       33       45%
Compare ELA and Math totals                 About the same.
Compare Conventions to ELA total            Conv. is higher.
Compare Math to Math Computation            Math is higher.
Steepest upward trend across grades         Math (Signif.)
Steepest downward trend across grades       Capit. (Signif.)

Scaled Scores measure your capacity to perform on the topic tested.
If scores were up, + appears.

Test                                 Grade 3  Grade 4  Grade 5  Grade 6*  Grade 7  Grade 8
Vocabulary                             196     207+     229+     250+       0        0
Reading Comprehension                  207     240+     232      265+       0        0
Spelling (1/3 of Conventions)          185     199+     245+     245        0        0
Capitalization (1/3 of Conv.)          231     229      265+     275+       0        0
Punctuation (1/3 of Conv.)             181     213+     246+     238        0        0
Written Expression                     199     196      248+     281+       0        0
Mathematics                            193     212+     231+     270+       0        0
Computation                            191     194+     223+     248+       0        0
Conventions of Writing (CW)            199     214+     252+     253+       0        0
English-Language Arts Total (ELA)      201     215+     241+     266+       0        0
Math Total (MT)                        192     206+     228+     263+       0        0

Percentile Ranks (based on national norms obtained in 2011)

Test                                 Grade 3  Grade 4  Grade 5  Grade 6*  Grade 7  Grade 8
Vocabulary                              84      76       84+      88+       0        0
Reading Comprehension                   90      95+      79       90+       0        0
Spelling (1/3 of Conventions)           74      66       90+      79        0        0
Capitalization (1/3 of Conv.)           98      86       90+      85        0        0
Punctuation (1/3 of Conv.)              60      75+      81+      66        0        0
Written Expression                      83      56       83+      89+       0        0
Mathematics                             83      82       81       95+       0        0
Computation                             91      63       82+      85+       0        0
Conventions of Writing (CW)             86      83       96+      87        0        0
English-Language Arts Total (ELA)       89      80       83+      87+       0        0
Math Total (MT)                         86      79       83+      95+       0        0

*Indicates year of norms change.   Show 'em you know!
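The Score Card values at the top of this report are straightforward counts over the score tables. A minimal sketch, with the scaled scores for grades 3 through 6 copied from the table above; the function and names are illustrative assumptions, not the report software.

```python
# Sketch: count grade-to-grade scaled-score increases, as the Score Card does.
# Scaled scores for grades 3-6, copied from the Past and Present table.

SCALED = {
    "Vocabulary":            [196, 207, 229, 250],
    "Reading Comprehension": [207, 240, 232, 265],
    "Spelling":              [185, 199, 245, 245],
    "Capitalization":        [231, 229, 265, 275],
    "Punctuation":           [181, 213, 246, 238],
    "Written Expression":    [199, 196, 248, 281],
    "Mathematics":           [193, 212, 231, 270],
    "Computation":           [191, 194, 223, 248],
    "Conventions (CW)":      [199, 214, 252, 253],
    "ELA Total":             [201, 215, 241, 266],
    "Math Total":            [192, 206, 228, 263],
}

def count_ups(scores: dict) -> tuple:
    """Return (number of grade-to-grade gains, number of transitions)."""
    ups = sum(b > a for row in scores.values() for a, b in zip(row, row[1:]))
    total = sum(len(row) - 1 for row in scores.values())
    return ups, total

ups, total = count_ups(SCALED)
pct = round(100 * ups / total)
```

Counting this way reproduces the Score Card line: scaled scores were up 28 of 33 times, or 85%.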

Aaberg Aaron   Grade 7   No. Mankato School   N14XXX


The engine that drives improvement: the 'Fix Errors Report' *

Suppose you got the problem '61 - 19 = ' wrong. Most testing programs will report 'Computation error' or 'Addition-subtraction error.' But the 'Fix Errors Report' actually identifies the error made. Examples of different errors for the same problem:

An answer like 52 is a C01 Carry-borrow-rename error.
An answer like 43 is a C03 First digit is wrong error.
An answer like 80 is a C04 Wrong math operator error.
An answer like 32 is an E02 Avoidable add/subtract error.

In the English Language Arts test every single error you make is assigned to one of twenty-seven very specific error categories. In the Math Total part of the test every single error you make is assigned to one of twenty-six very specific error categories.

The 'Fix Errors Report' concentrates on the errors YOU make! And in the next test you take YOU WILL be given the opportunity to make the SAME ERRORS again. Repeat: Those wrong answer types will appear again. And again. And again!

Errors you made in the ELA area in your top six categories: 19
Errors you made in the Math Total area in your top six categories: 9

Help is available.

Your first ELA error was: E09 Written communication errors
Go to wickpartners.com/LanguageErrors and help is at page 30

Your first Math error was: C03 1st digit or dec. place wrg.
Go to wickpartners.com/MathErrors and help is at page 41

More specific help information is provided to you for every error you made. By correcting 3/4 of the errors made this year before the next year's test, you can raise your scores this much:

ELA Total percentile rank from 87 to 90.
Reading percentile rank from 90 to 97.
Conventions percentile rank from 77 to 90.
Written Expr. percentile rank from 89 to 89.
Math Total percentile rank from 92 to 97.
Computation percentile rank from 85 to 93.

* Show 'em you know!



Aaberg Aaron   Grade 7   ID   Fix Errors Form   No. Mankato School   N14xxx

CCSS English Language Arts

Each error coded as 1 of 27 categories. Highest 6 categories listed below.

ELA top error categories              Rank      Errors   % of all
E09 Written communication errors      Highest     5        14%     Correct 'em!
E17 Capitalization proper nouns       2nd         4        11%     Correct 'em!
E11 Spelling: common rule errors      3rd         3         8%     Correct 'em!
E27 Vocabulary--modifiers             4th         3         8%     Correct 'em!
E04 Contradicts part of passage       5th         2         6%     Correct 'em!
E13 Spelling 'sounds like' error      6th         2         6%     Correct 'em!
Total errors top six categories                  19
They represent this % of all errors                       53%

CCSS Mathematics

Each error coded as 1 of 26 categories. Highest 6 categories listed below.

Math top error categories             Rank      Errors   % of all
C03 1st digit or dec. place wrg.      Highest     3        25%     Correct 'em!
G02 Geometry real-world appli.        2nd         2        17%     Correct 'em!
A01 Basic pre-algebra skills          3rd         1         8%     Correct 'em!
E05 Avoidable fraction error          4th         1         8%     Correct 'em!
D01 Error format or directions        5th         1         8%     Correct 'em!
D02 Direct-read from display          6th         1         8%     Correct 'em!
Total errors top six categories                   9
They represent this % of all errors                       75%

Correct just 3/4 of errors and make these gains:

                            PR now   New PR   Change   SS now   New SS   Change
Vocabulary                    88       94        6       250      258       8
Reading                       90       97        7       265      288      23
Spelling                      79       93       14       245      273      28
Capitalization                85       98       13       275      307      32
Punctuation                   66       79       13       238      262      24
Written Expression            89       89        0       281      281       0
Mathematics                   95       99        4       270      287      17
Math Computation              85       93        8       248      261      13
Conventions of Writing        77       90       13       253      281      28
Eng.-Language Arts Total      87       90        3       266      280      14
Math Total                    92       97        5       263      278      15

Show 'em you know!



The fourth form, shown below, provides the “targeted interventions” plus a longer, more descriptive definition of that student’s six English-Language Arts and six Mathematics errors. The page number of the remediation package in the tutorial is provided.

Aaberg Aaron Grade 7 06 No. Mankato School Targeted interventions

English-Language Arts Errors

E09 Written communication errors -- usage errors in letters, titles, abbreviations of individual titles like 'Dr.', and incorrect use of the apostrophe, colon, semicolon, dash, or ellipsis. See wickpartners.com/LanguageErrors page 30.

E17 Capitalization of proper nouns -- includes names of organizations, geographic locations, events, and individuals or pets, as well as holidays (unfamiliar holidays identified in context). See wickpartners.com/LanguageErrors page 58.

E11 Spelling errors which fall under the category of commonly taught rules for correct spelling. The four spelling categories include a long list of spelling words often found in tests of all kinds. See wickpartners.com/LanguageErrors page 39.

E27 Vocabulary errors with words usually used as modifiers. A long coded list of words commonly used in vocabulary tests is in the tutorial in a test format. See wickpartners.com/LanguageErrors page 77.

E04 Contradicts part of passage -- the answer chosen is contradicted by a part of the passage. Like the Word Match error, this error is found mostly with students who do not read the entire passage before responding to questions. See wickpartners.com/LanguageErrors page 1.

E13 Spelling errors which fall under the 'sounds like' category: words not spelled the way they sound. The spelling categories include a long list of spelling words found in tests of all kinds. See wickpartners.com/LanguageErrors page 44.

Mathematics Errors

C03 Decimal in the wrong place, or one digit of the answer could easily have been identified as incorrect. In a problem such as 42 x 69, the student learns in the tutorial that the ones digit must be an 8 (2 x 9 = 18). See wickpartners.com/MathErrors page 41.

G02 Geometry real-world applications -- includes real-world applications as well as basic knowledge about length, area, and volume in a variety of mathematical problem formats. See wickpartners.com/MathErrors page 79.

A01 Basic pre-algebra skills -- includes elementary number-sentence understanding using a family of facts in alternate formats. This area is greatly enhanced in importance by the Common Core State Standards. See wickpartners.com/MathErrors page 2.

E05 Avoidable fraction error -- students learn to see a fraction like 15/16 as 'almost one' or 7/15 as 'almost one-half' to estimate a correct answer to a fraction problem. See wickpartners.com/MathErrors page 33.

D01 Basic data display error -- made by not understanding the format of the table or graph, or the directions given for displays (such as one dog picture stands for ten dogs). See wickpartners.com/MathErrors page 52.

D02 Direct read from display -- made when asked to make a fairly simple, direct read from a table, graph, or other data display, often connected to using the wrong axis for a reading. See wickpartners.com/MathErrors page 52.
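Two of the tutorial checks described above, the C03 ones-digit test and the E05 fraction benchmark, can be sketched in code. The function names are illustrative; the tutorial itself teaches these as mental estimates.

```python
# Sketch of two error-avoidance checks described in the text.

def ones_digit_of_product(a, b):
    # The ones digit of a product depends only on the ones digits of
    # the factors, so 42 x 69 must end in 8 (2 x 9 = 18).
    return (a % 10) * (b % 10) % 10

def nearest_benchmark(num, den):
    # E05: read 15/16 as 'almost one' and 7/15 as 'almost one-half'
    # by snapping the fraction to the nearest of 0, 1/2, and 1.
    return min((0.0, 0.5, 1.0), key=lambda mark: abs(num / den - mark))
```

Either check lets a student reject an implausible multiple-choice answer before doing any exact arithmetic.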



The final insight is a mixed blessing.

Seventh: Success is a satisfying feeling. For quite a number of years, my goal was to create a system that used test information to improve student performance on measures that predict future success. ISLIP does that. In the past two years, in schools using ISLIP, about 2 out of every 3 students showed gains in year-to-year assessments.

The “mixed blessing” part is built around this final insight:

No one really cares.

When students and parents saw projections to high school and college placement test scores beginning at grade 4 in the Today-Tomorrow report, that information caught the attention of at least two-thirds. But the school follow-through, the relentless day-by-day push for excellence that requires teacher support, would require change. And change, as Newton explained, does not happen on its own: "A body at rest remains at rest unless acted upon by an external force."

Where is the external force?

A lot of effort and money went into No Child Left Behind. The good news: many, many students did attain higher performance levels. The bad news: the results from that eighteen-year period do not seem to suggest that our nation's graduates have gained much in international comparisons. A detailed account can be found at wickpartners.com/StudentPerf1995-2012.

A brave response, the Common Core State Standards, aimed to toughen the curriculum and connect teacher evaluations to student success. If recent reports of the response by one state (Illinois) are an indicator, CCSS success will mirror No Child Left Behind success. Why?

The key missing part is that "success" is again linked to the status quo. ISLIP connects "success" to "improvement," where improvement is defined as "a substantial percent of individuals have higher scores the second year than in the baseline year"; the status quo, where 50% go up and 50% go down, is at best reluctantly accepted.

Unlike ISLIP, elementary and middle school success is not firmly connected in the Illinois model to measures that predict critical life events, such as high school placement and college availability measures. Those apparently are delayed until high school. Success by the grade 8 level is critical. The pressure cannot be held back until high school. For success, the pressure has to start early in elementary school.

And, once again, if the school happens to sit in the middle of wealth, the administrators and teachers will not need to change anything to maintain the status quo, easily and unchallenged. A school built in an impoverished community,



however, will once again play against a stacked deck. In the ISLIP approach, the school among wealth and the school among poverty each establish their own baseline. The deck is not stacked. The most coveted annual reward is won only if year-to-year individual student performance scores (NOT the mean; the number of students) are UP from the baseline. Maintaining the status quo is a second-place finish. That would not be very satisfying to a district spending $20,000 per year per student.
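The scoring distinction drawn above, counting individual students who improved rather than comparing means, is simple to compute. This is a hypothetical sketch under my own naming, not ISLIP's code; the sample scores are invented for illustration.

```python
# Judge year-to-year success by the share of individual students whose
# second-year score exceeds their own baseline, not by the change in
# the group mean.

def percent_up(baseline, year2):
    """Percent of matched students whose year-2 score beats baseline."""
    matched = [s for s in baseline if s in year2]
    up = sum(1 for s in matched if year2[s] > baseline[s])
    return 100.0 * up / len(matched)

baseline = {"ann": 210, "ben": 240, "cal": 198, "dia": 260}
year2    = {"ann": 221, "ben": 238, "cal": 205, "dia": 259}
# The mean rises a few points here, yet only half of the students
# actually improved -- the figure percent_up makes visible.
```

Because every school computes the metric against its own baseline, a wealthy school and an impoverished school compete on the same terms.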

Is this country so wrapped up in discussions of the national debt, abortion, same-sex marriage, benefits for the elderly and most needy, and climate change that a central issue such as student performance is simply forgotten? What used to be called "legislatures" have apparently been renamed "gridlock." Does anyone believe that group would have the courage to connect "school success" to "student performance improvement"? "Courage" is not a word people link anymore to "Representative" or "Senator." In this day and age, "self-interest" works as an effective synonym for each elected group.

No. Frankly, the issue of student performance gains, which this country needs right now, is viewed as so unimportant that it never even surfaces in any discussion.

That’s too bad.
