Final AttendanceAudit 02112012
Transcript of Final AttendanceAudit 02112012
7/29/2019 Final AttendanceAudit 02112012
1/127
Statewide Audit of
Student Attendance Data
and Accountability System
February 11, 2013
Dave Yost
Ohio Auditor of State
On the Cover: Engraved phrase from the
Thomas J. Moyer Ohio Judicial Center.
To the People of the State of Ohio:
In response to reports of irregular student attendance, enrollment and withdrawal practices within multiple school districts and a statewide concern over the integrity of the Ohio Department of Education's (ODE) accountability and reporting system, the Auditor of State's Office completed an audit in accordance with Ohio Revised Code Section 117.11. This audit includes an objective review and assessment of ODE's accountability policies, procedures and data, and local school district attendance, enrollment, withdrawal and reporting practices.
This final report includes an executive summary, project history, scope, objectives, methodology, and summary of the audit. It also provides the results of the assessments and corrective action recommendations.
This engagement is not a financial or performance audit, the objectives of which would be vastly different. Therefore, it is not within the scope of this work to conduct a comprehensive and detailed examination of local school report cards or Ohio's accountability system. Additionally, certain information included in this report was derived from ODE, Information Technology Center (ITC), and school district Student Information Systems (SIS), which may not be completely accurate. More than 260 AOS auditors were assigned to this engagement over the course of the audit and, as of February 4, 2013, the audit cost was $443,099 and total audit hours were 10,807.
This report has been provided to the ODE and its results were discussed with the schools selected for testing. ODE is encouraged to use the results of this review as a resource in improving its accountability guidance and compliance monitoring.
Additional copies of this report can be requested by calling the Clerk of the Bureau's office at (614) 466-2310 or toll free at (800) 282-0370. In addition, this report can be accessed online through the Auditor of State of Ohio website at http://www.ohioauditor.gov by choosing the Audit Search option.
Sincerely,
Dave Yost
Auditor of State
February 11, 2013
TABLE OF CONTENTS
1. Executive Summary 1
2. Project History 6
3. Objectives and Scope 7
4. Overview of Accountability 8
5. Overview of Statewide Student Identifier 11
6. Breaking Enrollment 13
7. Support Roles in Accountability 16
8. Use of Reports and Other Data Sources 17
9. Methodology 18
9.1. John Glenn School of Public Affairs, The Ohio State University 20
10. Summary of Results 30
10.1. Systemic Statewide Issues 30
10.2. Recommendations to General Assembly and Ohio Department of Education 34
10.3. Schools with Evidence of Scrubbing 45
10.3.1. Columbus City School District 45
10.3.2. Toledo City School District 46
10.3.3. Cleveland Municipal City School District 46
10.3.4. Cincinnati City School District 49
10.3.5. Marion City School District 50
10.3.6. Campbell City School District 50
10.3.7. Canton City School District 51
10.3.8. Northridge Local School District 53
10.3.9. Winton Woods City School District 53
10.4. Conclusion 54
11. Schools Selected for Testing 55
11.1. Phase One 55
11.2. Phase Two 60
11.3. Phase Three 64
12. Results of Student File Testing for Supporting Documentation 70
12.1. Schools with Evidence of Scrubbing 70
12.2. Schools with Errors 84
12.3. Clean Schools 104
12.4. Additional 28 School Districts 107
12.5. Phase Three Community Schools 110
12.6. Phase Three Other Schools 111
13. School District Exclusion List 114
14. Views of Responsible School Officials 116
15. Appendix 116
Statewide Audit of Student Attendance Data and Accountability System
1. EXECUTIVE SUMMARY
In response to requests from Columbus City Schools and the Ohio Department of Education to examine student attendance reporting practices, along with reports of actual or suspected inaccuracies in attendance reporting practices at several school districts in Ohio, the Auditor of State's office initiated a statewide review of attendance reporting in July 2012. This is the final report of this audit work.
The purpose of this review was threefold: (1) to identify systemic, and potentially duplicitous, student attendance and enrollment practices among Ohio schools; (2) to provide recommendations to the Ohio Department of Education (ODE) and Ohio General Assembly for making future policy and legislative improvements to Ohio's accountability system; and (3) to determine whether schools were scrubbing enrollment data.
Ordinarily, local report cards include only students enrolled for the full academic year, or FAY. A student must be enrolled continuously at a single school from the end of October count week to May 10th for grades 3-8, or March 19th for all other grades, to qualify for the full academic year of attendance. When a lawful break in enrollment occurs (e.g., a student relocates to a new district), school districts move the student's test scores to the State's report card; in such cases the scores no longer appear in the accountability data for the local district. Furthermore, if a student transfers between schools within the same school district, the student's test score is similarly moved, or rolled up, from the school report card to the school district's overall report card.
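The FAY window and roll-up rules described above can be sketched in Python. This is an illustrative model only: the exact end date of October count week is an assumption (the report does not state it), and the function and variable names are hypothetical, not anything defined by ODE.

```python
from datetime import date

# Assumed end of October count week for the 2010-11 school year (illustrative).
COUNT_WEEK_END = date(2010, 10, 8)
# FAY cutoffs per the report: May 10 for grades 3-8, March 19 for all other grades.
FAY_CUTOFF_3_8 = date(2011, 5, 10)
FAY_CUTOFF_OTHER = date(2011, 3, 19)

def is_fay(enrollments, grade):
    """True if one continuous enrollment at a single school spans from the
    end of count week through the cutoff date for the student's grade.

    `enrollments` is a list of (school_id, start_date, end_date) tuples.
    """
    cutoff = FAY_CUTOFF_3_8 if 3 <= grade <= 8 else FAY_CUTOFF_OTHER
    for school, start, end in enrollments:
        if start <= COUNT_WEEK_END and end >= cutoff:
            return True  # a single school covers the whole FAY window
    return False

def report_card_bucket(enrollments, grade, same_district):
    """Decide where a tested student's score counts under the rules above."""
    if is_fay(enrollments, grade):
        return "school report card"
    # A break in enrollment rolls the score up: to the district report card
    # for an in-district transfer, otherwise to the State report card.
    return "district report card" if same_district else "state report card"
```

For example, a fourth-grader enrolled at one school from September through June counts on that school's report card, while one withdrawn in January rolls up to the district or State card depending on where the student re-enrolled.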
Phase One: First Interim Report
The Auditor of State's office issued its first report October 4, 2012. The initial phase of the audit selected 100 schools from 47 school districts with the highest number of students that took assessment tests and whose test scores were subsequently rolled up to the State, thereby alleviating the district from accountability for the performance of those students. Five school districts identified in the report were found to have improperly withdrawn students from their enrollment. They were Campbell City School District (Mahoning County), Cleveland Municipal City School District (Cuyahoga County), Columbus City School District (Franklin County), Marion City School District (Marion County), and Toledo City School District (Lucas County).
Phase Two: Second Interim Report
In November, 184 school districts in Ohio had levies or bond issues on the ballot. To alleviate to the extent practicable concerns about these districts, the Auditor of State's office selected 81 schools in 47 districts to test for questionable student attendance practices in the second phase of the statewide audit, issued October 23, 2012. The schools tested in the first phase of the audit were excluded from the second phase sample. Of the 81 schools tested in this phase:
53 schools were considered clean with no issues identified to date;
20 schools had records containing sporadic errors; and
8 schools still had testing ongoing and were considered indeterminate at the time of the report.
The Auditor of State's office also excluded an additional 26 districts from testing based on their low percentage of tested students rolled up to the State for the 2010-2011 school year.
The Definition of Scrubbing:
This report defines scrubbing as removing students from enrollment without lawful reason, regardless of motivation. The term does not necessarily imply malicious intent.
Phase Three: Final Report
This report constitutes the third and final phase of the student attendance data and accountability audit.
Schools with Evidence of Scrubbing
The final report identifies four school districts in addition to the five school districts identified in the October 4 report that were found to have improperly withdrawn students from their enrollment. The additional districts are marked in boldface below.
The nine districts are Campbell City School District (Mahoning County), Canton City School District (Stark County), Cincinnati City School District (Hamilton County), Cleveland Municipal City School District (Cuyahoga County), Columbus City School District (Franklin County), Marion City School District (Marion County), Northridge Local School District (Montgomery County), Toledo City School District (Lucas County), and Winton Woods City School District (Hamilton County).
Schools with Errors
More than seventy (70) schools or districts were identified as having errors in attendance reporting. Auditors did not conclude that these errors were evidence of scrubbing.
The Auditor of State recommends that ODE review schools with evidence of scrubbing or with errors to determine whether any further assessment of the school report cards by ODE is necessary, and also to inform ODE judgments regarding the recommendations in this report.
Recommendations
Kids Count Every Day
The Auditor of State recommends basing State funding upon year-long attendance numbers, i.e., that money follow the student in approximate real time. Doing so would create an environment in which school districts that currently use attendance incentives for October count week (often with great success) would themselves have incentives to encourage attendance throughout a student's entire year. Importantly, schools that break enrollment under such a system would suffer a loss of funding as a result.
Increase Oversight of School Districts
While ODE has relied heavily on an honor system for district reporting, the system should be reformed by introducing independent oversight. Both ODE and districts would benefit from expanded cross-checks and data monitoring throughout the school year. This would greatly enhance ODE's ability to identify and correct mistakes or detect fraud in data reporting, particularly in the Education Management Information System (EMIS). EMIS monitoring functions should be performed by an independent agency or commission appointed by the General Assembly.
If it is not feasible to conduct such monitoring efforts throughout the school year, then monitoring should be conducted in close proximity to the close of the academic school year. ODE and the General Assembly should consider enacting penalties and taking corrective measures, such as temporary suspension of State Foundation funding or federal funding for noncompliant schools, until significant inaccuracies are fully corrected by noncompliant schools.
The widespread nature of data irregularities and questionable attendance practices demonstrates, at the very least, a lack of oversight by ODE over attendance reporting. To the extent that existing statutes contribute to an environment that makes ODE's role unclear, or cumbersome, those statutes should be amended to reflect the need for a robust, State-level accountability function within the Ohio tradition of local school control. Such changes may require additional resources or re-tasking existing resources to accomplish.
Monitor Programs for At-Risk Students
ODE assigns unique internal retrieval numbers (IRNs) to all schools, districts and certain special academic programs. AOS recommends ODE regularly monitor assigned IRNs to ensure schools are still using their approved IRNs for the originally-intended purpose. Additionally, AOS recommends the General Assembly provide express authority to ODE or another appropriate agency to monitor and independently verify at-risk student transfers to alternative school programs to ensure such transfers are made for valid legal reasons and the respective student performance ratings are reflected in the appropriate school or State's report card. This will provide greater consistency in the accountability data among schools for students receiving interventions in lieu of expulsion or suspension.
Increase EMIS Training
The General Assembly should develop minimum continuing professional education requirements for school personnel who use EMIS. Currently, federal and State laws do not do so. Especially when one considers that Federal and State accountability rules and regulations are further complicated by the Ohio school funding model1 (which is separate and distinct from Federal and State accountability provisions), it is little wonder that education stakeholders have observed inconsistencies in report card data or instituted policies and practices that, in some cases, may cause errors in accountability. Providing baseline and continuing education to school EMIS personnel is critical to shoring up and ensuring the integrity of Ohio's accountability system.
Increase Use of Automation to Protect Data and Process Integrity
AOS recommends the General Assembly consider enacting legislation and providing the necessary funding to implement an automated student performance assessment-based testing system. This would allow more prompt reporting of test results, enabling information about progress toward college and career readiness to be included on report cards on a more timely and consistent basis. It also would significantly reduce the risk of error or omission. As part of this initiative, the General Assembly should consider a needs-assessment study to appropriately finance this system and ensure a reasonable implementation period that considers the needs of all users. This may require certain steps to be phased in over time. AOS further recommends the General Assembly require test administration by independent proctors and that vendors submit student assessment scores directly to ODE throughout the year to be used for the calculation of adequate yearly progress (AYP) and the local report card.
State Monitoring of Student Withdrawals
To improve monitoring efforts, ODE should generate statewide school reports by student name and SSID number for key enrollment and withdraw codes. ODE should utilize these
1 Governor Kasich announced his plan, Achievement Everywhere: Common Sense for Ohio's Classrooms, on January 31, 2013. This proposed plan is a part of the 2014-2015 biennial state budget and could impact the State's school funding model.
reports to perform analyses and cross-check the timing of student withdrawals and subsequent enrollments against EMIS data reported by individual schools for completeness and accuracy.
Statewide Student Identifier System
The General Assembly should change existing law to allow ODE to have access to names of students and other personal information with necessary privacy protections consistent with Federal law. The current statutory constraint imposes significant costs on both ODE and on users of the Statewide Student Identifier (SSID) system without providing additional privacy protections beyond those required by Federal law. Only two states have been identified that operate under such a restriction. This recommendation was given in an interim report of the performance audit of ODE issued October 8, 2012. The finding and recommendation were further supported during the review of attendance data. This restriction was an impediment to our auditors and should be removed to allow ODE to have access to student names and necessary information, with privacy protections.
Establish Separate Tracking for Community School Withdrawals
AOS recommends ODE create a separate and distinct withdraw code in EMIS for community schools, because of unique requirements for community school funding and monitoring.
Protect Report Card Results from Security Vulnerabilities
ODE should remove the report card performance rating information from the Secure Data Center (SDC), allowing school districts only to verify EMIS data submissions with no access to projected rankings. This will reduce schools' ability to change the outcome of their local report card. While the concept of the SDC was to correct or verify EMIS information, allowing school districts to see the projected report card ratings prior to the finalization of EMIS data gives the school districts the opportunity to intentionally scrub or change EMIS report card data to improve the outcome of the district's final report card ratings.
Centralize Accountability Resources
ODE should provide a centralized index that helps connect accountability resources maintained in various locations on its website for school districts to use in reporting student attendance, enrollment, and other important report card factors. ODE should develop a centralized location on its website to provide clear instruction on accountability requirements and how they relate to EMIS reporting.
Statewide Student Information System
The General Assembly should establish a single statewide student information system so that all data is uniform, consistently reported, and accessible for data mining. Alternatively, if such a system is not feasible, the General Assembly should require ODE to approve the Student Information System used by each district in the State to ensure it meets requirements.
Document Student Withdrawals
ODE should clarify its EMIS Manual and administrative rules to require (and not merely suggest) what types of evidentiary documentation must be maintained for each of the EMIS withdraw codes.
Withdrawal of Foreign Exchange Students
ODE should revise its Accountability Workbook and Where Kids Count Business Rules to provide clarity on enrollment issues pertaining to foreign exchange students. During testing of student attendance and accountability records, AOS observed inconsistent treatment of foreign exchange students among schools. Due to the lack of ODE guidance in this matter, it is unclear whether a break in enrollment was appropriate in these circumstances.
Conclusion
This report includes findings from the AOS statewide assessment of school year 2010-11 student attendance and enrollment practices for select Ohio schools. AOS will refer the schools with evidence of scrubbing to ODE for further investigation and recalculation of the school report cards. AOS also will request that ODE consider reviewing the schools with errors identified in this report to determine whether the number or nature of errors AOS identified requires further assessment of the school report cards by ODE. Similarly, the schools with evidence of data scrubbing will be referred to the U.S. Department of Education Office of the Inspector General (IG) for review. It is anticipated that the IG will review these findings in the context of Federal law, and will consult with the United States Attorneys for the Northern and Southern Districts of Ohio.
AOS also updated its regular school district financial audit and single audit procedures to include testing for irregular attendance practices and potential scrubbing for fiscal year 2011-2012 and subsequent audit periods.
The Auditor of State's office extends its gratitude to the State Board of Education, the Ohio Department of Education, and the many school districts and organizations throughout the State that supported and cooperated with this audit.
In conducting this audit, the Auditor of State's office worked extensively with The John Glenn School of Public Affairs at The Ohio State University to develop statistical procedures and data management strategies in support of audit goals. The Auditor of State expresses his appreciation to The Ohio State University for its valuable contribution.
Most importantly, the Auditor of State's office extends its gratitude to the people of Ohio for supporting this work.
2. PROJECT HISTORY
The Elementary and Secondary Education Act (ESEA) was amended by the No Child Left Behind (NCLB) Act of 2001, which was signed into law on January 8, 2002. Under the NCLB model, a school's report card specifies its performance as compared to other schools in Ohio. Specifically, the NCLB school report card displays student achievement data in reading, mathematics, science and other core subjects required by the State so that parents and the public can see how their schools are progressing over time. In addition, the report card includes information on student attendance rates and graduation rates.
A school's performance on the report card can be affected by the students counted in the scoring. If the scores of low-performing students can be excluded from a particular school's report card, the overall performance of that school shows a corresponding improvement. This effect is described in a July 25, 2012, letter from the Ohio Department of Education (ODE) to the Lockland School District, which found that attendance data had been falsely reported; ODE revised downward the school district's report card rating. A copy of this letter is provided in the Appendix of this report.
There are four components to Ohio's accountability system: State Indicators, Performance Index Score, Value-Added, and Adequate Yearly Progress (AYP). The State Indicators are generally based on the number of State assessments given over all tested grades. To earn each indicator, a district or school needs to have a certain percentage of students reach proficient or above on a given assessment. Student test scores on the Ohio Achievement Assessment (OAA) and the Ohio Graduation Test (OGT) are State Indicators for the 2010-11 school year. The percentage of students per grade and test that were enrolled in the district for a Full Academic Year (FAY) are counted in the local report card. To have a day counted as an attendance day for meeting the FAY criterion, a student must be enrolled and in attendance during the year or be on expulsion status and receiving services from the school district (if the school district has adopted a policy as stated in paragraph (C) of Rule 3301-18-01 of the Ohio Administrative Code). Sometimes, however, allowable events occur that cause student scores to be removed from the local composite and included only in the statewide composite score.
Under No Child Left Behind (NCLB), there are several allowable ways student test scores can be excluded from an individual school's report card and rolled up to the school district-wide or State report card, as described in ODE's Where Kids Count (WKC) Methodology, a document available on ODE's website that explains ODE's business rules for counting students in the school, district-wide, and State-level report cards. Students do not always count at the school in which they are enrolled. For example, when a district makes the decision to educate a student in a location other than the resident school, the student will be counted in the resident school's results. An example is a school that educates all of the Limited English Proficient students in the district because of expertise or resources in one school; those students will count in their resident schools' report card results. Conversely, when a parent, guardian, or the courts place a student in another educational setting, those students will count in the educating school's report card results or, if in attendance for less than the FAY, those students will be rolled up to the State report card.
Our report focuses mainly on breaks in enrollment, which cause student test scores to be rolled up to the statewide composite report card. In this scenario, the local report card includes only students enrolled for the FAY. A student must be enrolled continuously at a single school from the end of October count week to May 10th for grades 3-8, or March 19th for all other grades, to qualify for the full academic year of attendance. When a lawful break in
enrollment occurs, school districts roll the student's test scores to the State's report card. Furthermore, if a student transfers between schools within the same school district, the student's test score is rolled up to the school district's overall report card. Schools break enrollment by withdrawing or enrolling students between October count week and the end of the academic school year, which can occur routinely among some Ohio public school districts.
Amid tough economic pressures and rigorous federal performance ranking requirements, some schools are incentivized to remove students with high absenteeism and lower test scores from their local report cards to boost performance measures used to determine government aid and improve school performance rankings. In fact, some schools also receive financial bonuses based on the school's ranking.
3. OBJECTIVES AND SCOPE
On August 11, 2011, Dr. Gene Harris, Superintendent of the Columbus City School District (CSD), requested that the Auditor of State (AOS) review the district internal auditor's finding that absences had been deleted from the Columbus CSD school attendance records. Dr. Harris indicated the Columbus CSD's internal auditor was made aware of these changes from a truancy officer who was handling a court truancy filing. The truancy officer discovered the absences originally recorded in the student attendance records for the students in question were altered after charges had been filed. AOS met with district officials, noting isolated attendance irregularities, and requested Columbus CSD continue to investigate the attendance data internally and contact AOS if further discrepancies were noted.
Later, on June 15, 2012, the AOS was requested by Columbus CSD to meet with their internal auditor to discuss the results of an internal audit on student withdrawal activity after an article was published in the local newspaper, The Dispatch. A representative of the AOS met with the Internal Auditor at Columbus CSD soon thereafter. Additional allegations of irregular attendance and enrollment practices surfaced in Toledo, and ODE uncovered similar practices in Lockland School District, leading to questions about the integrity of Ohio's accountability system statewide. As a result, AOS initiated a statewide systematic and objective assessment of school year 2010-11 student attendance and enrollment systems for more than 100 schools among 74 Ohio school districts.
The purpose of this review was threefold: (1) to identify systemic, and potentially duplicitous, student attendance and enrollment practices among Ohio schools; (2) to provide recommendations to the ODE and Ohio General Assembly for making future policy and legislative improvements to Ohio's accountability system; and (3) to determine whether schools were scrubbing enrollment data.
This engagement is not a financial or performance audit, the objectives of which would be vastly different.2 Therefore, it is not within the scope of this work to conduct a comprehensive and detailed examination of local school report cards or Ohio's accountability system. Additionally, certain information included in this report was derived from ODE, ITC, and school district SIS, which may not be completely accurate.
2 The AOS does not proclaim this work to be a performance audit in accordance with Generally Accepted Government Auditing Standards (GAGAS). By definition, a performance audit refers to an examination of a program, function, operation or the management systems and procedures of a governmental or non-profit entity to assess whether the entity is achieving economy, efficiency and effectiveness in the employment of available resources. The examination is objective and systematic, generally using structured and professionally adopted methodologies; however, adherence to standards is not a requirement.
4. OVERVIEW OF ACCOUNTABILITY
Prior to the federal NCLB Act of 2001, Ohio's accountability system focused on districts, not individual schools. The Ohio General Assembly put the accountability system in place for Ohio schools and districts in 1997. ODE began issuing official report cards at the student, school, and district levels in February 2000 (for the 1998-99 school year). Parents of school-aged students received reports of their children's performance on proficiency tests, the average performance on proficiency tests at their children's schools (as well as other measures such as attendance and graduation rates), and the district performance (which included proficiency test results, attendance and graduation rates, and a number of other performance measures). ODE and public libraries also made these report cards and related data available to the general public on their websites.
Whereas publicizing data might have provided incentives for students, schools, and districts to improve their performance, the accountability system at this time focused only on districts. Districts received various designations based on how many performance indicators they met. Originally, designations were based on 27 indicators (increased from 18 in 1997) that were given equal weight. The two non-cognitive indicators were based on requirements for a 93% attendance rate and a 90% graduation rate. The remainder of the indicators focused on the percent of proficient students according to State tests. The performance designations were calculated as follows:
Effective (26 or more indicators met),
Continuous Improvement (CI; 14 to 25 indicators met),
Academic Watch (AW; 9 to 13 indicators met); and
Academic Emergency (AE; 8 or fewer indicators met).
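The designation bands above amount to a simple threshold mapping. The following is an illustrative sketch (the function name is hypothetical; ODE's actual calculation involved additional rules not shown here):

```python
def designation(indicators_met):
    """Map the number of performance indicators met (out of 27 under the
    original system) to the district designation bands listed above."""
    if indicators_met >= 26:
        return "Effective"
    if indicators_met >= 14:
        return "Continuous Improvement"
    if indicators_met >= 9:
        return "Academic Watch"
    return "Academic Emergency"
```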
Out of more than 600 school districts in 2000, ODE deemed only 30 as effective and 200 as AW or AE. Districts labeled CI, AW, and AE were required by ODE to develop a three-year Continuous Improvement Plan (CIP). ODE regulated the contents of the CIP more heavily for AW and AE districts, including a requirement that ODE review those plans. Districts labeled as CI or below had to meet a standard unit of improvement every year. Thus, districts failing to meet the effective rating faced a long road of State administrative intervention. These sanctions began in the 2000-01 school year. School districts in AW and AE also received financial and technical assistance from ODE.
For the 2010-2011 school year, designations were based on 26 performance indicators with scores on assessment tests at 75% proficient or above. If the percentage of students scoring at or above the proficient level is greater than or equal to the State minimum standard, then the district met the standard for that State indicator. If the percentage of students at or above the proficient level is below the State minimum standard, then the district did NOT meet the standard for that State indicator. In sharp contrast to ODE's district designation rankings in 2000, for the 2010-11 school year ODE deemed 215 school districts as Effective, 36 as CI and 6 as AW or AE. The approximately 352 remaining school districts were Excellent or Excellent with Distinction.
Adequate Yearly Progress
Adequate Yearly Progress (AYP) originated from the Federal No Child Left Behind (NCLB) Act of 2001. The legislation led Ohio to calculate school-level ratings beginning in the 2002-03 school year and to incorporate the NCLB's new AYP requirement in the accountability
system. The AYP metric itself changed the accountability system by 1) focusing attention on
a particular set of indicators and 2) imposing significant sanctions if schools or districts failed
to meet any AYP indicator for more than one year (with some caveats). The Ohio AYP indicators
included meeting proficiency targets in math and reading for all of ten student subgroups,
achieving attendance and graduation rates of 93% and 90% respectively, and meeting test
participation rate requirements. The attendance rate requirement applied to elementary and
middle schools and the graduation rate requirement applied to high schools.
The Federal NCLB requires Ohio to set AYP goals each year and raise the bar in gradual
increments so that all of Ohio's students are proficient on State reading and mathematics
assessments by the 2013-2014 school year. To this end, Title I, Sections 1116(a) and (b)(1), (7),
and (8) of the Elementary and Secondary Education Act (ESEA) (20 USC 6316(a) and (b)(1),
(7), and (8)) and 34 CFR Sections 200.30 through 200.34 require school districts annually review
the progress of each school served under Title I, Part A to determine whether the school
has made AYP. Every school and district must meet AYP goals that the ODE Accountability
Model (approved by USDOE) has established for reading and mathematics proficiency and
test participation, attendance rate and graduation rate. AYP determinations for districts and
schools are based on test participation and proficiency rate goals. These goals are evaluated
for the student groups when the minimum subgroup size has been met. AYP graduation and
attendance goals are evaluated for the "All Students" group only. Failure to meet any of the
proficiency or participation goals, attendance levels or graduation targets results in the district
or school not meeting AYP.
Title I, Sections 1111(h)(2) and 1116(a)(1)(C) of ESEA (20 USC 6311(h)(2) and 6316(a)
(1)(C)) and 34 CFR Sections 200.36 through 200.38 also require each school district that
receives Title I, Part A funds prepare and disseminate to all schools in the district, and to all
parents of students attending those schools, an annual district-wide report card that, among
other things, includes the number, names, and percentage of schools identified for school
improvement and how long the schools have been so identified.
Districts and schools that do not make AYP for two or more years in a row move into District
Improvement or School Improvement status. Once they are in improvement status, districts
and schools receive support and intervention and are subject to consequences. Districts and
schools in improvement status must develop an improvement plan and keep parents informed
of their efforts. Consequences escalate the longer a district or school is in improvement status,
and range from using Title I funds to offer school choice, provide transportation to students
electing to attend another school, and arrange for supplemental services, such as tutoring for
students (Title I funded schools only), to restructuring of the school or district governance.
Schools must identify for school improvement any school that fails to make AYP, as defined
by ODE, for two or more consecutive school years. In identifying a school for improvement,
ODE may base identification on whether the school did not make AYP because it did not
meet (1) ODE's annual measurable objectives for the subject or (2) the same other academic
indicator for two consecutive years.
The AYP calculations are applied separately to each school within a district and the district
itself. The AYP determination for the district is not dependent on the AYP status of each of
the schools (e.g., School A met AYP and School B met AYP, so the district met AYP). Instead
the calculations are applied again to district-level data (e.g., School A had 20 out of 50 students
who were proficient or above and School B had 35 out of 60 students who were proficient or
above, so the District had 55 out of 110 students who were proficient or above). Therefore, it
is possible for schools within a district to meet AYP while the district itself fails to meet AYP.
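The district roll-up described above can be illustrated with the report's own example figures. The code below is a sketch only; the function name is an assumption, not ODE's terminology.

```python
# Sketch of the district-level AYP roll-up: school counts are pooled and
# the proficiency calculation is applied again to the combined counts,
# rather than averaging the schools' individual AYP outcomes.

def proficiency_rate(proficient: int, tested: int) -> float:
    return proficient / tested

# School-level counts from the report's example: (proficient, tested)
school_a = (20, 50)
school_b = (35, 60)

district_proficient = school_a[0] + school_b[0]   # 55
district_tested = school_a[1] + school_b[1]       # 110
print(proficiency_rate(district_proficient, district_tested))  # 0.5
```

Because the counts are pooled, a large low-performing school can pull the district below a threshold even when every school individually met AYP.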
A school or district can miss AYP and earn Excellent or Effective designations for only
two consecutive years. With the third year of missing AYP, the school or district designation
drops to Continuous Improvement, at which point the school district must take corrective
measures including, but not limited to, restructuring.
Where Kids Count
Every school year, thousands of students change schools for a variety of reasons. While
families living in poverty have the highest mobility rates, foster children and children in military
families also move frequently. Mobility can negatively affect a student's learning, achievement,
social supports, and physical and mental health. Since schools are graded based on student
achievement, attendance and graduation, a key question for the accountability system is:
which school do mobile students belong to for scoring purposes?
This question is actually a series of questions and is more complex than it might at first
appear. The answers are governed by the Where Kids Count (WKC) rules. The Full Academic
Year rule is a specific WKC rule that states how long a student must be enrolled in a school or
district for their test score to count toward that entity.
Students who count toward a resident district or school designation under Ohio's accountability
system are those who:
• Met the full academic year criterion (i.e., the student was enrolled and funded during
the October funding count week and continuously enrolled through the spring test
administration).
• Attended a JVSD, ESC, or Postsecondary Institution and met the Full Academic Year
criteria at the district level.
• Enrolled in a special education cooperative program educated at another district and
met the Full Academic Year criteria at the educating district.
However, as described earlier in this report, students do not always count at the school in
which they are enrolled. Students that are court- or parent-placed into an institution within
the district or State school will not count at the school or district level. Students that only
receive services from a district do not count in the accountability calculations for the reporting
district or school. Examples of a student who only receives services would be one who
participates in latchkey programs or a student that is not enrolled but receives career-technical
evaluation services.
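The Full Academic Year criterion described above can be sketched as a date-span check. This is a simplified illustration only: the specific dates are hypothetical, and the sketch assumes a single continuous enrollment span, whereas the actual rule also accounts for breaks in enrollment.

```python
# Simplified sketch of the Full Academic Year (FAY) criterion: a student
# counts only if enrolled during the October count week and continuously
# enrolled through the spring test administration.
from datetime import date

def meets_fay(enrolled_from: date, enrolled_to: date,
              count_week_start: date, test_date: date) -> bool:
    """True if one continuous enrollment span covers the October count
    week through the spring test administration (simplifying assumption:
    no intervening breaks in enrollment)."""
    return enrolled_from <= count_week_start and enrolled_to >= test_date

count_week = date(2010, 10, 4)   # hypothetical October count week start
spring_test = date(2011, 4, 25)  # hypothetical spring test date

# Enrolled before the count week and through testing: counts.
print(meets_fay(date(2010, 8, 25), date(2011, 6, 3), count_week, spring_test))  # True
# Enrolled after the count week: does not meet FAY.
print(meets_fay(date(2010, 11, 15), date(2011, 6, 3), count_week, spring_test))  # False
```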
Flexibility Waiver
Ohio's accountability system, which had previously focused on districts and a certain set of
performance indicators, was modified so that Ohio could meet Federal accountability
requirements due to NCLB. By the 2002-03 school year, ODE labeled both schools and districts as
Excellent, Effective, Continuous Improvement, Academic Watch, or Academic Emergency
based on a new set of indicators. Ohio has modified its accountability system since then, adding
new performance indicators and changing the formula for assigning school performance
designations. In recent years Ohio has complicated the system further with rewards and
sanctions based on its own accountability designations, and the State received certain Federal
exemptions related to AYP sanctions. Nevertheless, the NCLB's AYP requirements arguably
had the greatest influence on performance ratings and imposed the greatest potential
administrative sanctions.
For the 2010-11 school year, Ohio was operating under a flexibility agreement with the U.S.
Department of Education (USDOE) pursuant to Section 9401 of the Federal Elementary and
Secondary Education Act (ESEA). This agreement permitted Ohio to include its differentiated
accountability model as part of its system of interventions through the 2011-12 school
year, unless reauthorization of the ESEA changes the requirements on which Ohio's model is
based. As part of this flexibility agreement, Ohio had to agree to certain conditions detailed in
the USDOE August 2008 Condition Letter. Despite this waiver, however, student attendance
and enrollment remained an integral part of Ohio's accountability system and the local report
cards.
Additionally, on September 23, 2011, USDOE offered each state the opportunity to request
flexibility on behalf of itself, its local education agencies, and its schools regarding specific
ESEA requirements, including certain Title I, Part A requirements, pursuant to authority
in Section 9401 of the ESEA (20 USC 7861), which allows the Secretary of Education to
waive, with certain exceptions, statutory and regulatory requirements of the ESEA. USDOE
approved Ohio's ESEA Flexibility Waiver request in June 2012. The Ohio ESEA Flexibility
Waiver has a conditional approval and took effect for the 2012-2013 school year. Ohio
must submit an amended request with the final administrative rules for the A-F school grading
system to USDOE by June 30, 2013 in order to continue to receive ESEA Flexibility. It is
important to note, however, that if Congress reauthorizes ESEA between now and the 2014-
2015 school year, the reauthorized law would take priority over Ohio's waiver.
Under the 2012-2013 ESEA Flexibility Waiver, districts will have flexibility from sanctions and
reporting requirements previously mandated in ESEA. In order to receive this flexibility, Ohio
has agreed to adopt college-and-career-ready expectations, dedicate more resources to closing
sub-group achievement gaps and implement an evaluation system that will support effective
instruction and leadership including, but not limited to:
• Implementation of rigorous standards, assessments and principal and teacher evaluations;
• Replacement of the Adequate Yearly Progress (AYP) measure, which had the unrealistic
goal of 100 percent proficiency for reading and mathematics for every student in every
demographic group. The new measures include rigorous, but realistic, objectives that
aim to cut the achievement gap in reading and mathematics by half over six years,
while requiring higher performance from all students;
• Changing the existing rating of schools to an A-F letter-grade system that will be easier
to understand and give a realistic picture of school performance. The system and formula
will officially begin with the report cards released in August 2013;
• Freeing schools from some reporting requirements and giving them greater flexibility in
their use of Federal funds for professional development and other purposes.
5. OVERVIEW OF STATEWIDE STUDENT IDENTIFIER
The Statewide Student Identifier (SSID) System is the cornerstone of ODE's student-level
Education Management Information System (EMIS), a statewide data collection system for
Ohio's primary and secondary education, including demographic, attendance, course information,
financial data and test results. The SSID System assigns a unique identifier to every
student receiving services from Ohio's public schools. This code will follow students as they
move within and between Ohio districts, enabling studies of student progress and performance
trends over time. The system has the following functions:
• Prevents the identification of actual student names, social security numbers, or other
personal data that could breach individual confidentiality.
• Stores matching data and associated student identifier code throughout the course of
each child's education.
• Facilitates assignment of individual SSIDs or mass assignment of SSIDs through batch
processing or an online web service.
The Federal Family Educational Rights and Privacy Act (FERPA), 20 U.S.C. 1232(g), and Ohio
Rev. Code 3301.0714 give guidance regarding proper and improper practice for records
maintenance and transfer.
Ohio law restricts ODE access to certain personally identifiable student information. ORC
3301.0714 states, "the guidelines shall prohibit the reporting under this section of a student's
name, address, and social security number to the state board of education or the department
of education." The SSID System does not replace a district's student information system
software, nor is it the entirety of the student-level EMIS. It is a duplicative system designed to
connect the district's student software system to ODE's student-level EMIS database. Pursuant
to the aforementioned Ohio law, ODE uses only the SSID, in lieu of personally identifiable
student information, for EMIS reporting purposes to protect the privacy of student
records. Only school districts can access the crosswalk that links personally identifiable student
information to the SSID reported to ODE in EMIS. In addition to the complications
noted herein, Ohio's system creates duplicative costs that have been reported in this office's
separate, ongoing performance audit of ODE.
Per Ohio Revised Code 3313.672, school districts are required to obtain reliable identification
from parents upon enrollment in public schools. This can be obtained from birth certificates,
passports, or immigration forms, for example. Ohio Revised Code 3301.0714(D)(2)
further provides the following guidance:

Each school district shall ensure that the data verification code is included in the
student's records reported to any subsequent school district or community school in
which the student enrolls and shall remove all references to the code in any records
retained in the district or school that pertain to any student no longer enrolled. Any
such subsequent district or school shall utilize the same identifier in its reporting of
data under this section.
ODE provides verification reports to districts that will assist in determining whether two
students have been assigned the same SSID. These reports will specify whether SSIDs are
missing, invalid, or have potentially been used for multiple students.
The only reason to delete an SSID is if it is proven to be a duplicate SSID. If a student moves
out of state, transfers to a private school, dies, withdraws or graduates, the SSID should not
be deleted. Generally, a record deletion actually deactivates the SSID in the production
SSID database so that it can no longer be used. ODE cautions school districts that unless
the deletion is conducted as part of a system-wide duplicate clean-up process, school districts
should confer with other reporting entities using different SSIDs for the same student prior to
making the deletion. If a deletion is conducted in error, school districts may contact IBM for
assistance in re-activating the record.
6. BREAKING ENROLLMENT
The school report card performance measures, and the rewards and sanctions, associated with
Ohio's accountability system have changed over time. The incentives to create attendance
breaks have generally increased over time as the consequences for poor performance became
more severe.
As used throughout this report, the term "scrubbing" entails withdrawing students without
proper documentation or justification. Such withdrawals are referred to as attendance scrubbing
because they enable a school to remove or "scrub" a student's poor attendance record.
Another implication of withdrawing students is that their educational records do not count
when calculating school performance for Ohio's accountability system; that is, their educational
records are rolled up to the state level for accountability purposes. Because student
achievement and attendance are highly correlated, schools that withdraw students with frequent
absences should benefit in terms of higher reported proficiency scores, whether or not
students are withdrawn because of their low scores on State tests.
Strategies for predicting scrubbing could entail, for example, identifying schools that just
attained a designation based on the performance index, the number of indicators met, the
number of students in a particular subgroup, or the value-added score. Schools that might
have the greatest incentive to scrub their data are those that stand to nearly miss a higher
designation. Due to the complexity and evolution of Ohio's accountability system, however,
identifying schools that just missed a lower designation is perhaps an exceedingly time-intensive
task with uncertain benefits. As described earlier in this report, the sheer complexity of
the accountability system created incentives for all schools and districts to improve indicators
such as attendance, proficiency, and graduation rates, as any positive change on these measures
could prove pivotal in moving from one AYP designation to another.
The process of creating breaks in enrollment entails admitting or withdrawing students after
the official October Average Daily Membership (ADM) count week. The following are valid
reasons to create a break in enrollment pursuant to Chapter 2 of the 2011 ODE EMIS
Manual:
Code  Reason
36    Withdrew from Preschool; Preschool student has withdrawn from the
      preschool program (for any reason)
37    Withdrew from Kindergarten; Kindergarten student has withdrawn because
      it has been deemed to be in the best interest of the student if
      he/she waits one more year until starting his/her kindergarten
      experience; may only be used by students in kindergarten.
40    Transferred to Another School District Outside of Ohio; Transcript
      request on file.
41    Transferred to Another Ohio School District; Local, Exempted Village,
      or City, transcript request on file.
42    Transferred to a Private School; Transcript request on file, i.e.,
      EdChoice students.
43    Transferred to Home Schooling; Superintendent's approval on file.
45    Transferred by Court Order/Adjudication; If Court has designated a
      public district other than yours as district responsible for paying
      for the education. The resident district should not withdraw ANY
      students placed into the Department of Youth Services.
46    Transferred out of the United States
With regard to truancy, according to the Ohio Rev. Code, schools are permitted to withdraw
students only after appropriate due process. The statutes provide several procedural steps
which schools must follow in dealing with violations of the compulsory attendance laws.
Ohio Rev. Code 3321.19 and 3321.20 require schools to give prior warning of the legal
consequences of truancy to the parent or guardian of the truant child. When any child of
compulsory school age is not attending school and is not properly excused from attendance,
the school must notify the parent or guardian, who must thereafter cause the child to attend
the proper school (Ohio Rev. Code 3321.19).
Special provisions of the law apply to any student who is considered to be either a "habitually
truant" or a "chronic truant." Ohio Rev. Code 2151.011 defines "habitual truant" as a
school-age child who is absent from school without legitimate excuse for five or more consecutive
days, seven or more days in a school month, or 12 or more school days in a school year.
Ohio Rev. Code 3313.62 defines a school month as four school weeks. Ohio Rev. Code
2151.011 and 2152.02 define a "chronic truant" as a school-age child who is absent from
school without legitimate excuse for seven or more consecutive days, ten or more days in a
school month, or 15 or more days in a school year.
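The statutory thresholds above can be expressed directly as a classification helper. This sketch is illustrative, not statutory language; the function name and return strings are assumptions.

```python
# Illustrative classification using the thresholds in Ohio Rev. Code
# 2151.011 and 2152.02: habitual truant = 5+ consecutive days, 7+ days
# in a school month, or 12+ days in a school year without legitimate
# excuse; chronic truant = 7+ consecutive, 10+ per month, or 15+ per year.

def truancy_status(consecutive: int, in_month: int, in_year: int) -> str:
    """Classify unexcused-absence counts; the stricter category wins."""
    if consecutive >= 7 or in_month >= 10 or in_year >= 15:
        return "chronic truant"
    if consecutive >= 5 or in_month >= 7 or in_year >= 12:
        return "habitual truant"
    return "not truant under these definitions"

print(truancy_status(6, 0, 6))    # habitual truant (5+ consecutive days)
print(truancy_status(0, 0, 15))   # chronic truant (15+ days in a year)
```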
If a parent, guardian, or other custodian of a habitual truant fails to cause the child's attendance
at school, the board of education may proceed with an intervention strategy in accordance
with its adopted policy, may initiate delinquency proceedings, or both (Ohio Rev. Code
3321.19). Each board is required under Ohio Rev. Code 3321.191 to adopt a policy to
guide employees in addressing and ameliorating the habitual truancy of students. If the board
has established an alternative school, assignment to the alternative school must be included in
the policy as an intervention strategy.
Ohio Rev. Code 3321.19 requires that upon the failure of the parent, guardian, or other person
having care of the child to cause the child's attendance at school, if the child is considered
a habitual truant, the board of education of the school district or the governing board of the
educational service center shall do either or both of the following:
Code  Reason
47    Withdrew Pursuant to Yoder vs. Wisconsin
48    Expelled
51    Verified Medical Reasons; Doctor's authorization on file.
52    Death
71    Withdraw Due to Truancy/Nonattendance
72    Pursued Employment/Work Permit; Superintendent approval on file.
73    Over 18 Years of Age
74    Moved; Not known to be continuing.
75    Student Completed Course Requirements but did NOT pass the appropriate
      statewide assessments required for graduation. In the case of a
      student on an IEP who has been excused from the individual
      consequences of the statewide assessments, using this code indicates
      that the student completed course requirements but did not take the
      appropriate statewide assessments required for graduation.
99    Completed High School Graduation Requirements; Student completed
      course requirements and passed the appropriate statewide assessments
      required for high school graduation. In the case of a student on an
      IEP who has been excused from the individual consequences of the
      statewide assessments, using this code indicates that the student
      completed course requirements and took the appropriate statewide
      assessments required for high school graduation.
1. Take any appropriate action as an intervention strategy contained in the policy developed
by the board pursuant to Section 3321.191 of the Revised Code;
2. File a complaint in the juvenile court of the county in which the child has a residence
or legal settlement or in which the child is supposed to attend school jointly
against the child and the parent, guardian, or other person having care of the child. A
complaint filed in the juvenile court under this division shall allege that the child is an
unruly child for being an habitual truant or is a delinquent child for being an habitual
truant who previously has been adjudicated an unruly child for being an habitual truant,
and that the parent, guardian, or other person having care of the child has violated
Section 3321.38 of the Revised Code.
Upon the failure of the parent, guardian, or other person having care of the child to cause the
child's attendance at school, if the child is considered a chronic truant, the board of education
of the school district or the governing board of the educational service center shall file a complaint
in the juvenile court of the county in which the child has a residence or legal settlement
or in which the child is supposed to attend school jointly against the child and the parent,
guardian, or other person having care of the child. A complaint filed in the juvenile court
under this division shall allege that the child is a delinquent child for being a chronic truant
and that the parent, guardian, or other person having care of the child has violated Section
3321.38 of the Revised Code.
Attendance and student performance are highly correlated.3 Schools that withdraw students
with frequent absences should therefore benefit in terms of higher reported proficiency
scores, whether or not students are withdrawn because of their low scores on State tests.
The performance measures and the rewards and sanctions associated with Ohio's accountability
system have changed over time. As we describe above, the incentives to withdraw
students with frequent absences or low test scores likely increased over time, as the consequences
for poor performance became more severe. Moreover, the students whose attendance
records schools and districts might have targeted also changed over time. For example,
NCLB increased the stakes of school-level performance as well as the performance of student
subgroups. Schools that had too few students belonging to a student subgroup (less than 30
students) were not held accountable for that subgroup's achievement for the purpose of AYP
calculations. Thus, withdrawing just a few students from a low-achieving subgroup, just
enough to drop the student count below 30, could allow a school to avoid serious administrative
consequences. Because NCLB's AYP focused on reading and mathematics test results,
schools and districts had especially strong incentives to withdraw students who scored poorly
(or were expected to score poorly) on those tests.
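The minimum subgroup size rule described above is a simple cliff, which is what makes it gameable. The sketch below is illustrative; the constant comes from the text, but the function name is an assumption.

```python
# Sketch of the minimum subgroup size rule: subgroups with fewer than 30
# tested students were excluded from AYP accountability calculations.

MIN_SUBGROUP_SIZE = 30

def subgroup_accountable(student_count: int) -> bool:
    """True if the subgroup is large enough to count toward AYP."""
    return student_count >= MIN_SUBGROUP_SIZE

# A subgroup of 31 students counts toward AYP; withdrawing two students
# drops it to 29 and removes the subgroup from the calculation entirely.
print(subgroup_accountable(31))  # True
print(subgroup_accountable(29))  # False
```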
It also is important to understand that the vast majority of schools and districts potentially
stood to gain by improving their test and attendance outcomes, regardless of demographic
3 References:
Roby, Douglas E. "Research on School Attendance and Student Achievement: A Study of Ohio Schools."
Educational Research Quarterly, available at http://www.eric.ed.gov/PDFS/EJ714746.pdf
Gottfried, Michael A. "Evaluating the Relationship Between Student Attendance and Achievement in
Urban Elementary and Middle Schools: An Instrumental Variables Approach." American Educational
Research Journal, available at http://69.8.231.237/uploadedFiles/Divisions/School_Evaluation_
and_Program_Development_(H)/Awards/Cat_2_GOTTFRIED_ONLINE_FIRST.pdf
Lamdin, Douglas J. "Evidence of Student Attendance as an Independent Variable in Education Production
Functions." The Journal of Educational Research, available at http://www.gb.nrao.edu/~sheather/new%20lit/ContentServer.pdf
characteristics and achievement levels. Ohio's mechanism for scoring school performance
provided a number of (fairly complicated) ways of reaching various publicized designations. As a
result, from a school or district perspective, improvement on any report card indicator could
be pivotal (e.g., in demonstrating the type of improvement associated with NCLB's safe harbor
provision, schools and districts could avoid having to meet a proficiency level if sufficient
improvement was shown). And there have been rewards and sanctions associated with each
of these potential designations, ranging from public shaming and levy problems to State and
Federal rewards and sanctions.
Thus, there are three general features of the accountability system to emphasize. First, the
incentives to scrub attendance data generally increased over time. Second, the sheer complexity
of the system meant that any attendance scrubbing could be seen as potentially pivotal in
reaching important performance thresholds, regardless of a school's demographic and educational
characteristics. Third, school personnel need not be particularly calculating to benefit
from withdrawing students with poor attendance or poor academic performance. Withdrawing
a student with frequent absences, for example, has always stood to improve a school's
designation, especially as the complexity of determining Ohio's performance ratings, as well
as the stakes of these ratings, have increased.
7. SUPPORT ROLES IN ACCOUNTABILITY
Role of ODE
Pursuant to Ohio's organizational structure, ODE should ensure compliance with statewide
policy by outlining accountability and other requirements of Federal and State laws so that the
State, districts, schools, and school boards can incorporate these requirements into their family
involvement policies. In this role, ODE should communicate policy to districts, schools,
school boards and stakeholder groups; monitor districts for compliance; and provide support
and infrastructure for continued implementation of Federal and State family and community
engagement policies.
ODE also provides expert technical assistance and support to facilitate the development and
continuous improvement of programs for school, family and community partnerships.
As described in ODE's Recommended Roles and Responsibilities for Supporting School, Family,
and Community Partnerships, ODE should:
• Provide adequate staff to monitor compliance with Federal and State laws and policies;
• Secure adequate funding for supporting State-level goals and provide guidance for
district allocation of funding;
• Allocate funds for staff to develop tools and resources, and to conduct compliance
reviews; and
• Provide guidance to districts in the use of Federal entitlement funds, State funds and
other funding sources available for supporting school, family and community partnerships.
As described earlier in this report, EMIS is ODE's primary system for collecting student,
staff, course, program, and financial data from Ohio's public schools. The data collected via
EMIS are used to determine both State and Federal performance accountability designations,
produce the local report cards, calculate and administer State funding to school districts,
determine certain Federal funding allocations, and meet Federal reporting requirements. The
data collected through EMIS provide the foundation for Ohio's soon-to-be-developed P-20
Statewide Longitudinal Data System, intended to meet all of the America COMPETES Act
elements. Also, ODE launched a newly redesigned EMIS system (EMIS-R) in January 2012.
EMIS-R is intended to provide enhanced system functionality that will improve the timeliness
and quality of the data while simplifying the process.
Role of Information Technology Centers and Student Information System Vendors
There are 23 governmental computer service organizations serving more than 1,000 educational
entities and 1.8 million students in the State of Ohio. These organizations, known as
Information Technology Centers (ITCs), and their users make up the Ohio Education Computer
Network (OECN) authorized pursuant to 3301.075 of the Revised Code.
ITCs provide information technology services to school districts, community schools, joint
vocational schools (JVS)/career & technical, educational service centers (ESCs) and parochial
schools; however, not all schools subscribe to the same services. Therefore software applications
can vary between schools, even if they are members of the same ITC.
As noted earlier, not all schools use an ITC. Typically larger school districts, such as Columbus
CSD and Cleveland MCSD, maintain their own in-house data centers.
Schools use Student Information System (SIS) software applications to electronically manage
student data. There are approximately 26 different SIS applications developed by various
vendors used by schools in the State of Ohio. SIS applications are sometimes distributed by an
ITC, but not always. Some schools contract with a vendor directly to obtain a SIS application
or develop their own SIS in house. SIS applications are used to electronically store information
related to:
• Student demographics
• Student scheduling
• Student attendance
• Student registration/enrollment
• Student withdrawal
• Student grades
• Student test scores
8. USE OF REPORTS AND OTHER DATA SOURCES
To complete this report, auditors gathered and assessed data from the selected school districts and conducted interviews with USDOE, ODE, ITCs, SIS vendors, and district personnel.
Data from external sources, such as the SIS vendors, were not examined for reliability.
Auditors also used the following governing sources to assist in our review:
• Federal Family Educational Rights and Privacy Act (FERPA), 20 U.S.C. 1232(g)
• Individuals with Disabilities Education Act (IDEA) (Pub. L. No. 108-446; 20 USC
1400 et seq.)
• No Child Left Behind Act of 2001 (amending Title I, Part A, Elementary and Secondary
Education Act (ESEA), 20 USC 6301 through 6339 and 6571 through 6578)
• American Recovery and Reinvestment Act (ARRA)
• Title I program regulations at 34 CFR part 200
• 2011 OMB Compliance Supplement
• The Education Department General Administrative Regulations (EDGAR) at 34 CFR
parts 76, 77, 81, 82, 98, and 99
• Certain requirements of 34 CFR part 299 (General Provisions)
• ODE 2011 EMIS Manual
• Ohio Revised Code
9. METHODOLOGY
Report card data is submitted to ODE by each school district. The report card data is
filtered through a special set of ODE business rules used to get the most accurate data for
the accountability calculations. For example, the FAY rule limits the set of students whose
data is used in the proficiency calculations to those who have been in the school or district
the majority of the year. In most schools and districts, this is a subset of the students that are
actually enrolled on testing day. When trying to show the instructional effectiveness of a school
or district, it makes sense to limit the population to those students who were actually in the
school or district the majority of the year. Many other ODE business rules are also applied to
get the data that best represent what is happening in each school and district.
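A full-academic-year screen of this kind can be sketched in a few lines. This is a simplified illustration only, not ODE's actual business rule: the field names, the roster structure, and the 180-day year with a simple majority threshold are all assumptions.

```python
# Hypothetical sketch of a full-academic-year (FAY) style screen:
# keep only students enrolled for the majority of instructional days.
def fay_students(roster, days_in_year=180):
    """roster: list of dicts with 'id' and 'days_enrolled' keys."""
    return [s for s in roster if s["days_enrolled"] > days_in_year / 2]

roster = [
    {"id": "A", "days_enrolled": 175},  # enrolled nearly all year -> kept
    {"id": "B", "days_enrolled": 60},   # mid-year arrival -> excluded
]
kept = fay_students(roster)
```

Under this sketch, student B, enrolled for only 60 of 180 days, would be excluded from the proficiency calculation even if present on testing day.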
The data on a school or district's report card is reported to ODE through EMIS (Education
Management Information System) by the district's EMIS coordinator over a series of reporting
periods throughout the year. ODE does not require school districts in Ohio to utilize any
particular SIS, nor does ODE establish minimum requirements for SIS. There are several
SIS vendors throughout the State. The majority of data for the local report cards is submitted
over the course of eight weeks during the summer. The data is extracted from the school's student information
system (SIS) and sent to ODE through the school district's Information Technology Center (ITC), or the district's own data center if it does not have a contracted service
agreement with an ITC. New data can be sent each week if districts choose. Each week
following data submission, a series of data verification reports are sent from ODE to district
EMIS coordinators and ITCs. These reports are intended to help EMIS coordinators and
ITCs ensure that the data was uploaded accurately and successfully. However, in practice, because
the projections in the Secure Data Center show a school's and district's designations
without the value-added component, which can only improve a school's or district's designation,
these reports provide schools and districts with incentive and opportunity to scrub their attendance
and enrollment data submissions to improve report card results.
Amid these concerns and after irregular enrollment and attendance practices were discovered
in the Columbus, Toledo, and Lockland school districts, the AOS initiated a statewide analysis
of school attendance records to determine whether Ohio schools scrubbed attendance data
and whether other problems existed in the EMIS reporting process.
AOS performed the following procedures for each of the selected schools or districts:
• Reviewed schools' enrollment, attendance, and withdrawal policies and practices. Each
board is required under Ohio Rev. Code 3321.191 to adopt a policy to guide employees
in addressing and ameliorating the habitual truancy of students. For example, if the
board has established an alternative school, assignment to the alternative school must
be included in the policy as an intervention strategy.
• Traced breaks in student enrollment and other reasons for rolling the student up to the
State to supporting records to determine the reasonableness and timeliness of the information
being entered into the district's SIS. Pursuant to ODE's 2011 EMIS Manual
Chapter 2, Student Data, supporting attendance records should include, but not be
limited to:
o Notes and other verification information relative to excused absences and tardiness;
o Authorized medical excuses;
o Expulsion notifications to students and parents or guardians;
o Telephone and meeting logs describing the nature and timing of contact with
students' parents or guardians and reasons for absence;
o Notices to parents, guardians, and truancy officers demonstrating due
process under Ohio Rev. Code 3321.191 and the board-approved truancy
policies;
o Court and parent/guardian orders for student placement in homes or
institutions;
o Transcript requests from other school districts supporting student mobility;
o Evidence that the student completed course requirements but did not take
the appropriate statewide assessments required for graduation;
o Evidence that the student is 18 years old and no longer under the purview
of the Compulsory Education Act; and
o Other source documents such as lists of Limited English Proficient (LEP) students, students in open enrollment, students attending classes at an Educational
Service Center (ESC), Career Technical Planning District (CTPD),
or Joint Vocational School (JVS), and students enrolled in Post-Secondary
Enrollment Options (PSEO).
All excuses from parents, and other documents, regardless of format or condition, become official
attendance records. Ohio Rev. Code 3317.031 requires that the "membership record shall
be kept intact for at least five years and shall be made available to the State Board of Education
or its representative in making an audit of the average daily membership or the transportation
of the district." "Membership record" encompasses much more than just attendance records.
As defined in statute, it includes: name, date of birth, name of parent, date entered school,
date withdrawn from school, days present, days absent, and the number of days school was
open for instruction while the pupil was enrolled.
9.1. JOHN GLENN SCHOOL OF PUBLIC AFFAIRS,
THE OHIO STATE UNIVERSITY
In conducting this audit, the AOS worked extensively with The John Glenn School of Public
Affairs at The Ohio State University to develop statistical procedures and data management
strategies in support of audit goals. The AOS conducted its testing of student attendance data and accountability in three phases, as described below:
Phase One
AOS reported on Phase One in its Interim Report on Student Attendance Data and Accountability
System dated October 4, 2012. For this first phase, AOS initially selected 100 schools with
the highest number of students that took the State assessments and whose test scores were
subsequently rolled up to the State based on a break in enrollment or change in the WKC.
However, AOS noted two districts, Columbus City School District and Cleveland Municipal
City School District, had a large number of schools included in the initial selection. In an
effort to achieve more diverse coverage in Ohio schools selected for initial testing, AOS narrowed
the schools in the Columbus CSD and Cleveland MCSD to only ten and 15 schools,
respectively, based on the schools with the greatest number of students rolled up to the State's
report card. Furthermore, AOS selected an additional 28 school districts to include in its
testing sample. The goal of the first phase of testing was to obtain a general understanding of
how the EMIS system operates and how schools might use breaks in enrollment to improve
report card results. The data collected from this testing was used in later phases to determine
the most effective and efficient testing approach.
Phase Two
AOS reported on Phase Two in its Interim Report on Student Attendance Data and Accountability
System dated October 23, 2012. The goal of the student attendance reviews was to ensure compliance with Ohio's accountability system. Obviously, no matter how competent
the auditor or how sophisticated the school's student information system and enrollment
processes, reviewing each student's enrollment documentation for all schools is a physical
impossibility. Even if 100 percent of Ohio's tested students rolled up to the State report card
could be examined, the cost of testing would likely exceed the expected benefits (the assurance
that accompanies examining 100 percent of the total) to be derived. The cost per student
file examined was approximately $30 as of October 23, 2012. Because of this cost-benefit
challenge, AOS applied widely utilized sampling techniques, discussed below, and contracted
with The Ohio State University (OSU) for expert statistical consulting services in an attempt
to develop meaningful statistical predictors for the balance of its work.
The Ohio State University's Statistical Analysis
AOS requested that OSU balance two goals: 1) the identification of schools that are more
likely than others to be scrubbing attendance data, and 2) the generation of a data set that aids
in uncovering statistical predictors of scrubbing. To achieve these goals, OSU performed the
following:
• Reviewed key features of Ohio's accountability system and the associated incentives for
scrubbing attendance data;
• Identified some school and district data that AOS might consider in selecting schools
to examine;
• Provided details of a sampling procedure that AOS used to identify schools that are
more likely to engage in data scrubbing and to facilitate the identification of predictors of
attendance scrubbing.
Due to AOS time and resource constraints, the OSU-recommended sampling procedure
emphasized the identification and analysis of publicly available data that were relatively easy
to gather and that provided sorting information that could be valuable in light of the incentives
introduced by Ohio's accountability system. Specifically, OSU recommended identifying
schools with unusually large changes in their reported attendance and mathematics proficiency
rates between the 1995-1996 and 2010-2011 school years, as well as those with an unusually
large proportion of their students whose scores were rolled up to the State level during the
2010-2011 school year, through comparisons with similar schools (in terms of tested grade
levels and district demographic characteristics). Thus, the statistically rigorous OSU strategy
entailed identifying schools that had unusual roll-up rates and unusual gains in their attendance
and proficiency rates.
AOS used this ranking to select a sample of schools with levies on the November ballot for
Phase Two of the student attendance testing. AOS excluded schools previously examined in the first phase of the attendance review from the second-phase levy schools sample. As a result,
AOS examined 81 schools from 47 school districts out of a statewide total of 184 school
districts with levies on the November ballot.
This strategy had a number of advantages over alternative (perhaps more involved) ones.
First, given the incentives of Ohio's accountability system, the math, attendance, and roll-up
measures were expected to help identify schools that were scrubbing data for the purpose of
improving reported performance statistics. Second, focusing on within-school performance
changes over time, as well as characterizing the unusualness of school performance with comparisons
to schools in similar districts and with similar tested grades, helped stratify the sample
of schools so that it was representative of Ohio's diversity. This second feature was important
for generating a school-level dataset that helped AOS identify statistical predictors of scrubbing
to be used in the final phase of the examination. Last but not least, the timely examination
of schools with levies on the November ballot aided the public in making informed
voting decisions.
The Ohio State University's Recommendation for Identifying Unusual Roll-up Rates
AOS selected the first 100 schools to examine in Phase One based, in large part, on their
2010-11 school year withdrawal rates for tested students. Specifically, AOS identified the
percentage of tested students whose scores were rolled up to the State level due to the student
being withdrawn. This indicator is closely tied to the attendance scrubbing practices that are
the focus of the examination. Given the goals of the AOS school sampling strategy for Phase
Two (described above), OSU recommended AOS identify schools with unusual roll-up rates
compared to other schools serving similar grades (i.e., elementary, middle, and high schools)
and that reside in similar districts (as per ODE's seven-category district typology). OSU and
AOS expected this strategy to help account for the correlation between student mobility and
school and district types.
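A comparison of this kind can be sketched as a within-stratum standardization: group schools by grade band and district typology, then flag those whose roll-up rate sits far above the peer-group mean. This is only an illustrative sketch of the idea; the field names, the threshold, and the use of a simple z-score are assumptions, not OSU's actual procedure.

```python
from collections import defaultdict
from statistics import mean, stdev

# Illustrative sketch: flag schools whose roll-up rate is unusually high
# relative to peers with the same grade band and district typology.
def flag_unusual_rollups(schools, threshold=2.0):
    """schools: dicts with 'name', 'grade_band', 'district_type', and
    'rollup_rate' (share of tested students rolled up to the State)."""
    groups = defaultdict(list)
    for s in schools:
        groups[(s["grade_band"], s["district_type"])].append(s)
    flagged = []
    for peers in groups.values():
        rates = [s["rollup_rate"] for s in peers]
        if len(rates) < 2:
            continue  # cannot standardize a single-school group
        mu, sigma = mean(rates), stdev(rates)
        for s in peers:
            if sigma > 0 and (s["rollup_rate"] - mu) / sigma > threshold:
                flagged.append(s["name"])
    return flagged

sample = [
    {"name": "School 1", "grade_band": "elem", "district_type": "urban", "rollup_rate": 0.02},
    {"name": "School 2", "grade_band": "elem", "district_type": "urban", "rollup_rate": 0.03},
    {"name": "School 3", "grade_band": "elem", "district_type": "urban", "rollup_rate": 0.02},
    {"name": "School 4", "grade_band": "elem", "district_type": "urban", "rollup_rate": 0.03},
    {"name": "School 5", "grade_band": "elem", "district_type": "urban", "rollup_rate": 0.02},
    {"name": "Outlier", "grade_band": "elem", "district_type": "urban", "rollup_rate": 0.30},
]
flagged = flag_unusual_rollups(sample)
```

Grouping before standardizing is what keeps, say, a mobile urban high school from being compared against a stable suburban elementary school.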
The Ohio State University's Recommendation for Analyzing Relative Attendance
Rate Gains over Time
As discussed above, withdrawing students with frequent absences could enhance performance
on consequential report card indicators. Assuming that withdrawals indeed increase attendance
rates, looking for unusually large increases over time in school attendance rates is one
way of identifying schools for further study.
Although attendance rates may not have been calculated identically over time, OSU indicated
that this variability should not pose too severe a problem for AOS purposes. That is because
the quantity of interest is the relative attendance changes across schools. What is necessary
to ensure comparability is that any changes made to the attendance formula affect schools
similarly from year to year. Thus, while the absolute changes in attendance calculated may be
invalid in terms of identifying trends in attendance rates, the relative changes in attendance
rates are likely to capture the schools and districts with relatively unusual changes in attendance
rates over time.
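The point about relative versus absolute changes can be made concrete with a small sketch: if a formula change shifts every school's attendance rate by the same amount, the ranking of schools by gain is unchanged. The school names, rates, and shift below are purely hypothetical.

```python
# Illustrative sketch: rank schools by attendance-rate gain over time.
# A formula change that shifts every school equally leaves the ranking intact.
def rank_by_gain(rates_then, rates_now):
    """Both arguments map school -> attendance rate; returns school names
    sorted by gain, largest first."""
    gains = {s: rates_now[s] - rates_then[s] for s in rates_then}
    return sorted(gains, key=gains.get, reverse=True)

then = {"A": 0.90, "B": 0.95, "C": 0.88}
now = {"A": 0.91, "B": 0.94, "C": 0.97}  # school C gains nine points
order = rank_by_gain(then, now)

# A uniform two-point shift (as from a changed formula) preserves the order:
shifted = {s: r + 0.02 for s, r in now.items()}
same_order = rank_by_gain(then, shifted)
```

School C's nine-point gain puts it first in both rankings, which is why relative changes remain informative even when the absolute rates are not comparable across years.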
Another potential complication is that schools that include different grades have different student
populations. One might expect more or less significant incentives to increase attendance
rates depending on the student population at hand. The greatest gains might occur where attendance
problems are the greatest (for example, urban high schools, as opposed to suburban
elementary schools). On the other hand, attendance rates figure directly into elementary and
middle school AYP calculations, whereas the graduation rate is used in high schools. Stratifying
by school and district type and then ranking schools by attendance gains was the option OSU recommended for addressing such issues.
The Ohio State University's Recommendation for Analyzing Relative Mathematics
Proficiency Gains over Time
According to the results of the AOS Phase One examination, a potential purpose of withdrawing
students was to increase the percent of students achieving proficient designations at the
school and district levels. Student test scores are highly correlated with one another, and some
test subjects have figured more prominently in Ohio's accountability system, so OSU recommended
focusing on a single tested subject: mathematics. Mathematics has played a prominent
role in all four of Ohio's performance calculations, and the availability of mathematics
proficiency data met the requirements of the proposed analysis.
State testing has changed significantly over time. For example, mathematics tests were administered
in the 4th, 6th, 9th, and 12th grades in the late 1990s. Today, they are administered
in grades 3 through 8, as well as in grade 10. Additionally, the type of tests administered
(and the academic standards on which they are based) changed. For example, the original
proficiency tests were replaced with criterion-referenced assessments in order to comply with
changes in State and Federal law. Finally, the cut scores that identify student proficiency also
were adjusted. Thus, school performance ratings may have gone up or down simply because of
changes in the testing and accountability system.
OSU felt that the changes in the cut scores and tests administered probably were not too
problematic for AOS purposes. That is because, as with the attendance rate change calculation,
the quantities of interest are the relative rate changes among schools, rather than absolute
changes. However, the variation in tested grades across schools and over time is potentially
problematic. Schools with different tested grades may have faced relatively lower or higher
proficiency bars over time simply because of changes in testing. One partial solution was to
identify tests administered in all years since the 1998-99 school year and to compare achievement
gains in schools that include the same tested grades. In particular, mathematics proficiency
rate data were available for grades 4, 6, and 10 for all years since the 1998-99 school
year. OSU recommended comparing proficiency rate changes for schools that had the same
highest tested grades (e.g., compare 4th grade mathematics proficiency gains for schools whose
highest of the three listed grades is the 4th grade), as withdrawing students is more likely to
pay dividends as schools deal with students in higher grades.
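A grouping of this kind can be sketched as follows. The school names, rates, and field names are hypothetical; the sketch only illustrates the idea of comparing proficiency gains within groups of schools that share the same highest tested grade.

```python
from collections import defaultdict

# Illustrative sketch: compare math-proficiency gains only among schools
# sharing the same highest tested grade (here, grade 4, 6, or 10).
def gains_by_highest_grade(schools):
    """schools: dicts with 'name', 'highest_grade', 'rate_early',
    'rate_late'. Returns grade -> [(name, gain)], largest gain first."""
    groups = defaultdict(list)
    for s in schools:
        groups[s["highest_grade"]].append(
            (s["name"], s["rate_late"] - s["rate_early"]))
    return {g: sorted(pairs, key=lambda p: p[1], reverse=True)
            for g, pairs in groups.items()}

schools = [
    {"name": "Elem 1", "highest_grade": 4, "rate_early": 0.60, "rate_late": 0.75},
    {"name": "Elem 2", "highest_grade": 4, "rate_early": 0.55, "rate_late": 0.62},
    {"name": "High 1", "highest_grade": 10, "rate_early": 0.50, "rate_late": 0.80},
]
grouped = gains_by_highest_grade(schools)
```

In this toy data, Elem 1's 15-point gain is compared only against Elem 2, never against High 1, whose gain reflects a different tested grade and a different proficiency bar.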
It is worth noting that, like the attendance measure described above, examining mathematics
proficiency gains is far from a perfect strategy. Math proficiency rates are not perfect determinants
of school designations and the possibility of rewards and sanctions. In addition, schools'
varying circumstances affect the extent to which OSU and AOS can characterize proficiency
gains as unusual. The OSU-recommended school sampling strategy entails accounting for
district demographics for this reason. And, as mentioned above, looking at rate changes also
helped account for variation in school circumstances.
The Ohio State University's Recommendation for School Sampling: Generating a
Representative Sample
As described earlier, OSU recommended that examining the unusualness of changes in
schools' attendance and mathematics proficiency rates, as well as the unusualness of schools'
withdrawal rates for tested students, could help in identifying schools that scrub data in order
to improve performance on Ohio's report cards. The AOS was also interested in sampling
schools so that statistical predictors of scrubbing may be identified and valid inferences may be drawn regarding the scope of scrubbing across Ohio's diverse