
U.S. Department of Justice
Office of Justice Programs
Bureau of Justice Statistics

Criminal Justice Information Policy

Assessing Completeness and Accuracy
of Criminal History Record Systems:

Audit Guide


January 1992, NCJ-133651


U.S. Department of Justice
Office of Justice Programs
Bureau of Justice Statistics

Steven D. Dillingham, Ph.D., Director

Acknowledgments

This report was prepared by SEARCH, The National Consortium for Justice Information and Statistics, Gary L. Bush, Chairman, and Gary R. Cooper, Executive Director. The report was written by Paul L. Woodard, Senior Counsel. The project director was Sheila J. Barton, Director, Law and Policy Program. Twyla R. Cunningham, Manager, Corporate Communications, edited the report, with assistance from Seth F. Jacobs, Director, Research and Statistics Program. Jane L. Bassett, Publishing Assistant, provided layout and design assistance. The federal project monitor was Carol G. Kaplan, Chief, Federal Statistics and Information Policy Branch, Bureau of Justice Statistics.

Report of work performed under BJS Grant No. 87-BJ-CX-K079, awarded to SEARCH Group, Inc., 7311 Greenhaven Drive, Suite 145, Sacramento, California 95831. Contents of this document do not necessarily reflect the views or policies of the Bureau of Justice Statistics or the U.S. Department of Justice.

Copyright © SEARCH, The National Consortium for Justice Information and Statistics 1991.

The U.S. Department of Justice authorizes any person to reproduce, publish, translate or otherwise use all or any part of the copyrighted material in this publication with the exception of those items indicating they are copyrighted or printed by any source other than SEARCH, The National Consortium for Justice Information and Statistics.

The Assistant Attorney General is responsible for matters of administration and management with respect to the OJP agencies: the Bureau of Justice Assistance, Bureau of Justice Statistics, National Institute of Justice, Office of Juvenile Justice and Delinquency Prevention and the Office for Victims of Crime. The Assistant Attorney General further establishes policies and priorities consistent with the statutory purposes of the OJP agencies and the priorities of the Department of Justice.


Table of Contents

PART I     INTRODUCTION
   A. Overview
   B. Approach of the Audit Guide

PART II    COMPLETENESS AND ACCURACY REQUIREMENTS

PART III   IN-HOUSE ACCURACY AND COMPLETENESS CHECKS AT THE REPOSITORY
   A. Analyzing the Criminal History Database
      Goal
      Procedures
         Review Error Lists or Other Reports
         Conduct a Programmed Analysis of the Criminal History Database
         Conduct a Manual Analysis of Database Entries or Documents
   B. Comparing the Database with Externally Obtained Information
      Goal
      Procedures
         Assess Arrest Reporting Levels
         Assess Reporting by Other Agencies
   C. Comparing the Database with Stored Source Documents
      Goal
      Procedures
   D. Auditing by Mail
      Goal
      Procedures

PART IV    SITE VISITS TO REPORTING AGENCIES
   A. Selecting Agencies to be Audited
      Goal
      Procedures
   B. Selecting Sample Records
      Goal
      Procedures
         Devise a Selection Methodology
         Generate Sample Case Lists
         Select a Sample Size
   C. Finalizing the Audit Methodology
      Goal
      Procedures
         Develop Data Collection Forms
         Develop Audit Questionnaires
         Create an Audit Database
         Prepare an Audit Manual
   D. Completing Pre-Audit Planning
      Goal
      Procedures
         Develop an Audit Schedule
         Contact Agencies to be Audited
            • Initial Contacts
            • Audit Notice Letter
         Prepare Audit Folders
            • Case Folders
            • Agency Folders
   E. Conducting the Audits
      Goal
      Procedures
         Conduct an Entry Interview
         Tour the Agency’s Record Operations
         Validate Sample Records
         Locate Appropriate Files
         Review Extra-agency Papers
         Obtain Copies of Audit Documents
         Select Cases for Reverse Audit
         Conduct an Exit Interview

PART V     PREPARING AUDIT REPORTS
      Goal
      Procedures
         Describe Agency Procedures
         Present Audit Findings
         Set Out Recommendations
         Prepare Multiple Reports

FIGURES
   1  Statistical Sampling Table Excerpt
   2  Statistical Sampling Table Excerpt
   3  Statistical Sampling Table Excerpt
   4  Statistical Sampling Table Excerpt
   5  Sample Data Collection Form Excerpt
   6  Sample Data Collection Form — Arrest Information

APPENDICES
   I    Sample Audit Data Collection Form — Jurisdictional Basis
   II   Sample Audit Data Collection Forms — Local Agency Basis
           State’s Attorney Disposition
           Court Initiation and Disposition
           Custodial Receipt/Status Change
   III  Sample Audit Questionnaire
   IV   Sample Audit Notice Letter


Part I

Introduction

A. Overview

The purpose of this Audit Guide is to provide assistance to State officials and others in planning and conducting audits to assess the completeness and accuracy of the criminal history record databases maintained by State central repositories. The Guide will also help State repositories assess compliance by State and local criminal justice agencies with statutory reporting and other legal requirements. The Guide describes methods of:

(1) Auditing data quality levels of the central repository database by in-house methods and by methods that require reference to externally obtained information, such as validating sample repository records by comparison with original records of entry maintained by contributing agencies; and

(2) Auditing contributing criminal justice agencies, either as a part of an audit of the repository database or as part of an ongoing local agency audit program.

The Guide deals in detail only with data quality auditing. It does not describe specific methods of auditing compliance with other aspects of recordhandling, such as limits on dissemination, security requirements or record subject access/review procedures. However, some sections of the Guide, including those that deal with pre-audit planning, selecting agencies to be audited, scheduling and conducting on-site audits, and structuring audit reports, are applicable to any kind of criminal history record audit. Agency officials who want to develop an audit program to police compliance with all aspects of recordhandling policy should find the Guide useful in planning the overall audit methodology and designing specific audit techniques and data collection methods for auditing data quality. Audit techniques for other aspects of recordhandling can be found in other publications.[1]

B. Approach of the Audit Guide

As noted, the Guide describes audit methods that can be used to conduct an assessment of the completeness and accuracy of the State central repository database (or designated segments of that database), as well as methods for auditing selected State and local agencies that report information to the repository. The techniques for auditing reporting agencies can be used as part of a repository database audit, if such an audit includes local agency site visits to validate sample repository records by comparing them with original records of entry maintained at the agency level. The techniques also can be used to conduct a program of ongoing compliance audits of local agencies, as required by Federal regulations[2] and the provisions of some State laws.

Part II of the Guide describes completeness and accuracy requirements, including those set out in Federal regulations and guidelines and those established by State laws or regulations. Part III describes methods of auditing repository data quality levels by in-house analysis or by other means that do not require on-site audits of reporting agencies. Included are suggested methods of (1) conducting manual or computer analysis of the repository database; (2) comparing database entries with source documents stored at the repository; (3) comparing database entries with externally obtained case processing lists or statistical data; and (4) conducting auditing by mail. Part IV describes methods of conducting on-site audits of State and local agencies that report information to the repository. This section of the Guide sets out procedures for (1) selecting agencies to be audited; (2) selecting sample records to be validated by reference to original source documents; (3) formulating an audit methodology; (4) completing necessary pre-audit tasks; and (5) conducting the on-site audits. Part V describes the preparation of audit reports and suggests ways of reducing the burden of preparing multiple audit reports for ongoing local agency audit programs.

[1] See, for example, SEARCH Group, Inc., Audit Guide for Criminal History Records Systems (Sacramento, Calif.: SEARCH Group, Inc., December 1982).

[2] 28 C.F.R. § 20.21(e).

The basic purpose of a data quality audit is to determine the extent to which criminal justice transactions that are required to be reported to a central criminal record repository — for example, arrests, court dispositions, correctional reception and release — are fully and accurately reported in a timely manner and are accurately entered into the repository database. Some of these elements of data quality can be assessed at the repository, through audit methods described in the Guide, without reference to official records maintained at reporting agencies. For example, the accuracy of data entry procedures at the repository can be assessed by comparing sample record entries drawn at random from the criminal history database with incoming reporting documents or computer tapes stored at the repository. Timeliness of reporting can be assessed by comparing the dates on which reported transactions occurred with the dates on which the reported information was received at the repository, if reception dates are logged, or the dates on which the information was processed at the repository, if these dates are logged. As a final example, analysis of the repository database by manual or automated methods can identify instances in which reportable transactions apparently occurred and were not reported — such as an arrest entry for which no disposition was received within a designated period or an entry showing a court conviction and a sentence to imprisonment not followed by a reported correctional segment within an appropriate period.

However, it is not possible to determine by such methods whether all reportable transactions were in fact reported fully and accurately. Such determinations require reference to official source documents maintained by reporting agencies, such as arrest logs, prosecutor files, court dockets and other case files. The most reliable method of making such determinations is by conducting site visits to selected reporting agencies to examine their records and assess the adequacy of their reporting procedures. This method of auditing, described in detail in Part IV, is highly accurate for two reasons: first, the auditor can establish that particular reportable transactions actually occurred; second, by comparing the official agency records of such transactions with repository records, the auditor can determine whether the transactions were reported fully and accurately and in a timely manner. This type of auditing is expensive, however, particularly if enough records are reviewed to yield statistically significant results — that is, to establish data quality statistics of a known degree of accuracy and reliability, as opposed to estimates of unknown accuracy. For example, if the goal of a particular audit is to measure the accuracy and completeness of a designated segment of the repository database (such as all felony arrests and convictions occurring in the State during a designated period) and if statistically significant audit results are desired, the audit methodology may need to include selection of a rather large random sample of such cases followed by site visits to all of the reporting agencies involved in processing the cases. In this way, auditors can determine whether all required information was reported and whether it is accurately reflected in the repository database.

While statistically significant audit results may be necessary for some purposes, less accurate (and less expensive) audits may suffice for other purposes, such as for making a general assessment of data quality levels for planning purposes. Even if more accurate audit results are desired and reference to local agency records is deemed necessary to achieve such results, it may be possible to devise a workable audit methodology that does not require site visits to every single agency selected for inclusion in the audit. For example, it may be possible to obtain, by mail or other means, copies of appropriate portions of necessary agency source records, such as copies of arrest booking sheets for particular dates or copies of designated pages of court docket books. These copies can be used for purposes of determining whether the docketed transactions were accurately reported to the repository. It may even be possible, depending on the level of local agency cooperation that can be achieved, to persuade agency record clerks or other appropriate personnel to perform record validations of sample repository records and to mail in the results on data collection forms provided by the auditor. Finally, it may be possible to obtain lists of reportable transactions from local agencies that can be used to determine whether all of the listed transactions were reported to the repository. These and other such audit methods are described in the Guide.

In summary, the Guide describes a wide range of audit methods that can be utilized singly or in combination in a particular audit, depending upon the goals of the audit, the desired accuracy of audit results, the resources available to the auditors and other such considerations. Factors that affect the reliability of audit results and other factors that affect the design of an audit approach and methodology are discussed in more detail in the substantive parts of the Guide.


Part II

Completeness and Accuracy Requirements

One of the first steps in planning an audit is to compile a list of the legal requirements and other standards applicable to the central repository and contributing local agencies. These requirements may be based on Federal laws or regulations, on State laws or regulations, or on other standards or requirements.

Federal regulations require all criminal justice agencies that have received U.S. Department of Justice funding for information systems[3] to establish operational procedures to “[i]nsure that criminal history record information is complete and accurate.”[4] This provision of the regulations goes on to state that complete records should be maintained at a central State repository and that any such repository record which contains information that an individual has been arrested “must contain information on any dispositions occurring within the State within 90 days after the disposition has occurred.”[5] The provision defines “accurate” as meaning that “no record containing criminal history record information shall contain erroneous information”[6] and provides that, to accomplish this end, criminal justice agencies:

   [S]hall institute a process of data collection, entry, storage and systematic audit that will minimize the possibility of recording and storing inaccurate information and upon finding inaccurate information of a material nature, shall notify all criminal justice agencies known to have received such information.[7]

[3] This means funding provided after July 1, 1973, by the Law Enforcement Assistance Administration or its successor agencies pursuant to the Omnibus Crime Control and Safe Streets Act of 1968, 42 U.S.C. §§ 3711 et seq., as amended.

[4] 28 C.F.R. Part 20, § 20.21(a).

[5] 28 C.F.R. Part 20, § 20.21(a)(1).

[6] 28 C.F.R. Part 20, § 20.21(a)(2).

[7] Ibid.

Thus, rather than establishing specific minimum data quality criteria that would be deemed acceptable (other than the 90-day disposition recording requirement), the Federal regulations establish a goal of accuracy and completeness and require State repositories and contributing agencies to implement operational procedures, including reporting, data entry and systematic audit procedures, designed to achieve that goal — that is, to ensure maximum completeness and minimum errors in the repository database.

Pursuant to this Federal requirement, most of the States have established at least some data quality standards and procedures, while other States have established comprehensive standards. State laws and regulations set out specific standards and requirements in such areas as the following:

• The types of offenses for which fingerprints and arrest/charge information must be reported to the repository by law enforcement agencies;

• The types, as well as the content and form, of case disposition information relating to reportable offenses that must be reported by prosecutors, courts, correctional agencies and other criminal justice agencies;

• The timeframes within which such information must be reported;

• The content and format of the criminal history record transcript and other inquiry responses provided by the repository; and

• Specific data quality procedures, such as disposition monitoring procedures, error notification procedures, inquiry-before-dissemination procedures and procedures for review and correction by record subjects.

In developing a list of applicable completeness and accuracy requirements for audit purposes, the auditor should obtain and carefully review all such State laws and regulations, as well as relevant reporting forms, instruction manuals, code tables and criminal history record output formats. Not only will these materials enable the auditor to establish specific data quality standards against which to assess agency performance, they will also assist in the development of an audit methodology and the structure of data collection forms and other audit documents, as explained in Part IV of the Guide.

In addition, Federal grant guidelines issued pursuant to recent Federal laws have established specific data quality standards and reporting requirements that may need to be reflected in the audit methodology, depending upon the purpose and scope of particular audits. For example, a 1990 amendment to the Omnibus Crime Control and Safe Streets Act requires the States to allocate at least five percent of their annual formula grant funds to data quality improvements, specifically including the identification and flagging of felony arrests and convictions.[8] Another amendment to the Act requires the States to provide “notice” of alien convictions and, upon request, certified copies of conviction records, to the Immigration and Naturalization Service (INS) to facilitate the deportation of such persons.[9] The guidelines issued by the Bureau of Justice Assistance, U.S. Department of Justice to implement these provisions prescribe specific INS reporting requirements and establish minimum data quality levels and reporting requirements that must be met in order to obtain a waiver of the five percent set-aside requirement.[10]

These requirements and standards must be reflected in the methodology of any audit intended to assess adherence to them.

Finally, on February 13, 1991, the FBI and the Bureau of Justice Statistics issued voluntary reporting and data quality standards recommended for adoption by State and local criminal justice agencies as part of a nationwide effort to upgrade the completeness, accuracy and accessibility of criminal history records.[11] To the extent that these standards are adopted by particular States, they may need to be reflected in the methodology of audits conducted in these States if a goal of the audits is to assess the degree of compliance with these standards.

[8] Crime Control Act of 1990, Pub. L. No. 101-647, 104 Stat. 4850 (codified as 42 U.S.C. § 3759(a)).

[9] H.R. 3049, The Miscellaneous and Technical Immigration and Naturalization Service Amendments Act of 1991, amending § 503(a)(11) of the Omnibus Crime Control and Safe Streets Act of 1968, as added by § 507 of the Immigration and Naturalization Service Act of 1990 (December 18, 1991).

[10] U.S. Department of Justice, Bureau of Justice Assistance, Guidance for the Improvement of Criminal Justice Records (Washington, D.C.: Government Printing Office, December 1991).

[11] U.S. Department of Justice, Federal Bureau of Investigation and Bureau of Justice Statistics, “Recommended Voluntary Standards for Improving the Quality of Criminal History Record Information,” Federal Register (13 February 1991) vol. 56, no. 30.


Part III

In-House Accuracy and Completeness Checks at the Repository

This section of the Guide describes ways to assess some aspects of the completeness and accuracy of criminal history record databases maintained by State central repositories without undertaking site visits to local criminal justice agencies that report information to the repository. Although these audit methods may not yield results that are as accurate and reliable as those obtained from site visit audits, they can be useful for many planning and evaluation purposes. Included are methods for:

(A) In-house analysis of the criminal history database;

(B) Comparison of the repository database with externally obtained offender processing lists or statistical information;

(C) Comparison of the repository database with stored source documents; and

(D) Auditing by mail.

A. Analyzing the Criminal History Database

• Goal

The goal of this audit method is to determine as much as possible about the completeness, accuracy and timeliness of the repository database by manual or programmed analysis of the database itself. Audit procedures that may be used for this purpose include (1) reviewing error lists or other processing lists or reports routinely compiled by the repository; (2) conducting a programmed analysis of the criminal history database; and (3) conducting a manual analysis of randomly selected documents or database entries.

• Procedures

— Review Error Lists or Other Reports

Some States have laws or regulations requiring their repositories to institute “systematic audit” processes to minimize the possibility of recording inaccurate information and to provide for appropriate correction and notice when materially inaccurate information is discovered. Other legal provisions dealing with completeness require the repositories to establish procedures for regular and random system audits to check on conformance with arrest and disposition reporting requirements, including compliance with reporting time limits. Pursuant to these requirements, many of the repositories have instituted data entry review and edit processes to guard against the entry of erroneous information and “delinquent disposition monitoring” procedures to check on conformance with reporting requirements.

These and other such programs and procedures generate a variety of lists and reports that the auditor can use to make an assessment of some aspects of data quality. These include error lists, lists of arrest entries without prosecutor or court disposition entries, lists of court dispositions or correctional segments without corresponding arrest entries, and reports showing the elapsed time between the dates that reportable events occurred and the dates when the events were reported to the repository or entered into the criminal history system. If lists or reports such as these are produced by the repository on a regular basis, the auditor may be able to obtain lists for any time period considered appropriate. The auditor should interview repository officials to determine what types of lists or reports are available.

— Conduct a Programmed Analysis of the Criminal History Database

If such error lists or reports are not produced on a regular basis, the auditor may be able to generate them using programs developed expressly for audit purposes. For example, in most automated repositories, it should be possible to determine the number and percentage of arrest events or case cycles in the criminal history database for which final court dispositions have and have not been recorded within appropriate timeframes. This can be determined for the entire database or perhaps for recent years only, depending on the purpose and scope of the audit. It should also be possible to determine the number and percentage of prosecutor or court segments received that do not have corresponding arrest entries, and the number and percentage of correctional entries that do not have arrest, prosecutor or court entries. As another example, most automated systems should be programmable to produce reports showing timeliness of reporting, based on the dates on which reportable events occurred and the dates on which the information was received or processed by the repository, if such dates are logged.
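Where case cycles can be exported from the repository for analysis, the first of these checks reduces to simple date arithmetic and counting. The following is a minimal sketch, assuming a hypothetical export format in which each case cycle carries an arrest date and an optional final disposition date; the field names and the one-year threshold are illustrative only, not drawn from any particular repository system.

```python
from datetime import date, timedelta

# Hypothetical export: one record per case cycle, with the arrest date
# and the final court disposition date (None if none has been recorded).
case_cycles = [
    {"case_id": "A-001", "arrest_date": date(1989, 3, 14), "disposition_date": date(1989, 9, 2)},
    {"case_id": "A-002", "arrest_date": date(1990, 6, 1),  "disposition_date": None},
    {"case_id": "A-003", "arrest_date": date(1991, 1, 20), "disposition_date": None},
]

as_of = date(1991, 12, 31)          # audit cutoff date
threshold = timedelta(days=365)     # illustrative "appropriate timeframe"

# Count cycles old enough that a disposition should have been recorded.
due = [c for c in case_cycles if as_of - c["arrest_date"] >= threshold]
missing = [c for c in due if c["disposition_date"] is None]

print(f"Cycles at least one year old:          {len(due)}")
print(f"Cycles without a recorded disposition: {len(missing)}")
if due:
    print(f"Delinquent disposition rate:           {len(missing) / len(due):.1%}")
```

The same counting pattern extends to the other checks described above, such as court or correctional segments lacking corresponding arrest entries.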

Programs used for the generation and analysis of such lists will need to take into consideration the fact that the absence of a particular entry in a particular case cycle does not necessarily mean that a reportable event occurred and was not reported. For example, the absence of a final court disposition may be due to the fact that the offender was released without being charged and the police failed to report this action, or the fact that the prosecutor declined to prosecute the case and failed to report this decision. Similarly, the presence of a court or correctional entry with no corresponding arrest entry may not be due to failure of an arresting agency to report an arrest, but rather due to the fact that the case originated by indictment followed by the issuance of a summons in lieu of arrest.

It may be possible to accommodate some of these considerations in the design of the program to generate the lists or reports, depending on the way in which information is stored in the system and the ways it can be searched and analyzed. For example, the system may be able to generate lists of arrests or case cycles that are at least one year old and do not have final dispositions; that is, do not include trial court dispositions or entries indicating that the cases were terminated by police release without charging or prosecutor declination. For such cases, at least some disposition information can be assumed to be missing. Some of these missing reportable events probably can be identified by analyzing the recorded information. For example, if there is a prosecutor entry indicating the filing of charges in a particular case, there should be a court disposition (and an arrest entry, unless the record information shows that the case originated by indictment or by means other than arrest). There also may be other identifiable missing entries, such as bail or pretrial detention information.

For cases that do have final dispositions recorded, the system may be able to identify other missing reportable events. For example, if a particular case was terminated by a trial court disposition, there should be a prosecutor segment, an arrest segment (unless the case can be identified as one that did not originate by arrest) and perhaps other segments, such as bail or pretrial detention entries. If the trial court disposition was a guilty verdict, there should be sentence information and, depending upon the sentence, a correctional component.
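Checks of this kind amount to a set of if-then rules applied to the segments recorded for each case cycle. The sketch below illustrates three of the rules just described; the segment labels, flags and record layout are hypothetical simplifications, not the structure of any actual repository system.

```python
def missing_segments(cycle):
    """Return the segments that the recorded entries imply should be
    present but are not. `cycle` holds a set of segment labels plus a
    few flags describing the recorded information."""
    gaps = []
    segs = cycle["segments"]
    # A prosecutor filing implies a court disposition should follow.
    if "prosecutor_filing" in segs and "court_disposition" not in segs:
        gaps.append("court_disposition")
    # A court disposition implies an arrest entry, unless the case
    # originated by indictment or other means not involving arrest.
    if ("court_disposition" in segs and "arrest" not in segs
            and not cycle.get("originated_without_arrest", False)):
        gaps.append("arrest")
    # A conviction with a custodial sentence implies a correctional segment.
    if (cycle.get("convicted", False)
            and cycle.get("sentenced_to_confinement", False)
            and "correctional" not in segs):
        gaps.append("correctional")
    return gaps

example = {
    "segments": {"prosecutor_filing", "court_disposition"},
    "convicted": True,
    "sentenced_to_confinement": True,
}
print(missing_segments(example))   # ['arrest', 'correctional']
```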

— Conduct a Manual Analysis of Database Entries or Documents

If reviews of the types discussed in the previous section cannot be performed by automated database analysis, the auditor may be able to perform them by manual analysis of a random sample of arrests or case events selected from the database of cases being audited. For example, if the purpose of the audit is to examine the completeness of records of felony cases initiated during the past five years and the system is able to identify felony arrests occurring during that period that do not have final dispositions but is unable to perform any further analyses of the types described above, the auditor may randomly select a manageable sample of such identified cases to be printed out for manual analysis. These transcripts can be examined to determine whether the recorded information indicates that reportable events are missing. In some cases, it may be clear that information is missing. In other cases, the auditor may need to contact appropriate criminal justice agencies to confirm that the cases are still actively pending or that reportable events have occurred that are not reflected on the sample transcripts. Depending upon the size and randomness of the sample, the results of such manual analyses may be used to estimate, with acceptable levels of confidence, the state of completeness of the broader segment of the criminal history database being audited — in this example, all felony arrests without dispositions identified by system analysis. (Random sampling techniques are discussed in Part IV.)
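Drawing such a sample requires only a few lines of code once the flagged cases can be listed. The following minimal sketch assumes a hypothetical list of case identifiers; the sample size of 200 and the fixed seed (which makes the selection reproducible when the audit methodology must be documented) are illustrative choices, not recommendations of the Guide.

```python
import random

# Hypothetical identifiers for felony arrests flagged by the system as
# lacking final dispositions (in practice, thousands of entries).
flagged_cases = [f"FEL-{n:05d}" for n in range(1, 2501)]

random.seed(19920101)                          # documented seed for reproducibility
sample = random.sample(flagged_cases, k=200)   # illustrative sample size

print(sample[:5])
```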

In systems that cannot perform any of the types of automated analyses described in the previous section, the auditor will need to rely entirely upon the manual analysis of sample criminal history transcripts. As noted, however, analysis of a randomly selected sample can yield results that may be attributed with some degree of confidence to the larger database.

As mentioned earlier, the fact that particular reportable transactions are identified, through the above-described methods, as missing from criminal history transcripts does not necessarily indicate that the responsible criminal justice agencies failed to report the transactions. Rather, the repository may have failed to enter reported information, or the information may have been rejected by system edits and not corrected by the reporting agency due to the failure of the repository to return error lists or the failure of the agency to process those lists. Another possibility is that the information may have been reported properly and entered into the criminal history system, but may not have been linked to the proper case cycle due to the failure of the reporting agency to include appropriate tracking numbers or other linking information or the failure of repository personnel to enter the linking information accurately.

To understand the reasons for missing information, the auditor should examine such factors as data entry backlogs at the repository and should review procedures for generating and processing error lists. If unlinked disposition information is maintained in a separate segment of the criminal history database, the auditor should determine whether particular entries can be located through the use of agency case numbers or by other means. It may be possible to trace a sample of missing disposition information and to determine that some of the information was reported and entered into the system but failed to link. In this way, it may be possible to estimate, with acceptable accuracy, the percentage of missing entries of various types that were reported but failed to link and to establish why the linkage failures occurred.


Finally, timeliness of reporting can be assessed manually by selecting a sample of reporting documents that arrive at the repository on a particular day during the audit and noting the elapsed time from the dates on which the reported events occurred. The timeliness of data entry or fingerprint card processing by the repository can be assessed by the same method. If, for example, a review indicates that most fingerprint cards are arriving at the repository within three to five days after the date of arrest but the fingerprint cards being processed on the day of the review are two to three weeks old, this indicates a significant processing backlog.
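The underlying computation is simple elapsed-day arithmetic, as the following sketch illustrates; the record layout and dates are invented for illustration. Separating the reporting lag (arrest date to receipt) from the processing backlog (receipt to the day of review) distinguishes slow reporting by agencies from slow processing at the repository.

```python
from datetime import date
from statistics import median

review_date = date(1991, 10, 15)

# Hypothetical fingerprint cards pulled from the processing queue on
# the day of the review.
cards = [
    {"arrest_date": date(1991, 10, 11), "received_date": date(1991, 10, 14)},
    {"arrest_date": date(1991, 9, 24),  "received_date": date(1991, 9, 27)},
    {"arrest_date": date(1991, 9, 20),  "received_date": date(1991, 9, 25)},
]

# Reporting lag: date of arrest to arrival at the repository.
reporting_lags = [(c["received_date"] - c["arrest_date"]).days for c in cards]
# Processing backlog: arrival at the repository to the day of processing.
backlogs = [(review_date - c["received_date"]).days for c in cards]

print(f"Median reporting lag:      {median(reporting_lags)} days")
print(f"Median processing backlog: {median(backlogs)} days")
```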

B. Comparing the Database with Externally Obtained Information

• Goal

The methods of auditing the completeness of the criminal history record database listed in Section A rely upon analyzing the database to determine whether certain reportable events are missing, or appear to be missing, and making some determinations or assumptions concerning whether the events actually occurred and were not reported or recorded. Another method of auditing completeness — and accuracy, to some extent — is first, to establish by reference to externally obtained information that particular reportable events did occur and second, to examine the criminal history database to determine whether they are completely and accurately reflected on the appropriate records. This can be done by obtaining lists (or totals) of processed cases from reporting criminal justice agencies or from other sources.

• Procedures

— Assess Arrest Reporting Levels

Arrest reporting levels can be assessed by comparing the number of arrests reported to the repository during a given period (the overall number and the number per agency, if available) with Uniform Crime Reports statistics or other arrest statistics available from the Federal Bureau of Investigation, the State’s statistical analysis center or other sources. Although these methods may not yield precise comparisons, it should be possible to determine the approximate level of arrest reporting in the State and possibly to identify particular agencies that appear not to be complying with reporting requirements.

If available statistics of this kind are inadequate or insufficiently comparable to yield reliable results, arrest reporting levels can be assessed on a sampling basis by obtaining statistics, lists or other documents directly from selected arresting agencies for comparison with the criminal history database. For example, some arresting agencies may routinely compile statistics or even lists of fingerprintable arrests that can be provided to the repository. Others that do not routinely compile such statistics or lists may be able to generate them for a particular period, upon request, for purposes of the audit. Lists are especially useful since they may include subject identification information and case numbers (such as arrest numbers or other tracking numbers) that will enable the auditor to determine whether particular arrests are included in the repository database.

If lists of these types are not available, the auditor may be able to compile lists from copies of documents that can be obtained from arresting agencies. Some repositories require arresting agencies to submit copies of booking sheets, arrest dockets or similar documents to the repository on a regular or periodic basis. In other States, copies of such documents covering a designated period could be obtained for audit purposes, at least from selected sample agencies. Depending upon the information maintained in such source documents, the auditor may be able to assess both the incidence and the accuracy of arrest reporting by determining whether fingerprint cards were received for all of the listed arrests and whether the identification and charge information on the cards matched the information on the booking documents. If the repository logs the date of receipt of fingerprint cards, as it should, compliance with reporting time requirements can be measured by comparing the dates of receipt with the dates of arrest. Arresting agencies also may be able to provide lists of arrests in which the arrested persons were released without charging; these can then be used to determine whether or not that information is included in the repository database.
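Where booking documents and repository entries share arrest or tracking numbers, the comparison can be scripted as a simple set operation, as in the sketch below; the number formats and field contents are invented for illustration.

```python
# Arrest numbers transcribed from an agency's booking sheets for the
# audit period (hypothetical format).
booked = {"BK-9101", "BK-9102", "BK-9103", "BK-9104"}

# Arrest numbers found in the repository database for the same agency
# and period.
in_repository = {"BK-9101", "BK-9103"}

unreported = sorted(booked - in_repository)
rate = len(booked & in_repository) / len(booked)

print(f"Arrest reporting rate: {rate:.0%}")      # 50%
print(f"Apparently unreported: {unreported}")    # ['BK-9102', 'BK-9104']
```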

— Assess Reporting by Other Agencies

Similar methods can be used to assess the incidence and, to some extent, the accuracy of reporting by other criminal justice agencies. For example, prosecutors may be able to provide lists of cases filed or not filed during particular periods, including cases that originated by indictment and summons. Courts may be able to provide lists of cases filed and adjudicated and of convicted offenders sentenced to probation or incarceration. Parole and probation agencies and correctional institutions may be able to provide lists of persons admitted to or released from incarceration or supervision. Again, some of these agencies may already generate such lists on a regular basis. Others that do not may be able to generate them upon request for audit purposes. Comparison of such lists with the repository database can provide a highly accurate test of database completeness and of agency reporting rates, particularly if the lists include tracking numbers or other data that enable the auditor to accurately locate the listed transactions in the criminal history system or to determine with certainty that they are not included.

If auditors can obtain these types of lists in a form that permits automated comparison with the repository’s criminal history database, it may be possible to audit all listed events for the period audited. If the lists cannot be obtained in such form or if the number of cases is too great, it may be necessary to undertake manual comparisons on a sample basis.

If agency processing lists of the types described above cannot be obtained, it may be possible to obtain statistical information that may be useful for audit purposes. For example, if a particular agency can provide the number of reportable transactions occurring during a selected period, such as the number of convictions on felony charges in particular courts, the repository database can be analyzed to determine whether that number was reported. Although this method does not yield results as accurate and reliable as those obtained through comparison of lists of identifiable transactions with the database, it can yield useful information about the completeness of the database.

C. Comparing the Database with Stored Source Documents

• Goal

The goal of this audit method is to assess the accuracy of data entry procedures at the repository by comparing the information on a selected sample of criminal history transcripts with fingerprint cards, case filing notices, disposition reporting forms or other such incoming source documents on file at the repository.

• Procedures

Some repositories perform comparisons of this type on a regular basis as part of ongoing data quality maintenance procedures. If records of such comparisons are available, the auditor should obtain them for review. These records may be sufficient to enable the auditor to determine whether data entry procedures are adequate to prevent the entry and storage of inaccurate information.

If independent assessment is considered appropriate, the auditor should select a random sample of stored reporting documents for comparison with repository database entries. The scope of the database segments that can be audited in this way will be limited by the types of source documents that are on file at the repository, the period of time the stored documents cover, and the difficulty of locating and searching them. In any case, because most repository systems are continuously enhanced and because data entry procedures usually are changed from time to time to incorporate additional edits and other data quality safeguards, it will probably be more useful to select the sample cases from relatively recent records in order to assess the efficiency of data entry procedures currently in effect. However, the auditor may wish to include some older records in order to compare audit results for those records with audit results for more recent records to assess the impact of newly implemented data entry procedures.

The types of in-house comparisons that can be undertaken will be determined by the manner in which information is reported to the repository and the types of reporting documents maintained by the repository. If fingerprint cards for all reported offenses are maintained by the repository (or are obtainable from the identification bureau, if it is a separate agency), the auditor can confirm that positive identification was accurately established in the sample cases and also can determine whether arrest information was accurately entered from the fingerprint cards. If custody fingerprint cards submitted by correctional institutions are maintained, the auditor can assess the accuracy of the correctional components of the sample cases. If prosecutor and court information is reported on paper forms and these forms are kept on file, the auditor can assess the accuracy of data entry procedures for this type of information. If certain types of information are reported on magnetic tape, the extent to which these types of in-house audits can be performed will depend upon such factors as whether the tapes are retained and stored and whether they can be transcribed for audit purposes.

In selecting the sample database entries and reporting documents to be audited, the auditor can select separate samples for each type of information (for example, offender identification data, arrest charge data, prosecutor data, court disposition data and correctional data). Or the auditor can select sample case cycles, and all of the recorded information for these cases can be audited against all of the stored source documents pertaining to them. The advantage of selecting separate data segment samples is that equal numbers of each type of reported information can be selected, whereas selection on a case cycle basis may not yield sufficient court and correctional segments for audit purposes, depending on the method of selection. An advantage of auditing at least some entire case cycles is that this method of audit can reveal ways in which data entry errors at one stage of processing can affect the entry of other types of information. For example, inaccurate entry of a case tracking number from a fingerprint card can cause a linkage failure of subsequently reported disposition information. The auditor may be able to confirm that some disposition information for sample cases was reported and accurately entered, but failed to link because of earlier data entry errors.
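The difference between the two selection approaches can be made concrete in a few lines of code. In the sketch below, hypothetical database entries are tagged with a segment type; drawing a fixed-size sample per segment type guarantees equal coverage of each type, while sampling whole case cycles yields whatever mix of segments those cycles happen to contain.

```python
import random
from collections import defaultdict

random.seed(1992)

# Hypothetical database entries: (case_id, segment_type).
entries = [("C1", "arrest"), ("C1", "prosecutor"), ("C1", "court"),
           ("C2", "arrest"), ("C2", "prosecutor"),
           ("C3", "arrest"), ("C3", "court"), ("C3", "correctional"),
           ("C4", "arrest")]

# Approach 1: separate samples per segment type, giving equal coverage
# of each type of reported information.
by_type = defaultdict(list)
for case_id, segment in entries:
    by_type[segment].append(case_id)
segment_samples = {segment: random.sample(cases, k=min(2, len(cases)))
                   for segment, cases in by_type.items()}

# Approach 2: sample whole case cycles; the mix of court and
# correctional segments depends on which cycles happen to be drawn.
all_cases = sorted({case_id for case_id, _ in entries})
cycle_sample = random.sample(all_cases, k=2)

print(segment_samples)
print(cycle_sample)
```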

It should be kept in mind that this method of auditing can measure only the accuracy of data entry procedures at the repository. It cannot determine whether all reportable transactions that occurred were reported or whether reported information accurately reflects what actually happened in the cases.


D. Auditing by Mail

• Goal

Another method of auditing the repository database without undertaking site visits to local agencies is to mail audit questionnaires or data collection forms to the agencies and request that they make necessary comparisons or provide requested information by return mail. This method of auditing can be used to check both accuracy and completeness.

• Procedures

To audit the accuracy of the repository database, sample criminal history entries can be printed out and mailed to the submitting agencies with a request that the information be compared with agency source documents and either confirmed as accurate or corrected, if necessary. Completeness can be assessed by selecting a sample of cases that appear to lack disposition information and sending lists of these cases, with available case numbers and other identifying information, to the appropriate agencies requesting that they determine from their records whether dispositions have occurred. If dispositions have occurred, the agencies can be requested to provide copies of source documents or to submit completed reporting forms to enable the repository to update the records.
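Preparing such a mailing reduces to selecting the no-disposition cases and grouping them by responsible agency so that each office receives a single inquiry. The record layout and agency names in the sketch below are hypothetical.

```python
from collections import defaultdict

# Hypothetical repository entries lacking disposition information.
open_cases = [
    {"case_no": "91-CF-104", "agency": "Adams County Circuit Court"},
    {"case_no": "91-CF-221", "agency": "Adams County Circuit Court"},
    {"case_no": "90-CM-017", "agency": "Bakersville Police Dept."},
]

# Group case lists by agency so each office receives one inquiry letter.
mailings = defaultdict(list)
for case in open_cases:
    mailings[case["agency"]].append(case["case_no"])

for agency, case_list in sorted(mailings.items()):
    print(f"{agency}: please check dispositions for {', '.join(case_list)}")
```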

This method of auditing necessarily depends upon the cooperation of the audited agencies. Cooperation can be increased by recruiting high-ranking officials, such as the Attorney General or the State Court Administrator, to contact local agencies and encourage them to respond fully. Even with such assistance, however, it is likely that not all agencies will respond fully, despite follow-up mail and telephone requests, and some may not cooperate at all. In addition, the auditor cannot rely entirely upon the accuracy of the responses. Despite these shortcomings, however, this method of auditing can yield useful results and may be the only method of “outside” auditing possible if resources are not available for site visits to reporting agencies. In combination with some of the other in-house audit methods discussed above, auditing by mail can provide an assessment of data quality that is reliable enough for many purposes. Procedures for selecting sample cases for mail auditing and appraising the accuracy of audit results are generally the same as those discussed in Part IV of the Guide, which deals with on-site visit audits.


Part IV

Site Visits to Reporting Agencies

This section of the Guide describes procedures for auditing sample criminal history records by conducting site visits to the agencies that report case processing information to the repository. This method of auditing is the most accurate and reliable because the auditor is able to determine from original records of entry on file at the audited agencies what reportable events actually transpired in the sample cases and to determine the extent to which the required information about these events was completely and accurately reported in a timely manner and completely and accurately entered into the criminal history record system. In addition, by observing agency procedures and interviewing agency personnel, the auditor can determine the reasons for data quality deficiencies and can formulate recommendations to remedy them.

Site visit auditing is expensive, however, especially if enough agencies are audited and sample sizes are large enough to yield audit results that can be attributed to the entire criminal history record database with a high degree of confidence. For this reason, it is likely that most audits of State repository databases will combine limited site visit audits with some of the other audit methods described in Part III of the Guide. For example, the auditor may use some of the methods of in-house database analysis and review described in Part III to obtain an overall view of the completeness and accuracy of the criminal history database and may only perform site visit audits of selected agencies to validate records selected from segments of the criminal history database rather than the entire historical database. Furthermore, for purposes of particular audits, it may be deemed acceptable to select fewer records for audit than would be required to obtain highly accurate results. Finally, the auditor may utilize mailed audit forms to validate samples of information reported by at least some of the agencies not chosen for an on-site audit. By combining audit methods in this way, the auditor should be able to devise an audit methodology that can be supported by available resources and that will yield results that are adequate for most audit purposes.

As pointed out earlier, the Guide envisions that site visit auditing may be undertaken for purposes of two types of audits:

(1) Annual or periodic audits of the repository database; and

(2) Audits conducted as part of an ongoing program of local agency audits to verify compliance with reporting and other legal requirements.

In the first type of audit, the auditor will select a random sample of the types of information being audited from among all of the entries of that type in the database — such as all felony arrests occurring in the State during the past five years — and will validate these sample records by comparing them with original entry records maintained at the agencies that submitted the information. Since the records are selected at random from among all entries of a particular type in the database, the sample is likely to include records submitted by most of the agencies in the State that report information of the type being audited, with larger agencies heavily represented and smaller agencies represented by only a few records. The large number of reporting agencies typically encompassed within this type of auditing is the primary reason it is so expensive.

Site visit auditing may also be undertaken as part of an ongoing program of local agency audits to determine whether those agencies are complying with reporting requirements. In such cases, the sample record entries to be validated for completeness and accuracy are selected from among entries submitted by the agencies selected for audit. For this reason, the audit results can be reliably attributed only to the audited agencies themselves. While the results of particular audits of this type cannot be assumed to accurately reflect the quality of the broader repository database, the cumulative results of numerous such audits can provide a useful assessment of overall repository data quality levels.

Although the criteria for selecting agencies and records to be audited may vary depending on the purpose of the audits and the nature of the audit program undertaken, the auditing procedures described below can be used for any type of auditing that includes site visits to reporting agencies, including audits to assess compliance with requirements other than data quality. The procedures comprise a generic audit approach that should be workable in most States. The approach set out here, however, may need to be tailored in particular jurisdictions, depending on the purpose and scope of the audit(s) to be undertaken, and many of the procedures described below will need to be refined to conform to local conditions, practices and legal requirements.

The following sections describe procedures for:

(A) Selecting agencies to be audited;


(B) Selecting sample cases to be validated;

(C) Formulating an audit methodology;

(D) Completing pre-audit tasks; and

(E) Conducting the site visit audits.

A. Selecting Agencies to be Audited

• Goal

This section describes procedures for selecting agencies for on-site audits. Because such audits are extremely time-consuming, and because State repositories have very large numbers of State and local criminal justice agencies reporting information to them, some method must be developed for selecting agencies to be audited that yields a manageable and affordable number of agencies, yet provides audit results that are as valid as possible. The aim should be to audit as many agencies as resources permit and to select agencies that provide as representative a review as possible of the aspects of data quality targeted by the audit.

• Procedures

Agency selection procedures depend upon the type of audit undertaken and the audit approach used. If it is a data quality audit of a segment of the repository database and is designed to yield highly accurate and reliable results, the sample cases selected for validation will need to be chosen at random and may well include cases from hundreds of agencies scattered throughout the State. Site visits to all of these agencies may be impossible. In many States, however, a relatively small number of agencies in large metropolitan areas may account for a high percentage of the sample cases. If this is the case, it may be feasible to schedule site visits for all of these large agencies and to rely on other methods of audit for the other agencies. For example, mailed audit forms may be used for the other agencies and, if they do not respond after follow-up contacts, site visit audits may be scheduled for some of them. The reliability of the responses received by mail can be increased by requiring the responding agencies to provide copies of the source documents used for validating sample cases. Although it is likely that responses will not be received for all sample cases, it should be possible to obtain responses in enough cases to ensure the accuracy and validity of the audit results.

If the audit program to be undertaken is an ongoing series of local agency audits to monitor compliance with reporting and other legal requirements, the selection criteria might differ from those described above. Such a local agency audit program might be designed to respond primarily (at least for the first few years) to problems or deficiencies identified through systematic sampling and other ongoing data quality monitoring techniques employed by the repository. As noted earlier, most repositories employ a variety of systematic auditing procedures and generate various lists and reports reflecting the results of these procedures. On the basis of these reports and other available information, the repository may be able to identify particular data quality problems that are common to numerous agencies, such as poor court disposition reporting, or particular agencies that appear to have serious reporting problems. The agency selection criteria for the audit program might properly be weighted toward inclusion of such agencies. Pursuant to such an approach, a particular State might decide to audit only arresting agencies or courts for the first few years of the local agency auditing program.

As another example, the selection criteria for a particular year might be weighted to include agencies that have recently implemented automated reporting procedures or other procedural enhancements that affect data quality. These and similar factors might properly influence the selection of agencies to be audited in a given year or years, so long as the ongoing audit program is designed to include, over a period of years, agencies of all sizes, types and geographic locations: large and small jurisdictions, urban and rural agencies, high- and low-volume agencies, automated and manual agencies, arresting agencies, detention centers, prosecutors’ offices, trial and appellate courts and correctional agencies.

Another factor affecting the selection of agencies to be included in a local agency audit program is that it is generally more economical, and in some cases more useful, to conduct audits on a jurisdiction basis rather than on a single-agency basis. Instead of selecting agencies from a list of eligible agencies of a particular type without regard to where they are located (which might result in inclusion of only one agency in most audited jurisdictions), the auditor would select a number of jurisdictions — cities, counties or judicial districts, for example — and audit all of the criminal justice agencies in those jurisdictions: police departments, prosecutors, courts and correctional agencies. This approach obviously saves travel costs and auditors’ time. It also yields results that reflect the quality of all types of information. Perhaps more important, however, is that this approach enables the auditor to understand how record creation procedures, reporting procedures and even case processing procedures in particular agencies can affect reporting procedures and data quality levels in other agencies in the jurisdiction. This helps the auditor to better understand the reasons for particular deficiencies or problems and to formulate more responsive and realistic recommendations for remedying them. This audit approach is particularly effective if the same sample cases are audited in the various agencies in a particular jurisdiction; that is, if the auditor selects sample cases from the jurisdiction and audits all of the information concerning those cases that was reported or should have been reported by all of the agencies involved in processing the cases. For example, if a particular sample case began with an arrest and progressed to conviction and incarceration of the offender, the auditor would follow the audit trail of that case through the entire process and ultimately to the repository and the sample record transcript. If the court disposition was not shown on the transcript, the auditor might find that the court failed to report the disposition; that the court accurately reported the required disposition and sentence data but reported an inaccurate tracking number passed along by the police; or that the court reported all required information fully and accurately, but the information failed to link properly because the police failed to fingerprint the offender since he was already in custody on an earlier case. There are many nuances of interagency relationships and many case processing peculiarities of this type that can affect reporting performance and that might be missed by an audit approach focusing on only one criminal justice agency in a particular jurisdiction.

B. Selecting Sample Records

• Goal

This section discusses a number of ways of selecting sample records to audit and a number of factors that affect the selection methodology and the sample size. The goal is to select a sample that is designed to yield audit results of the desired accuracy and reliability while simplifying as much as possible the tasks of locating the sample records and the agency files that must be accessed to validate them.

• Procedures

— Devise a Selection Methodology

Factors to consider in devising a record selection methodology include, among others, the types of records to be selected, and where and how the records will be selected. The types of records to be selected will be determined primarily by the purpose and scope of the audit. For example, if the audit is part of a continuing program of regularly scheduled local agency audits and the agencies to be audited in a particular year are police departments, the auditor would select reportable events (arrest and charge information) submitted by those agencies during that year. These case segments might be selected directly from the repository’s criminal history database or from its fingerprint file.

On the other hand, if the audit is to be conducted on a complete-case basis in selected jurisdictions, the auditor would (1) select sample cases processed in the particular jurisdictions; and (2) obtain transcripts of all of the information concerning these cases submitted to the repository by all of the agencies involved in processing them — police agencies, prosecutors, courts and correctional agencies. For example, the sample cases could be selected from among all arrest cases reported to the repository by the arresting agencies in the selected jurisdictions during the period covered by the audit. However, this selection process has the disadvantage of including many sample cases that do not contain prosecutor, court or correctional segments, since some arrests do not result in charges being filed, while some cases that do result in charges being filed are terminated before progressing all the way through the system. In order to ensure that the sample cases include enough court and correctional segments to provide a reliable audit of these types of information, the auditor can select the sample cases from among cases in the repository database that include these data segments. This method of selection might be particularly appropriate if a major concern of the audit is to assess the accuracy of court or correctional reporting.

As suggested above, sample cases can be selected directly from the repository database. They can also be selected from the files of audited agencies. For example, if a major goal of a particular audit is to measure the completeness and accuracy of felony court disposition reporting, the sample cases might be selected from among cases filed in the felony trial courts of the particular jurisdictions during the period covered by the audit. This ensures the inclusion of trial court segments in all of the sample cases (and a high proportion of correctional segments as well) and provides a more reliable assessment of reporting, since cases may be included in the sample that were not reported to the repository and might not have been included in the audit if sample cases had been selected only from the repository database. A disadvantage of this selection approach is that additional site visits to audited agencies may be required to select the sample cases. Another disadvantage is that, depending upon the filing and numbering systems of the audited agencies and the ways in which cases can be retrieved from the repository database, it may be difficult to locate all of the criminal history transcripts and fingerprint cards for the selected cases. The auditor should discuss this issue with repository officials (if he is not already familiar with these factors) and should devise a case selection approach that will (1) yield sample cases that include the types of case information to be audited; (2) simplify the location of sample criminal history transcripts and other documents (for example, fingerprint cards and reporting forms) at the repository; and (3) simplify the location of case files at all of the audited agencies.

Even if the primary audit methodology calls for selecting sample cases from the repository database, the auditor might consider selecting a few cases from the files of the audited agencies as a “reverse audit” of compliance with reporting requirements. For example, if the audited agency is a police department, a few arrests for fingerprintable offenses might be selected from the department’s booking log during the audit, and inquiries might later be made at the repository to determine whether fingerprint cards were submitted as required for all of those arrests.

— Generate Sample Case Lists

Once the sample case selection methodology has been determined, sample case lists can be produced in a number of ways. Statistical sampling software is available that can generate a list of randomly selected case numbers from the population of cases to be audited, based upon starting and ending case numbers for the segments of the database to be included. The auditor also can use random number tables, which can be generated by computers or can be obtained from statistical sampling textbooks.12 Such textbooks also describe other sample case selection methods, including the selection of every nth case from random starting points in the segments of case numbers to be audited. Whatever method of selection is employed, the auditor should generate more sample case numbers than the number of records needed for inclusion in the sample (sample size is discussed in the next section). This will facilitate the selection of extra sample cases to replace cases that may need to be excluded from the sample because they are deemed inappropriate. These may include noncriminal motor vehicle cases, nonsupport cases, game law violations or other types of cases that may be included within the sampled population but may not be considered appropriate for audit. Extra sample cases may also be necessary because of case numbering errors or other such anomalies. A good rule is to select about one and one-half times the number of case numbers needed.

12 See, for example, Arthur J. Wilburn, Practical Statistical Sampling for Auditors (New York: Marcel Dekker, Inc., 1984); Herbert Arkin, Handbook of Sampling for Auditing and Accounting, 3d ed. (New York: McGraw-Hill Book Co., 1984); and Henry P. Hill and others, Sampling in Auditing (Huntington, N.Y.: Robert E. Krieger Publishing Co., 1979).
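To make the oversampling rule concrete, the following minimal sketch (in Python, offered purely as an illustration and not as part of the Guide) draws case numbers at random, without replacement, from a consecutively numbered population and oversamples by one-half, as suggested above. The function name and the case-number range are hypothetical.

    import random

    def draw_sample_case_numbers(first_case, last_case, records_needed, seed=None):
        """Draw random case numbers from a consecutively numbered population.

        About one and one-half times the number of case numbers actually
        needed are drawn, so that inappropriate cases (for example,
        noncriminal motor vehicle cases) can be replaced."""
        rng = random.Random(seed)
        to_draw = int(records_needed * 1.5)              # oversample by one-half
        population = range(first_case, last_case + 1)
        return sorted(rng.sample(population, to_draw))   # sample without replacement

    # Hypothetical example: cases numbered 1 through 18,500, with 202
    # records needed for the audit.
    sample = draw_sample_case_numbers(1, 18500, 202, seed=42)

Drawing without replacement ensures that no case is validated twice, and sorting the resulting list simplifies pulling case files in numeric order.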

— Select a Sample Size

As pointed out earlier, for some types of auditing it may not be necessary to employ sampling techniques designed to yield statistically significant or reliable results in a technical sense. Factors such as time, resources, availability of personnel and the intended use of audit results can properly influence the selection method and sample size and the resultant reliability of audit results. For example, if the primary purpose of a particular review of the repository’s criminal history database is to determine, for planning purposes only, the approximate level of reporting for a particular type of information, it might not be considered necessary to draw a purely random sample of a size sufficient to yield highly reliable results. A one-month chronological listing of inquiry responses might be used, and the audit results for these cases might be considered sufficiently reliable even though they provide only an inexact estimate.

On the other hand, if the purpose of the audit sampling is to derive completeness rates and accuracy rates to be included in a report to a State legislative budget hearing or in a report prepared in response to a court order, the sampling techniques utilized might need to be more precise. Similarly, if the audit is undertaken pursuant to a Federal or State law, that law (or regulations issued under it) might expressly require that the audit be performed in accordance with established statistical sampling procedures. In such cases, the auditor will need to select a sample size and employ sample case selection techniques that will yield audit results that can be attributed with a high degree of confidence to the larger database from which the sample cases are drawn. If the auditor does not have the requisite expertise in these areas, the auditor should consult someone in the office who has statistical sampling experience and knowledge, or someone in another State office, such as the statistical analysis center, the office of the legislative auditor or the statistics department of a State university.

Generally, however, if highly accurate and reliable audit results are desired, the required sample size for particular degrees of accuracy and reliability can be determined by using sampling tables obtained from statistical sampling textbooks, such as those cited earlier, or computer software programs designed for statistical sampling. For either method, the auditor will need to be familiar with the following concepts: (1) population size and sampling frame; (2) population variance (or error rate); (3) the desired “confidence level” and “precision factor” of the audit results; (4) one-tailed versus two-tailed tests; and (5) statistical “power.”

Population size is the total number of cases in the criminal history file or file segments from which the sample cases will be drawn. If, for example, the sample cases are to be drawn from cases filed in the felony trial courts of a given jurisdiction during the previous five years, and those courts number cases consecutively by calendar year, the population size could be determined from the beginning and ending case numbers for the years to be sampled. If cases are to be drawn from the criminal history database of the repository or segments of that database, population size probably could be determined or accurately estimated by computer analysis of the database segments to be included.

A sampling frame is the actual list from which the sample is selected. The sampling frame may be a pre-existing list (for example, a police booking sheet) or it may consist of a computer-generated list of numbers which correspond to an agency’s filing system. It is critical that the auditor understand that a properly drawn sample will provide information that is statistically relevant to the population of elements comprising the sampling frame — and nothing more. It is common for persons to generalize their findings to the population — indeed, this is the usual purpose of drawing a sample — but the validity of the generalization depends upon the extent to which the sampling frame reflects the population.

Population variance indicates how homogeneous the underlying population is, and this affects the sample size. For example, if all persons in a population had the same blood type, then a sample of one person would suffice. On the other hand, if everyone had a different blood type, then everyone would have to be included in the sample. Thus, the more homogeneous the population, the fewer cases required. The average time to receipt of final disposition could be used as an example for repositories. If Repository A’s average time ranged from zero to six months, while Repository B’s average time ranged from zero to one year, then Repository A’s data would be more homogeneous. All else being equal, an audit of average time at Repository A could be conducted with a smaller sample size than would be possible at Repository B.

For most criminal history record audit purposes, population variance will be generally equivalent to the error rate, or noncompliance rate, in the sampled population with respect to a particular attribute of data quality. For example, if the purpose of a particular sampling project is to establish the level of reporting of felony court dispositions, the error rate would be the percentage of such dispositions that are not reported. As another example, if the purpose of sampling is to determine the level of accuracy of reported disposition information, the error rate would be the percentage of reported dispositions that include inaccurate information.

As pointed out above, population variance (error rate) is an important factor in determining required sample sizes. The challenge for auditors is that error rates generally are not known in advance and must be estimated. Such estimates can be based upon prior audits or reviews, computer analyses of the type described earlier or other known statistical data. It is important that these estimates be as accurate as possible because (1) computed sample size affects the amount of time and resources that must be allocated for the sampling project; and (2) the reliability of audit results derived from sampling will be adversely affected if the error rate derived from examination of the sample records turns out to be significantly greater than the estimated error rate used in determining the sample size (see examples below).

Precision factor and confidence level define the accuracy and reliability of audit results and can be established in advance according to the goals of the audit. The precision factor refers to the estimation range (for example, plus or minus x percentage points) within which audit statistics derived from examination of sample cases can be assumed to reflect the quality of the population of cases sampled. A common example of a precision factor in a different context is when the media report polling results showing that candidate A is preferred by 30 percent of voters, plus or minus three percentage points. Confidence level refers to the certainty with which the audit results derived from examination of the sample cases can be attributed (within the chosen precision factor) to the entire population of cases from which the sample was drawn. For example, a chosen confidence level of 95 percent together with a chosen precision factor of plus or minus three percentage points would mean that the audit results derived from the sample cases could, with 95 percent confidence, be attributed to the sampled population within a range of plus or minus three percentage points. In other words, an error rate of eight percent computed from the sample would mean that it is 95 percent certain that the error rate in the sampled population is between five and 11 percent.

Excerpts from statistical sampling tables designed for determining sample sizes are set out in Figures 1 through 3. These tables show how sample size is affected by the population size, the estimated error rate, and the chosen values for confidence level and precision factor. Confidence level and precision factor have the greatest impact on sample size. For example, Figure 1 shows that for a database (population) of 100,000 records of the type being audited, with an estimated error rate of 5 percent or less, the size of a random sample needed to yield audit results with a confidence level of 95 percent and a precision factor of plus or minus three percentage points would be 202 records. Increasing the desired accuracy by narrowing the precision factor to plus or minus two percentage points would more than double the necessary sample size, to 454 records. Figure 2 shows that for the same example of 100,000 records, increasing the desired confidence level from 95 to 99 percent would increase the sample size needed for a precision factor of plus or minus three percentage points to 349 records, and for plus or minus two percentage points to 782 records.
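The tabulated values can be closely approximated with the standard attribute-sampling computation: n0 = z^2 * p * (1 - p) / d^2, reduced by the finite population correction n = n0 / (1 + (n0 - 1) / N), where p is the estimated error rate, d the precision factor and N the population size. The minimal Python sketch below is an illustration only; the z-values, the rounding rule and the function name are assumptions rather than part of the Guide, but the results agree with the Figure 1 through 3 examples cited above.

    import math

    # Two-tailed z-values for common confidence levels (normal approximation).
    Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

    def sample_size(population, error_rate, confidence, precision):
        """Approximate required sample size for attribute sampling."""
        z = Z[confidence]
        n0 = (z ** 2) * error_rate * (1 - error_rate) / precision ** 2
        n = n0 / (1 + (n0 - 1) / population)    # finite population correction
        return round(n)                          # tables appear to round this way

    # Examples cited in the text above:
    print(sample_size(100_000, 0.05, 0.95, 0.03))   # 202 (Figure 1)
    print(sample_size(100_000, 0.05, 0.95, 0.02))   # 454 (Figure 1)
    print(sample_size(100_000, 0.05, 0.99, 0.03))   # 349 (Figure 2)
    print(sample_size(100_000, 0.10, 0.95, 0.03))   # 383 (Figure 3)

The published tables remain the authoritative source; at small population sizes the normal approximation can differ from exact tabulations by a record or two.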

Surprisingly, compared to confidence level and precision factor, the size of the population of records being sampled has only a slight impact on sample size. In the original example cited in Figure 1 (estimated error rate of five percent or less, confidence level of 95 percent and precision factor of plus or minus three percentage points), increasing the population size from 100,000 records to 500,000 records would increase the sample size by only one record (see Figure 1).

As stated earlier, however, the estimated error rate in the population of records from which the sample is to be selected does have a significant impact on sample size. Figure 3 is an excerpt from a table designed for determining sample sizes for a 95 percent confidence level and an estimated error rate of 10 percent. The number of records that must be drawn from a population of 100,000 records for a precision factor of plus or minus three percentage points is 383, almost twice the number needed if the estimated error rate were five percent, as in the earlier example. For this reason, audit results derived from random samples selected pursuant to the procedures described here are accurate to the indicated degrees of confidence level and precision factor only if the actual error rate encountered when validating the sample records is equal to or less than the estimated error rate used in determining the sample size. If the actual error rate in the sample records proves to be substantially larger than the estimated error rate, the reliability of the audit results will be significantly affected because the sample will have turned out to be too small. In such a case, more records will need to be added to the sample or, if this is not practical, the stated reliability of the audit results must be revised. Figure 4 is an excerpt from a table that can be used in such cases to “appraise” the audit results; that is, to compute the actual precision factor for the audit results based upon population size, chosen confidence level and the actual error rate. The table shows that for the original example cited above (population of 100,000 records, confidence level of 95 percent and desired precision factor of plus or minus three percentage points), if the actual error rate in the sample records turns out to be 15 percent instead of the five percent estimate used in determining the sample size, the actual precision factor of the audit results would be roughly plus or minus five percentage points (10.4 percent to 20.7 percent). This underlines the importance of estimating error rates as accurately as possible. If the sample turns out to be too large, unnecessary work will have been done; if the sample proves to be too small, the audit results will not be as accurate and reliable as expected.
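The appraisal reflected in Figure 4 can likewise be approximated. The sketch below assumes the symmetric normal approximation with a finite population correction; the published table's limits are asymmetric because they derive from the exact binomial distribution, so the two will not match exactly.

    import math

    def achieved_precision(population, sample, observed_rate, z=1.960):
        """Approximate achieved precision (half-width of the confidence
        interval) for an observed sample error rate, using the normal
        approximation with a finite population correction."""
        fpc = math.sqrt((population - sample) / (population - 1))
        return z * math.sqrt(observed_rate * (1 - observed_rate) / sample) * fpc

    # The appraisal example above: a 202-record sample, drawn for an estimated
    # 5 percent error rate, turns out to have a 15 percent observed error rate.
    hw = achieved_precision(100_000, 202, 0.15)
    print(f"+/- {hw * 100:.1f} percentage points")   # roughly +/- 4.9

For the example above, the computed half-width is roughly 4.9 percentage points, consistent with the plus or minus five points read from Figure 4.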

In some cases, an auditor may have to decide whether to use a one-tailed or a two-tailed sampling test. A two-tailed test can identify significant differences in either direction from a designated standard, whereas a one-tailed test will identify differences in only one direction. For example, if repository officials are interested in determining whether the repository’s accuracy level for a particular type of information is significantly better or worse than some standard, a two-tailed test would be appropriate. On the other hand, if the officials are only concerned with whether they are doing worse than the standard, a one-tailed test would be appropriate. The advantage of using a one-tailed test is that, all else being equal, it does not require as many sample cases in order to obtain comparable precision factors and confidence levels.

When an audit is conducted to determine whether some sort of standard is being met (for example, there may be a requirement that 95 percent of all records be complete), it is critical that the auditor consider the “power” of the statistical approach utilized. Statistical power refers to how likely it is that a given test will identify deviations from a standard. The greater the power of the test, the more likely it is to identify deviations from the standard. Power is affected by several factors, but a key factor is the number of cases sampled: the greater the number of cases sampled, the greater the power of a test. In order to calculate the power of a test, an auditor must determine how large a deviation from the standard has to be in order to be considered important, because the detection of large deviations is obviously much easier than the detection of slight deviations from a standard. The more subtle the effect to be detected, the greater the number of cases that will be required.13 (Because of their complexity, sample tables for determining sample sizes for different power levels are not included in this Guide.)

13 For a more detailed discussion regarding statistical power issues, see Helena C. Kraemer and Sue Thiemann, How Many Subjects? Statistical Power Analysis in Research (New York: Sage Publications, 1987).
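To illustrate how power grows with the number of cases sampled, the following sketch approximates the power of a one-tailed test that completeness has fallen below a 95 percent standard when the true completeness rate is 90 percent. The standard, the true rate, the significance level and the use of the normal approximation are all illustrative assumptions, not values drawn from the Guide or from Kraemer and Thiemann.

    import math

    def norm_cdf(x):
        # Standard normal cumulative distribution via the error function.
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    def power_one_tailed(p0, p1, n, alpha_z=1.645):
        """Approximate power of a one-tailed test that a completeness rate
        has fallen below a standard p0, when the true rate is p1 (< p0).
        alpha_z = 1.645 corresponds to a 5 percent significance level."""
        se0 = math.sqrt(p0 * (1 - p0) / n)   # standard error under the standard
        se1 = math.sqrt(p1 * (1 - p1) / n)   # standard error under the true rate
        critical = p0 - alpha_z * se0        # reject if sample rate falls below this
        return norm_cdf((critical - p1) / se1)

    # Power rises with sample size (roughly 0.68, 0.88 and 0.99 here).
    for n in (100, 200, 500):
        print(n, round(power_one_tailed(0.95, 0.90, n), 2))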

Page 22: Criminal Justice Information Policy · assessed by comparing sample record entries drawn at random from the criminal history database with incoming reporting documents or computer

Audit Guide for Assessing Completeness and Accuracy of Criminal History Record Systems 17

Pop.                    Sample Size for Precision of:
Size        ±.5%     ±1%   ±1.5%     ±2%   ±2.5%     ±3%     ±4%

  3,000             1135     639     396     267     190     110
  3,100             1149     643     398     267     190     110
  3,300             1175     652     401     269     191     110
  3,500             1200     659     404     270     192     110
  3,700             1222     666     406     271     192     111
  3,900             1243     672     409     272     193     111

  4,000             1253     675     410     273     193     111
  4,500             1299     688     414     275     194     111
  4,700             1315     692     416     275     194     111
  5,000             1337     698     418     276     195     112
  5,500             1370     707     421     278     196     112
  6,000             1400     715     424     279     196     112
  6,500             1425     722     426     280     197     112

  7,000             1448     727     428     281     197     112
  7,500     3700    1468     732     430     282     197     112
  8,000     3817    1486     737     432     282     198     112
  8,500     3932    1503     741     433     283     198     113
  9,000     4031    1517     744     434     283     198     113
  9,500     4128    1531     748     435     284     199     113

 10,000     4220    1543     751     436     284     199     113
 10,500     4306    1555     753     437     285     199     113
 11,500     4465    1575     758     439     285     199     113
 13,000     4675    1600     764     441     286     200     113
 14,500     4856    1621     769     442     287     200     113
 15,000     4851    1627     770     443     287     200     113
 16,500     5061    1643     774     443     287     200     113

 19,000     5274    1665     778     446     288     201     113
 20,000     5348    1672     780     446     288     201     113
 22,000     5482    1685     783     447     289     201     113
 24,000     5595    1696     785     448     289     201     114
 26,000     5699    1705     787     448     289     201     114
 28,000     5790    1713     789     449     289     201     114

Figure 1
Statistical Sampling Table Excerpt14

(Sample sizes for sampling attributes for random samples only. Expected rate of occurrence not over 5% or expected rate of occurrence not less than 95%. Confidence level 95% (two-tailed); 97.5% (one-tailed).)

14 Excerpted from Herbert Arkin, Handbook of Sampling for Auditing and Accounting, 3d ed. (New York: McGraw-Hill Book Co., 1984), pp. 334-335.


Pop.                    Sample Size for Precision of:
Size        ±.5%     ±1%   ±1.5%     ±2%   ±2.5%     ±3%     ±4%

 30,000     5871    1720     790     449     290     201     114
 32,000     5944    1727     791     450     290     202     114
 34,000     6010    1732     793     450     290     202     114
 36,000     6069    1737     794     451     290     202     114
 38,000     6123    1741     795     451     290     202     114
 40,000     6173    1745     795     451     290     202     114
 45,000     6282    1754     797     452     291     202     114

 50,000     6370    1761     799     452     291     202     114
 60,000     6508    1771     801     453     291     202     114
 70,000     6610    1779     802     453     291     202     114
 80,000     6689    1784     803     454     291     202     114
 90,000     6752    1789     804     454     291     202     114

100,000     6803    1792     805     454     292     202     114
150,000     6961    1803     807     455     292     203     114

200,000     7043    1809     808     455     292     203     114
250,000     7092    1812     809     455     292     203     114
300,000     7126    1814     809     456     292     203     114
400,000     7169    1817     810     456     292     203     114
500,000     7196    1818     810     456     292     203     114

Figure 1 (Continued)


Pop.                    Sample Size for Precision of:
Size        ±.5%     ±1%   ±1.5%     ±2%   ±2.5%     ±3%     ±4%

 12,000             2496    1254     740     484     340     194
 12,500     6274    2517    1259     741     485     341     194
 13,000     6398    2537    1264     743     486     341     194
 14,000     6631    2573    1273     746     487     342     194
 15,000     6847    2605    1281     749     488     342     194

 16,000     7050    2633    1288     751     489     343     195
 17,000     7236    2659    1294     753     490     343     195
 18,000     7411    2682    1299     755     491     344     195
 19,000     7575    2704    1304     757     491     344     195
 20,000     7730    2721    1309     758     492     344     195

 21,000     7874    2741    1313     760     493     344     195
 23,000     8141    2772    1320     762     494     345     195
 25,000     8377    2799    1326     764     494     345     195
 27,000     8592    2823    1331     766     495     346     196
 29,000     8783    2843    1336     767     496     346     196

 30,000     8873    2851    1338     768     496     346     196
 32,000     9041    2869    1342     769     497     346     196
 35,000     9264    2892    1346     771     497     347     196
 38,000     9462    2909    1351     772     498     347     196
 40,000     9581    2920    1353     773     498     347     196
 43,000     9745    2937    1356     774     498     347     196

 45,000     9843    2944    1358     774     499     348     196
 48,000     9980    2958    1361     775     499     348     196
 50,000    10063    2963    1362     776     499     348     196
 55,000    10250    2981    1366     777     500     348     196
 65,000    10552    3006    1371     779     501     348     196
 80,000    10884    3033    1376     780     501     349     197

100,000    11189    3056    1381     782     502     349     197
130,000    11487    3077    1385     783     502     349     197
150,000    11623    3085    1387     784     503     349     197
180,000    11775    3098    1389     785     503     350     197

200,000    11852    3101    1391     785     503     350     197
250,000    11994    3111    1393     785     503     350     197
375,000    12189    3124    1395     786     504     350     197
500,000    12289    3132    1396     787     504     350     197

Figure 2
Statistical Sampling Table Excerpt15

(Sample sizes for sampling attributes for random samples only. Expected rate of occurrence not over 5% or expected rate of occurrence not less than 95%. Confidence level 99% (two-tailed); 99.5% (one-tailed).)

15 Ibid., p. 356.


Pop.                    Sample Size for Precision of:
Size        ±.5%     ±1%   ±1.5%     ±2%   ±2.5%     ±3%     ±4%

  6,000             2194    1224     756     507     361     209
  6,500             2257    1243     763     510     363     209
  7,000             2314    1261     769     513     364     210
  7,500             2367    1276     775     516     365     210
  8,000             2414    1290     780     518     367     210
  8,500             2453    1302     785     520     368     211

  9,000             2493    1313     789     522     368     211
  9,500             2535    1323     792     523     369     211

 10,000             2569    1332     796     525     370     212
 10,500             2601    1341     799     526     371     212
 11,000             2631    1349     801     527     371     212
 11,500             2658    1356     804     528     372     212

 12,000             2684    1363     806     529     372     212
 12,500             2708    1369     808     530     373     212
 13,000             2731    1375     810     531     373     213
 13,500             2752    1380     812     532     374     213
 14,000     6957    2773    1385     814     533     374     213
 14,500     7079    2792    1390     816     533     375     213

 15,000     7196    2810    1394     817     534     375     213
 16,000     7418    2843    1402     820     535     375     213
 17,000     7626    2873    1410     822     536     376     213
 18,000     7821    2900    1416     825     537     376     214
 19,000     8004    2925    1422     827     538     377     214

 20,000     8176    2948    1427     828     539     377     214
 22,000     8491    2988    1437     832     540     378     214
 24,000     8774    3022    1445     834     541     378     214
 26,000     9028    3055    1451     836     542     379     214
 28,000     9257    3077    1457     838     543     379     214

Figure 3
Statistical Sampling Table Excerpt16

(Sample sizes for sampling attributes for random samples only. Expected rate of occurrence not over 10% or expected rate of occurrence not less than 90%. Confidence level 95% (two-tailed); 97.5% (one-tailed).)

16 Ibid., p. 337.


Pop.                    Sample Size for Precision of:
Size        ±.5%     ±1%   ±1.5%     ±2%   ±2.5%     ±3%     ±4%

 30,000     9465    3100    1462     840     544     379     215
 32,000     9658    3120    1467     842     544     380     215
 35,000     9913    3146    1472     843     545     380     215
 38,000    10140    3169    1477     845     546     380     215
 40,000    10277    3182    1480     846     546     380     215
 42,000    10405    3194    1483     847     546     381     215

 45,000    10579    3210    1486     848     547     381     215
 50,000    10834    3234    1491     850     548     381     215
 55,000    11050    3253    1495     851     548     381     215
 65,000    11404    3282    1502     853     549     382     215
 75,000    11677    3305    1506     854     550     382     215
 90,000    11989    3329    1511     856     550     382     216

100,000    12150    3242    1514     857     551     383     216
110,000    12287    3352    1516     858     551     383     216
125,000    12453    3365    1518     859     551     383     216
140,000    12587    3374    1520     859     552     383     216
150,000    12663    3379    1522     859     552     383     216

200,000    12936    3398    1525     861     552     383     216
275,000    13167    3414    1529     862     553     384     216
350,000    13303    3423    1530     862     553     384     216
425,000    13340    3430    1532     863     553     384     216
500,000    13459    3433    1532     863     553     384     216

Figure 3 (Continued)


                                      For Sample Size of:
Field        60            100            200            300            500           1,000          2,000
Size     Lower Upper   Lower Upper   Lower Upper   Lower Upper   Lower Upper   Lower Upper   Lower Upper
is:      Limit Limit   Limit Limit   Limit Limit   Limit Limit   Limit Limit   Limit Limit   Limit Limit

    200   6.4% 24.7%   10.5% 21.0%
    300   7.9  25.4     9.8  22.0
    400   7.7  25.7     9.5  22.4    11.7% 19.0%
    500   7.6  25.9     9.3  22.6    11.4  19.4

  1,000   7.3  26.2     9.0  23.1    10.9  20.1    11.8% 18.8%   12.9% 17.4%
  1,500   7.3  26.4     8.9  23.2    10.7  20.3    11.6  19.1    12.5  17.8
  2,000   7.2  26.4     8.8  23.3    10.6  20.4    11.5  19.2    12.4  18.0    13.5% 16.7%
  2,500   7.2  26.4     8.8  23.4    10.6  20.5    11.4  19.3    12.3  18.1    13.3  16.8
  3,000   7.2  26.5     8.8  23.4    10.5  20.5    11.4  19.3    12.2  18.1    13.2  16.9
  3,500   7.2  26.5     8.7  23.4    10.5  20.6    11.3  19.4    12.2  18.2    13.2  17.0
  4,000   7.2  26.5     8.7  23.4    10.5  20.6    11.3  19.4    12.2  18.2    13.1  17.1    13.9% 16.2%
  4,500   7.2  26.5     8.7  23.4    10.5  20.6    11.3  19.4    12.2  18.2    13.1  17.1    13.9  16.2
  5,000   7.2  26.5     8.7  23.4    10.5  20.6    11.3  19.4    12.1  18.3    13.1  17.1    13.8  16.3
  6,000   7.1  26.5     8.7  23.5    10.4  20.6    11.3  19.4    12.1  18.3    13.0  17.2    13.7  16.3
  7,000   7.1  26.5     8.7  23.5    10.4  20.6    11.2  19.5    12.1  18.3    13.0  17.2    13.7  16.4
  8,000   7.1  26.5     8.7  23.5    10.4  20.7    11.2  19.5    12.1  18.3    13.0  17.2    13.7  16.4
  9,000   7.1  26.5     8.7  23.5    10.4  20.7    11.2  19.5    12.1  18.3    13.0  17.2    13.6  16.5

 10,000   7.1  26.5     8.7  23.5    10.4  20.7    11.2  19.5    12.1  18.4    13.0  17.3    13.6  16.5
 15,000   7.1  26.6     8.7  23.5    10.4  20.7    11.2  19.5    12.0  18.4    12.9  17.3    13.6  16.5
 20,000   7.1  26.6     8.7  23.5    10.4  20.7    11.2  19.5    12.0  18.4    12.9  17.3    13.5  16.5
 25,000   7.1  26.6     8.7  23.5    10.4  20.7    11.2  19.5    12.0  18.4    12.9  17.3    13.5  16.6
 50,000   7.1  26.6     8.7  23.5    10.4  20.7    11.2  19.5    12.0  18.4    12.9  17.4    13.5  16.6

100,000   7.1  26.6     8.7  23.5    10.4  20.7    11.2  19.5    12.0  18.4    12.9  17.4    13.5  16.6

Figure 4
Statistical Sampling Table Excerpt17

(Sample reliability for relative frequencies for random samples only. Rate of occurrence in sample is 15%. Confidence level is 95%.)

17 Excerpted from Henry P. Hill and others, Sampling in Auditing (Huntington, N.Y.: Robert E. Krieger Publishing Co., 1979).


C. Finalizing the Audit Methodology

• Goal

Some of the decisions to make and tasks to accomplish in developing an audit methodology are discussed in earlier sections of the Guide. This section discusses other tasks that are necessary in finalizing the audit approach, structuring the tools for conducting the audit, and gathering and evaluating the information that will support the audit findings and recommendations. These tasks include (1) developing data collection forms for record validations; (2) developing audit questionnaires; (3) creating an audit database; and (4) preparing an audit manual.

• Procedures

— Develop Data Collection Forms

Data collection forms used in auditing sample records should be structured to provide spaces for the auditor to record the results of record validations in such a way as to simplify the entry of audit results into an audit database for analysis and report generation. Additionally, if criminal history records are to be corrected on the basis of the audit, the data collection forms should provide spaces for recording corrected information or, at a minimum, spaces for indicating which data elements need to be corrected. A suggested approach is to structure data collection forms, as well as data analysis and audit findings, on the basis of the reportable events targeted by the audit, that is, the agency processing steps or decisions in the criminal justice process that are required to be reported to the repository. As indicated earlier, a list of reportable events can be compiled from applicable reporting laws and regulations. Such a list might include the following (or, for particular audits, only some of the following):

• Arrest information

• Release by police without charging

• Initial court appearance

• Bail information

• Pretrial and post-trial detention

• Misdemeanor court disposition information

• Felony court disposition information

• Appellate court information

• Release pending appeal

• Probation information

• Confinement information

• Parole information

The data collection form should include spaces for recording audit results for all of the principal processing steps or decisions required to be reported to the repository, from arrest or other case initiation through final release from confinement or supervision. Other events, such as executive clemency or court-ordered changes in sentences, may be infrequent enough to be handled on an ad hoc basis and may not merit inclusion on the data collection form.

When the list of reportable events to be targeted by the audit has been finalized, data collection forms can be structured to provide spaces for indicating an audit finding concerning the completeness and accuracy of each reportable data element within each reportable event and an overall audit finding for the reportable event. This will enable the auditor to record a finding as to whether the event was reported fully and accurately and, if not, to indicate which data elements were inaccurate or missing. This simplifies entry of audit results into an audit database and provides an easy and useful way to aggregate audit results and display them in the audit report.

A data collection form used to validate sample records in an audit of a State central criminal history database is included as Appendix I. Since it was used in a comprehensive audit performed on a complete-case basis, the multiple-page form encompasses all types of case information required to be reported to the repository by the State’s criminal history record law. For ease of reference, pages 1 and 2 of the form appear as Figure 5. On these pages, spaces are provided for the auditor to record notations or codes representing audit findings for reportable events (arrest, court data, etc.) and for data elements within reportable events. The codes are listed at the bottom of the pages. A notation of “C/A” indicates that an item of information on a sample record was found, by comparison with agency source documents, to be complete and accurate. “M” indicates that the data element was missing from the sample record. “E” indicates that the information was found to be inaccurate in some respect, and “INC” means that it was incompletely reported. “N/A” means not applicable, and “NSD” means that no agency source document could be located to validate a particular data element or reportable event. Finally, “LE” means “linkage error,” indicating that a particular item of information was reported to the repository but did not appear on the sample criminal history record because of a tracking number error or other error or omission.

As noted, the form includes spaces for validating the completeness and accuracy of individual data elements, as well as of reportable events. For example, if all of the information required to be reported for a particular reportable event were found to be completely and accurately shown on the sample record, the auditor would record “C/A” for each included data element and “C/A” for the reportable event. If, on the other hand, the auditor found that a court failed to report particular charge information but accurately reported all other required information, he would mark the charge information “M” and would mark the reportable event (“court data”) “INC.”
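Because the findings are recorded as a small set of codes at the data element level, they lend themselves to simple machine tallying. The following minimal sketch, with hypothetical case and element names, shows one way findings of this kind could be aggregated by reportable event for an audit report; it is an illustration, not the form or database actually used in the audit described.

    from collections import Counter

    # Audit finding codes as described above.
    CODES = {"C/A", "M", "E", "INC", "N/A", "NSD", "LE"}

    # Each validated item is recorded as (sample case, reportable event,
    # data element, finding) -- a flat layout that is easy to aggregate.
    findings = [
        ("case-001", "arrest",     "date of arrest", "C/A"),
        ("case-001", "court data", "charge",         "M"),
        ("case-001", "court data", "sentence",       "C/A"),
        ("case-002", "arrest",     "date of arrest", "E"),
    ]

    # Tally findings per reportable event, as a report table would display them.
    tally = Counter((event, finding) for _, event, _, finding in findings)
    for (event, finding), count in sorted(tally.items()):
        print(f"{event:12} {finding:4} {count}")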


Electronic Editor’s Note: Figure 5, the Sample Data Collection Form Excerpt (pp. 24-25), is not available electronically.


In the State in which this audit was conducted, some agencies report information to the repository by means of local computer systems. For this reason, the data collection form includes spaces for indicating whether particular errors or omissions were caused by these systems. The form also includes spaces for recording agency case numbers, agency identification numbers and offender identification numbers. Although these numbers are not always shown on criminal history records, they often are entered into criminal history databases and are useful in searching the databases, in locating agency case files and in determining whether particular information not shown on sample records was reported but not successfully linked. Finally, the form includes spaces for recording the dates on which reportable events occurred. These dates can be used to determine whether the events were reported in a timely fashion.

As mentioned earlier, the audit in which the form in Appendix I was used was conducted on a jurisdictional or complete-case basis; thus, this multiple-page data collection form provides spaces for validating all of the information about particular sample cases reported (or not reported) by all of the criminal justice agencies involved in processing the cases. In addition, the list of reportable data elements conforms to the format of the criminal history record, since this particular audit approach involved (1) validating all of the information shown on or missing from the sample criminal history records; and (2) making determinations as to whether errors or omissions resulted from reporting deficiencies at the local agency level or from data entry errors at the repository.

Figure 6 sets out an example of a data collection form developed for use in an audit that differs significantly in approach from the one discussed above. This form was developed for use in a program of continuing audits of local criminal justice agencies conducted by a State central repository. The audit approach calls for most of the scheduled audits to be conducted on an individual agency basis. Thus, the data collection forms were structured separately for individual agency types: arresting agencies, State’s attorneys, courts and custodial agencies. Figure 6 is the form used for auditing arresting agencies. Since data entry procedures at this particular repository are audited yearly by an independent State agency, the audit approach for the local agency audit program provides for auditing the completeness and accuracy of information on fingerprint cards and disposition reporting forms submitted to the repository; that is, for auditing the incidence and accuracy of reporting by local agencies, but not the accuracy of data entry at the repository. Thus, the structure of the data collection forms follows the format of the reporting documents used by local agencies.

Finally, because this State intends to use the data collection forms for correcting erroneous records, the forms provide spaces both for indicating whether particular data elements were reported fully and accurately (the small blocks in the upper left corners of the data field blocks) and larger spaces for writing in corrected information as necessary (the titled data field blocks). Like the multiple-page form set out at Appendix I, this form uses code notations (set out at the bottom of the form) to indicate audit findings and provides spaces for recording audit findings at both the data element level and the reportable event level.

Other data collection forms developed for this local agency audit program are provided in Appendix II. These forms and the two discussed above are examples of forms developed for particular States for use in particular audit programs. While they are useful as examples, it should be stressed that data collection forms must be tailored to the criminal history record format, reporting procedures and criminal justice case processing procedures of particular jurisdictions, as well as to the audit methodology to be employed.

— Develop Audit Questionnaires

Questionnaires should be developed for use in conducting interviews with agency personnel during audit site visits to assess the adequacy of reporting procedures and other procedures that affect data quality. These questionnaires can ensure that audit interviews are conducted in an orderly fashion and that all relevant issues are addressed. Like the data collection forms, audit questionnaires should be structured to conform to the particular audit approach being used and to the requirements of applicable laws, regulations and reporting procedures. Questions should be included to determine what types of files the audited agency maintains, whether agency officials and personnel understand applicable legal requirements and their reporting duties, and what procedures the agency employs to ensure full and accurate reporting to the repository. The questionnaire forms also should include spaces for the auditor to indicate whether the agency is in compliance with legal requirements, and spaces for recording comments concerning agency procedures that are not in compliance with applicable requirements, including suggestions for remedying such deficiencies. These notations and comments can be used in preparing audit reports (see Part V).


Figure 6
CJIS Audit Data Collection Form
Arrest Information

(The form itself does not reproduce well in this electronic transcript; only its field labels are recoverable. The form provides blocks for the agency audited, audit case number, auditor, date, subject name, DCN and PCN; subject identification information (name, birthdate, AKA/alias and DOB, sex, race, place of birth, hair color, skin tone, height, weight, eyes, photo taken, scars/marks/tattoos, miscellaneous number, Social Security number, driver’s license number and State, agency’s offender ID number, agency’s case number, and date printed); charge and disposition information for up to four charges, with column labels including statute citation, CSA class, offense description, type of issue, warrant county, warrant case number and court disposition; other information (date of arrest, date of offense, county of prosecution, juvenile prosecuted as adult by law or court order, post-sentence fingerprints, inquire-only, whether the prints were signed by the officer and by the subject, whether the officer ID was noted, agency ORI, agency name, and comments); and spaces for entering corrected information and for recording audit findings at the data element, reportable event and summary levels.

Audit Findings: A = Accurate and Complete; E = Erroneous; I = Incomplete; M = Missing; NA = Not Applicable; NSD = No Source Document)


Appendix III sets out an audit questionnaire developed for use in the local agency audit program referred to above. Like the data collection forms discussed earlier, this questionnaire reflects specific legal requirements and reporting procedures in the State in which it is used. In addition, it includes questions on areas of record management other than data quality, including dissemination, security, personnel training and record subject access/review. The form does, however, illustrate the types of questions that should be included concerning agency file maintenance and reporting procedures. It also shows how a questionnaire form can be structured so as to facilitate the preparation of audit reports. All of the questions on the form that have “Yes/No” blocks in the left margin are based upon specific legal requirements in the State. This means that any “No” block that is marked by the auditor indicates some respect in which the audited agency is not in compliance with a reporting or record handling requirement based upon Federal or State law. Each such question has a space for comments in which the auditor can set out specific information as to the reasons for the finding of noncompliance and can suggest ways in which compliance can be achieved. This information can later be incorporated into the audit report as described in Section V of this Guide. The questionnaire also includes spaces for recording identifying information about the audited agency, the names and titles of agency officials and records personnel, and the names and telephone numbers of persons to be contacted for follow-up questions concerning the audit.

— Create an Audit Database

For particular audits, the number of sample records involved and the scope of the review may be such that analysis of audit data and preparation of tables, charts and graphs for inclusion in the audit report or reports can be accomplished manually. At best, however, this is a tedious process, and if the audit is extensive or part of a series of continuing audits, manual analysis of audit data may be impossible. For this reason, arrangements should be made to store audit data in a computer, if possible. Spreadsheet and database management software packages are available that can greatly simplify the entry and storage of sample case information and can make such information easily retrievable for follow-up audit tasks and for analysis and report generation purposes. If the audit is part of a series of continuing local agency audits, computer storage of audit data facilitates the aggregation of such data over a period of time for management purposes, as well as for generating periodic reports.

In addition to information about sample cases and audit results, the auditor can store miscellaneous agency identification numbers and case processing numbers that can simplify the production of lists of key numbers for use in locating case files at audited agencies. The auditor can use these numbers to pull files during the audit or provide them to the agency in advance so that case files can be pulled prior to the audit and made conveniently available to the auditor. These numbers can also be useful if follow-up questions arise that need to be referred back to audited agencies for response.

Audit data should be entered in the computer database in such a way as to provide easy retrieval of information reflecting the results of record validation reviews. This can be facilitated by using notations or codes such as those discussed earlier — C/A, M, E, INC, LE, etc. In addition, if necessary for report generation or management purposes, additional symbols or codes can be used to facilitate the entry and analysis of information about specific types of errors or omissions. For example, if the audit reveals a pattern of occurrence of particular types of errors in reporting particular data elements, such as using an erroneous disposition code or sentence code in reporting court disposition information, the auditor might devise a special error code or symbol, such as “E1,” to store this information. Use of such error codes makes it possible for these particular types of errors to be aggregated and reflected in the tables and narrative of the audit report.
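
A brief sketch of how such codes can be tallied once stored, again assuming the hypothetical sample_case table sketched earlier:

    import sqlite3

    conn = sqlite3.connect("audit.db")

    # Tally overall validation findings for the report tables.
    for finding, count in conn.execute(
        "SELECT finding, COUNT(*) FROM sample_case GROUP BY finding ORDER BY finding"
    ):
        print(f"{finding}: {count}")

    # Tally special error codes (such as 'E1') so that recurring error
    # patterns can be reflected in the narrative of the audit report.
    for code, count in conn.execute(
        "SELECT error_code, COUNT(*) FROM sample_case "
        "WHERE error_code IS NOT NULL GROUP BY error_code ORDER BY error_code"
    ):
        print(f"error code {code}: {count}")
    conn.close()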

Using a computer to store audit data also makes it easy to enter the results recorded on audit questionnaires concerning agency compliance with particular legal requirements, such as reporting deadlines. This simplifies the generation of individual audit reports and, for ongoing local agency audit programs, the generation of summary reports based on particular legal requirements, particular types of agencies, particular geographic areas or other such bases for aggregating and analyzing audit results. Such cumulative data can be a useful management tool for monitoring local agency compliance with particular legal requirements. In addition, although the record validation component of individual agency audits may involve too few records to be statistically significant, record validation results cumulated over a period of time can provide a useful and reliable assessment of data quality.

— Prepare an Audit Manual

Once the audit methodology has been finalized, including the structuring of data collection forms and questionnaires, it is recommended that an audit manual be prepared describing the audit approach in detail (including methods for selecting sample cases) and setting out specific instructions for such tasks as completing the data collection forms, administering the audit questionnaires and entering audit results in the audit database. The manual should also summarize the pre-audit procedures set out in part D of this section and the procedures set out in part E for conducting on-site audits. As experience is gained in conducting audits, the manual can be revised and augmented as necessary. In particular, instructions can be added concerning the types of source documents found to be most appropriate for validating various types of sample case information. Sample copies of such source documents can be included as an appendix to the manual to assist other auditors in locating them.

A manual of this type will be especially helpful in conducting a continuing local agency audit program, particularly if it is contemplated that additional or replacement auditors will be engaged as the audit program progresses.

D. Completing Pre-Audit Planning

• Goal

This section of the Guide describes other tasks that must be completed prior to beginning on-site audits of selected agencies. It assumes that an audit methodology (including data collection forms, audit questionnaires and sample record selection procedures) has been finalized and that some or all of the agencies to be audited have been selected. The tasks set out here are applicable to site visits undertaken as part of a comprehensive audit of the repository database, as well as to multiple audits undertaken as part of an ongoing local agency audit program. These tasks include (1) developing an audit schedule; (2) contacting agencies to be audited; and (3) preparing audit folders.

• Procedures

— Develop an Audit Schedule

A preliminary audit schedule should be developed, based upon considerations outlined in preceding sections of the Guide and upon such additional factors as the number of records to be validated, the estimated time needed to review and, if necessary, copy source documents, the probable ease or difficulty of locating and pulling case files, the availability of auditors, necessary travel time and known scheduling conflicts. The schedule should reflect the number of audits to be conducted during the scheduled period and should include proposed dates and times for the audits, or at least for those planned for the first few weeks or months. Planning as far ahead as possible permits auditors to schedule their time for conducting audits and for other such activities as report writing, and also gives the audited agencies more advance notice so that they can ensure the availability of agency personnel and make other necessary arrangements to facilitate the audit.

It should be assumed that the preliminary schedule will require revision after initial contact with agencies selected for audit and that the schedule will require continued monitoring and possible revision due to scheduling conflicts involving auditors or agency officials or due to other unforeseen problems. It is advisable to schedule some flexibility into the projected timetable and to have alternate agencies in mind in the event that audits for scheduled agencies need to be rescheduled.

— Contact Agencies to be Audited

INITIAL CONTACTS. As part of pre-audit planning, efforts should be made to enlist the endorsement and cooperation of high-ranking State and local criminal justice officials who can help to obtain the cooperation of the agencies to be audited. Such officials as the Attorney General, the Commissioner of Public Safety, the Commissioner of Corrections, and the Chief Justice of the State’s highest court or the State Court Administrator can help pave the way for cooperation by agency officials in their respective departments of government, resulting in savings in audit time and costs. Most criminal justice officials have extremely heavy workloads and many of them may be reluctant to find time to participate in an audit. In addition, they may be skeptical about the reasons for the audit and uneasy about the outcome and thus not inclined to cooperate. It can be helpful to have a high-ranking official in their department or branch contact them to explain the reasons for the audit and to urge them to cooperate.

Officials of agencies selected for audit should be contacted by the auditors, by telephone or mail, well in advance of the proposed audit dates to confirm the feasibility of the dates and to briefly explain the scope of the audit and the nature of the assistance the agency will be asked to provide. In addition, the auditor should raise any necessary questions about such things as the agency’s recordkeeping practices, accessibility of files and the availability of photocopy machines.

AUDIT NOTICE LETTER. Once a scheduled date is agreed upon, the agency should be sent a formal audit notice letter confirming the date and advising in some detail of the legal authority for the audits; the scope of the planned review of agency activities and records; and applicable legal reporting and recordkeeping requirements. The letter also should advise the agency of any other audit requirements, such as the need to interview particular agency officials, the necessity for working space for the auditors and the necessity for making arrangements to obtain copies of source documents and other agency files.

An audit notice letter developed for use in the local agency audit program referred to earlier is provided in Appendix IV. The letter includes a pre-audit questionnaire form that agencies are asked to complete and return to the auditors. This form requests information concerning the size of the population served by the agency, case processing volumes, the types of files and source documents maintained, how such files are organized and numbered, and whether the agency has written policies governing record preparation and handling and reporting to the repository. Information provided in response to questionnaires such as these can help the auditor make final plans for the audit and perform any additional pre-audit work that may be indicated.

For example, the pre-audit questionnaire response may indicate that some of the reporting forms submitted to the repository (arrest fingerprint cards, for example) are essentially original source documents; that is, that the information on them is original rather than obtained from a previously completed document. This may indicate that the auditor will not need to plan for validation of these types of sample records. As another example, the questionnaire response may confirm that a sheriff’s office is both an arresting agency and a pre-trial or post-trial detention facility, necessitating a broader audit than originally envisioned. Information such as this can help the auditor ensure that enough time is scheduled for the site visit and that appropriate sample records, data collection forms and audit questionnaires are taken along.

Of necessity, the questions appropriate for inclusion in such pre-audit questionnaires will vary depending upon the type of agency to be audited. It is probable that a separate form questionnaire will need to be developed for each type of agency included within the scope of the audit — arresting agencies, courts, prosecutors, etc.

— Prepare Audit Folders

When all sample records (criminal history transcripts, fingerprint cards, disposition reporting forms, etc.) and other relevant documents for a particular agency have been selected and obtained, audit folders for the agency should be prepared.

CASE FOLDERS. A numbered folder should be prepared for each sample record to be validated during the site visit and a copy of all case documents, together with a data collection form, should be placed in the folder. If, during the audit, copies of agency source documents are obtained for record correction or audit documentation purposes, these copies should be placed in the sample case folders also.

AGENCY FOLDERS. A larger agency folder should be prepared to store all of the documents relating to the audit of a particular agency. This folder should contain all of the sample case folders, copies of the audit notice letter and other correspondence, appropriate audit questionnaires, copies of agency policy statements and other materials that may be necessary or useful during the audit, such as copies of applicable laws and regulations and reporting form instruction sheets or code tables. The folder should also contain any other relevant information about the agency available to the auditor. For example, if the repository logs the date of receipt of reported information, it should be possible to determine prior to the site visit whether all of the sample reportable events were submitted in a timely manner or perhaps to obtain other information about timeliness of reporting and overall data quality levels derived from the repository’s systematic audit program. Any available information of this type should be included in the agency audit folder for use in interviewing agency personnel during the audit.

E. Conducting the Audits

• Goal

This section discusses the major tasks involved in performing on-site data quality audits. They are (1) conducting an entry interview; (2) touring the agency’s recordkeeping operations; (3) validating sample records; (4) locating appropriate files; (5) reviewing “extra-agency” papers; (6) obtaining copies of audit documents; (7) selecting sample cases for “reverse audit”; and (8) conducting an exit interview.

• Procedures

— Conduct an Entry Interview

At the start of the on-site audit, the auditor should schedule a conference with agency officials and record personnel to explain the legal basis, purpose, scope and approach of the audit. If some of this information already has been provided in the audit notice letter, only a brief summary should be necessary. The auditor should also review requirements for such things as work space, a tour of agency recordkeeping operations, assistance in locating files and assistance in obtaining copies of documents. Arrangements should be made for obtaining copies of booking sheets, docket books or whatever documents are necessary to identify a specified number of reportable events for reverse auditing as a further check on reporting, if this is to be done (see “Select Cases for Reverse Audit” below). If arrangements have not been made in advance for pulling sample case files, the sample case numbers and other identifying information should be provided to agency personnel at this point to enable them to begin locating and pulling the files, if this is necessary to speed the audit process.

At the entry conference, the auditor should summarize the data quality and reporting laws and regulations applicable to the agency and should review any specific reporting requirements that may not be entirely clear to agency personnel. Any information the auditor has obtained in advance about the agency’s reporting performance or the quality of information reported by the agency should be reviewed and any data quality problems already identified by the auditor should be discussed. If the necessary agency personnel are present at the conference, this may be the appropriate time to complete the audit questionnaire.

It is likely that the auditor will need to explain to at least some agency personnel the precise reporting requirements applicable to them and the reasons for requiring the reporting of particular items of information in particular ways. For example, arresting agencies may not be fully aware of their duty to fingerprint persons already in custody who are charged in new and separate cases or they may not understand the critical importance of assigning and reporting case tracking numbers. Prosecutors may not understand the importance of reporting cases they decline to prosecute or reporting information about charge modifications. Court personnel may not understand the importance of reporting a disposition and sentence, if any, for each charge, the necessity of using designated disposition codes and sentence codes, or the importance of obtaining the fingerprints of convicted persons who have not already been fingerprinted. For these reasons, the auditor may find that, to agency personnel, the audit is as much instructional as it is an assessment of their performance. Care should be taken to have sufficient copies of applicable laws and regulations, instruction forms and code tables available in case agency personnel do not have current copies.

Finally, the auditor should explain that the agency will be afforded an opportunity to comment on audit findings before they are finalized, if this is to be the case. The auditor should also explain how the report will be utilized and distributed, such as whether it will be made available to the public, the State legislature or the Governor.

— Tour the Agency’s Record Operations

At the conclusion of the entry interview, the auditor should request a guided tour of the agency’s record creation and storage areas, and areas where computer terminals and other equipment are located. During this tour, the auditor should confirm information provided during the entry interview and in response to the audit questionnaire and should resolve any remaining questions about the agency’s procedures for recording and storing criminal history record information and reporting to the repository. It may be appropriate to observe record personnel at work or to question some of them about their understanding of their duties. The auditor may wish to review sample documents being processed at the time of the audit and note the dates of occurrence of reportable events to determine whether there are data entry backlogs or reporting delays. Finally, this tour of agency procedures and examination of agency files will enable the auditor to confirm whether validation of sample information is necessary and to determine which files and papers are the appropriate original source documents for record validation purposes.

— Validate Sample Records

If it is appropriate to validate sample records by comparison with agency source documents, the auditor should make arrangements to have the sample case files located and brought to a convenient work area (if this has not already been done), or to have direct access to the necessary files (if this is more feasible). It is usually helpful to have agency personnel available during the record comparison process to answer questions that may arise and to help locate and interpret source documents or entries.

— Locate Appropriate Files

The purpose of the record validation process is to determine from official agency source documents what actually happened at a particular processing stage in a sample case and to determine whether the processing agency fully and accurately reported the required information about the event to the repository. In some audits, an additional purpose may be to determine whether the repository itself accurately entered the information and recorded it on the appropriate criminal history record.

For record validation purposes, the auditor will want to locate the most official and reliable record of the event maintained by the agency. Usually, this will be the first record created, that is, the “original source” record of the event. These records may be separate pieces of paper, such as court orders, documents completed by court clerks in the courtroom, fingerprint cards or indictment forms; or they may be ledgers, such as arrest booking sheets or court docket books. They may be partly both. The auditor will need to determine which records, files or ledgers are the most reliable recordings of the sampled reportable events.

As noted earlier, the auditor may find that some of the information reported to the repository is reported in essentially original form and need not be verified by reference to any other record. For example, sometimes the subject identification information, arrest event information and charge information on an arrest fingerprint card is original information — that is, it is initially recorded directly on the fingerprint card rather than copied from some other document. In some agencies, however, some or all such information may be taken from other, earlier records, such as arrest booking sheets or incident reports. The auditor will need to understand exactly how case processing and record creation takes place in the agency to determine whether validation is appropriate and, if so, to determine which documents are the most reliable records for validation purposes.

— Review Extra-agency Papers

In addition to providing a means of validating information reported by the audited agency, case files maintained by the agency may include copies of records created by other agencies involved in processing the sample cases. If the audit is being conducted on a jurisdiction or complete-case basis (that is, if entire cases are being audited in all of the agencies involved in processing the cases), these extra-agency copies can help the auditor compile complete lists of all of the processing steps that occurred in particular cases and can even be used to validate reported information about some of these other reportable events. For example, case files maintained by trial courts often contain copies of source documents forwarded by other agencies, such as police arrest reports and statements of charges, arraignment documents, bail orders, pretrial commitment or release orders and charging documents. If a particular case resulted in a conviction and a sentence of imprisonment, the trial court case jacket may contain copies of correctional reception or release documents, probation or parole documents, or at least notations indicating whether these events occurred. If the case was appealed, a copy of the notice of appeal and possibly a notation of the outcome of the appeal should be in the trial court’s case file. If the case began by indictment and summons rather than by arrest, this should be evident from the court file. Use of such documents as these can save time by making it unnecessary to spend as much time validating records in other agencies. For this reason, it is usually a good idea in complete-case audits to begin the audit at the trial court of the audited jurisdiction and then to audit other agencies as necessary.

— Obtain Copies of Audit Documents

At a minimum, the auditor should obtain copies of all source documents relied upon to conclude that reported information is missing, erroneous or incomplete. These copies can be used to clear up questions that may arise later and can be the basis for correcting repository records. If feasible, it is a good idea to obtain copies of all source documents used in the audit, even if the audited information is found to be accurate and complete. Questions may arise later concerning exactly what happened in particular cases and source document copies may provide the answer and preclude the necessity for follow-up telephone calls or even follow-up site visits.

— Select Cases for Reverse Audit

If the audit includes this additional activity, arrangements should be made to select a number of recent reportable events from the agency’s files as a “reverse audit” of compliance with reporting requirements. Appropriate reviews can be undertaken later to determine whether all of the events were reported to the repository in a timely manner. For example, as part of an audit of a police agency, a manageable number of recent fingerprintable arrests can be selected at random from the agency’s booking sheets or chronological arrest logs and enough information about the arrests can be recorded to determine later whether or not they were reported to the repository. In addition, it may be possible to identify a few cases in which arrested persons were released without being charged (after the arrests were reported to the repository) and to determine, at the agency or at the repository, whether these events were reported. If the agency maintains a chronological file, such as an arrest log, that contains enough information to enable the auditor to identify appropriate reportable events and to make inquiries at the repository to determine whether or not the events were reported, it may be sufficient to obtain copies of several pages of the log or docket. If this is not possible, it will be necessary to copy the relevant case information from the agency’s files.
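
The random selection itself can be done with a table of random numbers or, as in this illustrative sketch, with a short program; the booking-log entries shown are invented:

    import random

    # Hypothetical booking-log entries: (booking number, arrest date, subject name).
    booking_log = [
        ("B-1041", "1991-11-01", "DOE, JOHN"),
        ("B-1042", "1991-11-01", "ROE, JANE"),
        ("B-1043", "1991-11-02", "SMITH, PAT"),
        ("B-1044", "1991-11-03", "JONES, LEE"),
    ]

    SAMPLE_SIZE = 2  # the "manageable number" chosen by the auditor

    # Record enough identifying information about each sampled arrest to
    # determine later, at the repository, whether it was in fact reported.
    for booking_no, arrest_date, name in random.sample(booking_log, SAMPLE_SIZE):
        print(booking_no, arrest_date, name)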

The auditor might also explore the feasibility of making arrangements to obtain copies of logs, dockets or case processing lists or totals on a continuing basis as an ongoing check of agency compliance with reporting requirements and to provide a means of systematic monitoring of the completeness of the repository database. As discussed in an earlier section of the Guide, this method of systematic auditing can be a valuable management tool and can provide an accurate means of gauging data quality.

— Conduct an Exit Interview

When the audit is completed, the auditor should conduct an exit interview with agency officials to review the audit and to advise them of any respects in which it is apparent that the agency is or is not in compliance with legal requirements. The auditor can provide a preliminary review of the results of the record validation process and can point out areas of apparent deficiency and suggest ways to remedy them. Discussing preliminary audit findings in this way gives agency officials an early opportunity to take issue with audit findings, if they desire, or to ask questions about what can be done to be in compliance. This should reduce the likelihood of later questions or controversy about audit findings or recommendations and should facilitate review and final approval of the audit report.

If the audit has revealed serious deficiencies in agency procedures or misunderstandings concerning legal requirements that the auditor cannot resolve, it may be necessary to schedule on-site training sessions by repository field personnel or to make arrangements for agency personnel to attend training sessions at the repository or elsewhere. These arrangements can be discussed during the exit conference. Finally, arrangements should be completed for any necessary additional assistance by the agency, such as forwarding copies of source documents or providing information about cases that require follow-up investigation.


Part V

Preparing Audit Reports

• Goal

Upon completion of an audit, a report should be prepared setting forth audit results and the auditor’s conclusions and recommendations. The report usually should include (1) a description of the recordhandling and reporting procedures of the audited agency; (2) a presentation of audit findings; and (3) recommendations made by the auditors. This section of the Guide discusses these aspects of report preparation. It also suggests ways to simplify the preparation of audit reports in ongoing local agency audit programs that require the preparation of multiple reports.

• Procedures

— Describe Agency Procedures

The report should describe the procedures used by the audited agency to record case processing information, the types of criminal history record files it maintains, and the procedures it uses to report information to the repository. Recordhandling and reporting procedures should be described in enough detail to enable the report reader to understand the audit findings and the auditor’s recommendations. The level of detail necessary in a particular report will depend upon the type of audit undertaken, the auditor’s findings and the nature and scope of the recommendations. For example, if the audit results indicate that a particular agency is reporting completely and accurately in a high percentage of cases and the auditor has no recommendations for changes in policies or procedures, the description of agency procedures can be brief. If, however, audit results are poor for a particular agency, the auditor will need to describe and analyze the agency’s reporting procedures in some detail in order to explain the problems that caused poor reporting and the reasons for recommended procedure changes. If the audit is undertaken to assess data quality levels at the repository and includes site audits of selected reporting agencies, the audit report probably will need to describe the repository’s system configuration, databases, data entry procedures, systematic data quality maintenance procedures and system outputs in some detail and will also need to include descriptions of reporting agency procedures, as suggested above.

— Present Audit Findings

Audit findings should be presented both in narrative and graphic form. If audit information is stored in a computer in the manner suggested earlier in this Guide, it should be relatively easy to generate audit statistics and to display them in a variety of tables, charts and graphs so as to make the audit results easy to understand. Such graphic representations can display audit results by agency, by reportable event, by year or other timeframe or in other ways, depending on the nature of the audit and audit results. If, as suggested earlier, data collection forms are structured on the basis of reportable events (arrest data, bail data, trial court data, etc.) and audit results are organized and stored in that manner, it should be relatively simple to present and analyze audit results in this manner. Because each reportable event generally represents a separate decision or step in the criminal justice process that should be reported by the agency responsible for the action, the report can focus on these distinct types of information and the reporting procedures used by the agency, and can relate deficiencies and recommendations directly to audit results. As recommended earlier, data collection forms and audit data storage procedures also can be structured in a way that makes it possible to analyze and display audit results on the basis of particular data fields, particular types of information within reportable events (for example, subject identification, arrest charge, court charge and count information), or particular types of errors.
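
As an illustration, a cross-tabulation of findings by reportable event, of the kind that might appear as a table in the report, can be generated from the hypothetical audit database sketched earlier:

    import sqlite3
    from collections import defaultdict

    conn = sqlite3.connect("audit.db")

    # Count each finding code within each reportable event type.
    table = defaultdict(lambda: defaultdict(int))
    for event, finding in conn.execute("SELECT event_type, finding FROM sample_case"):
        table[event][finding] += 1
    conn.close()

    # Print a simple report table: one row per event, one column per finding.
    findings = ["A", "E", "I", "M", "NA", "NSD"]
    print("Reportable event".ljust(20), *findings)
    for event, counts in sorted(table.items()):
        print(event.ljust(20), *(counts.get(f, 0) for f in findings))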

In this regard, even though some errors found in the audit may appear to be trivial while others are more serious, it is recommended that all errors or omissions be noted in record validation audits and included in overall audit statistics. Although a particular error may seem immaterial in the context of the record on which it appears, it may well have resulted from lax data entry or reporting procedures that could cause serious errors. It may be appropriate, however, for the audit report to distinguish between serious and nonserious errors in some respects. For example, errors in subject identification information may be categorized on the basis of whether specific errors would cause name search failures, and errors in charge or disposition information may be categorized on the basis of whether they would cause a particular record to be substantively misinterpreted.
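
One illustrative way to record such a distinction is a severity table keyed to the data field and the type of error; the field names and rules below are invented examples, not requirements of this Guide:

    # Hypothetical severity rules: identification errors are "serious" if they
    # could cause a name-search failure; charge or disposition errors are
    # "serious" if they could cause the record to be substantively misread.
    SERIOUS = {
        ("subject_name", "misspelled"): True,
        ("subject_name", "wrong_middle_initial"): False,
        ("disposition_code", "wrong_code"): True,
        ("disposition_date", "off_by_one_day"): False,
    }

    def classify(field: str, error_type: str) -> str:
        """Return 'serious' or 'nonserious' for a validated error."""
        return "serious" if SERIOUS.get((field, error_type), False) else "nonserious"

    print(classify("subject_name", "misspelled"))          # serious
    print(classify("disposition_date", "off_by_one_day"))  # nonserious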

— Set Out Recommendations

Where significant deficiencies are documented by the audit, the audit report should set out recommendations for remedying the deficiencies. Such recommendations may include, among others, suggested changes in agency procedures, implementation of new procedures or acquisition of additional equipment or personnel. As indicated earlier, the recommendations should be related as directly as possible to specific audit results and an analysis of agency procedures. To assist the agency in formulating plans for implementing the audit recommendations, they should be ranked in order of importance. It is usually helpful to the reader if major findings and recommendations are set out in an executive summary or overview section of the report.

— Prepare Multiple Reports

If the audits are undertaken as part of a continuing program, such as a local agency audit program, preparing audit reports can become a tedious task that consumes a great deal of time that could be more constructively spent performing additional audits. Since audit offices are invariably understaffed, time saved in report preparation can be an important factor in the overall efficiency of the program. For this reason, it is recommended that the preparation of audit reports in continuing audit programs be planned so as to utilize standard language and forms to the extent possible.

For example, a variety of form cover letters and form paragraphs setting out stock findings and recommendations can be prepared for inclusion at appropriate places in audit reports. These form paragraphs can: (1) summarize applicable legal requirements, such as reporting time limits; (2) provide a standard way of presenting particular audit findings, with blank spaces to be completed as appropriate; and (3) set out various types of agency actions or procedures necessary to achieve compliance with specific requirements. As experience is gained in a particular audit program, these boiler-plate paragraphs, letters and forms can be refined and augmented to be applicable to virtually all audit situations encountered. It is not difficult to enter such standard language and forms into a computer and to establish procedures for retrieving particular materials as necessary to structure individual audit reports.
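
A minimal sketch of such stock language with blanks, using Python's string.Template; the paragraph text and the values filled in are invented:

    from string import Template

    # A hypothetical stock finding paragraph with blanks to be completed
    # from the results of each individual audit.
    stock = Template(
        "State law requires arrest fingerprint cards to be submitted within "
        "$deadline of arrest. Of the $total sampled arrests, $late were reported "
        "late. It is recommended that the agency $action."
    )

    paragraph = stock.substitute(
        deadline="24 hours",
        total=25,
        late=6,
        action="designate a records supervisor to review submissions daily",
    )
    print(paragraph)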

Another way of saving time in report preparation is to use audit questionnaires completed during the audit as part of the audit report. It was suggested earlier that such questionnaires be prepared and utilized and that they be structured to provide spaces for indicating ways in which an audited agency is or is not in compliance with particular legal requirements, as well as spaces for comments by the auditor. If care is taken to ensure that auditors’ comments entered on such questionnaires are legible, understandable and material, the questionnaires can be made available to the audited agency along with the report. In such cases, the audit report, generated as suggested above, can be keyed to the order of the questions on the questionnaire. Since the auditor will have discussed the audit questionnaire and other audit results with agency officials during the exit interview at the conclusion of the site visit, there should be no surprise conclusions or recommendations. Thus, the type of brief, direct report suggested here should suffice in all cases except perhaps in audits of agencies that have particularly serious or unusual deficiencies, in which case major portions of the audit reports may need to be written with original language, tailored specifically to those agencies and their problems.


Appendices

I Sample Audit Data Collection Form — Jurisdictional Basis

II Sample Audit Data Collection Forms — Local Agency Basis

State’s Attorney Disposition

Court Initiation and Disposition

Custodial Receipt/Status Change

III Sample Audit Questionnaire

IV Sample Audit Notice Letter


Electronic Editor’s Note: Appendix I, the Sample Audit Data Collection Form (pp. 37-43), is not available electronically.


Appendix II

Sample Audit Data Collection Forms — Local Agency Basis

• State’s Attorney Disposition
• Court Initiation and Disposition
• Custodial Receipt/Status Change


CJIS Audit Data Collection Form — State’s Attorney Disposition

Agency Audited

Audit Case No.        Auditor        Date

Subject Name

DCN        PCN

AGENCY IDENTIFICATION INFORMATION
Enter Corrected Information as Appropriate

Agency ORI        Agency Name        Form Signed: Yes / No

COMMENTS                                             Summary Audit Finding*

CHARGE AND DISPOSITION INFORMATION
Enter Corrected Information as Appropriate

Charge   Audit Finding*   Statute Citation   CSA Class   Offense Description   SA Disposition   Disposition Date (Mo. Day Year)
1
2
3

COMMENTS (for each charge)                           Summary Audit Finding*

REPORTABLE EVENT AUDIT FINDING*

*Audit Findings: A = Accurate and Complete; E = Erroneous; I = Incomplete; M = Missing; NA = Not Applicable; NSD = No Source Document


CJIS Audit Data Collection Form — Court Initiation and Disposition

Agency Audited        Date

Audit Case No.        Auditor

Subject Name

DCN        PCN

COURT INITIATION INFORMATION
Enter Corrected Information as Appropriate

Court ORI        Court Case Number        Court Case Number        Date Filed (Mo. Day Year)

Agency Name        Comments        Reportable Event Audit Finding*

DISPOSITION INFORMATION
Enter Corrected Information as Appropriate

Count   Audit Finding*   Offense   Statute Citation   CSA Class   Court Case Number   Disp. Code   Disposition Date (Mo. Day Year)
1
2
3
4

COMMENTS                                             Summary Audit Finding*

SENTENCE INFORMATION
Enter Corrected Information as Appropriate

Count   Audit Finding*   Sentence Code   Sentence Length (Yrs. Mos. Days Hours)   Fine Amount   Sentence Status Code   Sentence Date (Mo. Day Year)

COMMENTS                                             Summary Audit Finding*

OTHER INFORMATION
Enter Corrected Information as Appropriate

Court ORI        Court Name        Form Signed: Yes / No

COMMENTS                                             Summary Audit Finding*

REPORTABLE EVENT AUDIT FINDING*

*Audit Findings: A = Accurate and Complete; E = Erroneous; I = Incomplete; M = Missing; NA = Not Applicable; NSD = No Source Document


CJIS Audit Data Collection Form — Custodial Receipt/Status Change

Agency Audited

Audit Case No.        Auditor        Date

Subject Name

DCN        PCN

SUBJECT IDENTIFICATION INFORMATION
Enter Corrected Information as Appropriate

Subject Name (LAST, FIRST, MIDDLE)        Birthdate (Month Day Year)
Sex   Race   POB   Hair Color   Skin Tone   Height   Weight   Eyes   Photo Taken: Yes / No
Scars, Marks, Tattoos        Misc. Number        Social Security Number
Driver’s Lic. Number / State        Correctional Number
Court Case Number / County        Court Case Number / County

COMMENTS                                             Summary Audit Finding*

RECEIPT INFORMATION
Enter Corrected Information as Appropriate

Confining Inst. ORI        Agency Rec'd From ORI        Date Rec'd (Month Day Year)        Date Printed (Month Day Year)
Signed by Official: Yes / No        Officer ID No. Noted: Yes / No        Signed by Subject: Yes / No

COMMENTS                                             Summary Audit Finding*

STATUS CHANGE INFORMATION
Enter Corrected Information as Appropriate

Status Change Code        Status Change Date (Month Day Year)        Signed by Officer: Yes / No        Dated: Yes / No

COMMENTS                                             Summary Audit Finding*

REPORTABLE EVENT AUDIT FINDING*

*Audit Findings: A = Accurate and Complete; E = Erroneous; I = Incomplete; M = Missing; NA = Not Applicable; NSD = No Source Document


Appendix III

Sample Audit Questionnaire


Auditor

Date

AUDIT QUESTIONNAIRE FOR

COUNTY SHERIFF'S OFFICE

I. AGENCY INFORMATION

A. Agency Name

Address

Telephone

B. Agency Officials Interviewed:

Name Title

C. Records Personnel:

Name Title

D. Official(s) to contact for follow-up questions concerning the audit:


II. CHRI FILES MAINTAINED

1. Determine what files the agency maintains that contain criminal history record information (fingerprint files, arrest booking system, case jackets, incident reports, intelligence or investigative files, inmate files, etc.).

2. Are juvenile records maintained separately from adult records or flagged to distinguish them from adult records?

Yes   No   Comment

3. Is CHRI sealed or expunged upon receipt of a court order?

Yes   No   Comment

III. REPORTING TO ISP

1. Are arrest fingerprint cards prepared and submitted to ISP for:

Yes   No   All persons arrested for felonies or class A or B misdemeanors?
Yes   No   All persons already in custody against whom additional charges are filed in unrelated cases?
Yes   No   All persons ordered by a court to be fingerprinted after conviction for reportable offenses (and not previously fingerprinted)?
Yes   No   All persons who commit reportable crimes while incarcerated?
Yes   No   All minors under 17 years of age who are arrested or taken into custody for weapons offenses or forcible felonies as specified in Chapter 38, paragraph 206-5(a)?
Yes   No   All minors ordered to be tried as adults pursuant to Chapter 37, paragraph 805-4 et seq.?

Comment


2. Is the arrest fingerprint card the original source document for subject identification and arrest charge information, or is the information on the fingerprint card and arrest reporting form taken from other records?

❏ Fingerprint card is original source document
❏ Information is taken from other records

Comment

3. Are arrest fingerprint cards for all reportable offenses sent to ISP within 24 hours of the arrests?

Yes   No   Comment

4. Is there a procedure in effect for "felony review" by the State's Attorney prior to sending fingerprint cards to ISP?

Yes   No   Comment

5. Does the agency notify ISP and request an error correction when charges are not referred to the State's Attorney concerning an individual whose arrest fingerprint card has already been submitted to ISP?

Yes   No   Comment

6. Are procedures in place to ensure that copies 2 through 4 of the completed ISP arrest reporting packet are forwarded to the State's Attorney?

Yes   No   Comment

7. Does the agency submit a completed Custodial/Status Change Fingerprint Card to ISP within 30 days of the initial receipt of a subject for a sentence of imprisonment for a reportable offense?

Yes   No   Comment


8. Is the Custodial fingerprint card the original source document for offender identification information and receipt information, or is the information taken from other records?

❏ Fingerprint card is original source document
❏ Information is taken from other records

Comment

9. Does the agency submit to ISP a completed Status Change form (copies 2, 3 and 4 of the Custodial fingerprint card) within 30 days after any change in status pertaining to the original imprisonment sentence?

Yes   No   Comment

10. Does the agency notify ISP and request an error correction when it discovers errors in previously submitted Custodial/Status Change information?

Yes   No   Comment

11. Does the agency provide training for new officers in the taking of fingerprints and filling out the receipt and status change reporting forms?

Yes   No   Comment

12. Are completed forms reviewed by a supervisor prior to submission to ISP?

Yes   No   Comment


IV. DISSEMINATION

1. Does the agency have adequate procedures to determine that any agencies or persons (including noncriminal justice personnel such as reporters or other news media representatives) to whom CHRI received from ISP is disseminated are legally authorized?

Yes   No   Comment

2. Does the agency have procedures to ensure that "update inquiries" to ISP are made to obtain the most current information prior to any extra-agency dissemination of information received from ISP?

Yes   No   Comment

3. Does the agency maintain, for at least three years, logs of all extra-agency disseminations of CHRI received from ISP?

Yes   No   Comment

4. Do such logs include:

Yes   No   The identity of the requestor?
Yes   No   The authority of the requestor?
Yes   No   The purpose of the request?
Yes   No   The identity of the record subject?
Yes   No   The date of the dissemination?

Comment

5. Is the agency aware that a new user agreement with ISP must be executed if the agency head who signed the prior agreement is replaced?

Yes   No   Comment


V. SECURITY

1. Are all records or files that include CHRI physically located so that access can be controlled?

Yes   No   Comment

2. Are adequate procedures in place to ensure that only authorized persons can access CHRI or enter secured areas?

Yes   No   Comment

3. Are adequate procedures in place to ensure that personnel who have access to CHRI files or facilities can obtain only authorized data and perform only authorized functions?

Yes   No   Comment

4. Are all CHRI storage areas and facilities adequately protected by fire detection and suppression devices?

Yes   No   Comment

5. Are all computer terminals and other automated equipment that can access CHRI located in secure areas?

Yes   No   Comment

6. Are all computer terminals and printers attended during all hours when they are in use and locked or made inoperable during non-use or off-duty hours?

Yes   No   Comment

7. Does the agency have adequate procedures to provide for the destruction or secure storage of computer printout sheets that contain CHRI?

Yes   No   Comment


8. Does the agency employ adequate procedures (e.g., locks, passwords, ID codes) to ensure that only authorized persons may operate computer terminals that can access CHRI and that they may perform only authorized functions?

Yes   No   Comment

9. Does the agency conduct an adequate background investigation (including a criminal history check) of all persons authorized to access CHRI or to work with or around CHRI records and facilities (including janitorial and maintenance personnel)?

Yes   No   Comment

10. Does the agency have adequate procedures to ensure that non-fee applicant fingerprint cards are submitted to ISP only for criminal justice employment in the agency?

Yes   No   Comment

11. Does the agency have written agreements with any organizations that provide data processing support services under which the agency has management control of noncriminal justice personnel who have access to CHRI?

Yes   No   Comment

12. Does the agency have a security officer?

Yes   No   Comment


VI. PERSONNEL TRAINING

1. Are all appropriate personnel properly trained and supervised to ensure that they are familiar with legal requirements applicable to CHRI, such as dissemination limitations, reporting requirements, access and review procedures and security requirements?

Yes   No   Comment

2. Does the agency have a written Standard Operating Procedures (SOP) manual that includes a section on record handling responsibilities and security/confidentiality requirements?

Yes   No   Comment

3. Does the agency have sanctions for misuse of CHRI and for other violations of rules and limitations applicable to CHRI?

Yes   No   Comment

4. Has the agency experienced any incidents involving security violations or record misuse?

Yes   No   Comment


VII. RECORD SUBJECT ACCESS AND REVIEW

1. Is the agency familiar with the access and review process and the agency's legally-mandated role in the process?

Yes   No   Comment

2. Does the agency maintain adequate copies of the regulations governing access and review and the forms utilized in the access and review process?

Yes   No   Comment

3. Does the agency make copies of the access and review regulations and forms available upon request to persons being processed or previously processed through the criminal justice system?

Yes   No   Comment

4. Does the agency make access and review services available between the hours of 8:00 a.m. and 4:00 p.m. daily except weekends and holidays?

Yes   No   Comment

5. Is the fee charged by the agency in accordance with the regulations (related to the costs of processing reviews and not to exceed $10)?

Yes   No   Comment

6. Does the agency utilize fingerprint identification to establish the individual's positive identity?

Yes   No   Comment

7. Does the agency comply with the time requirements in the regulations applicable to the forwarding of forms to ISP and notification of the record subject of ISP responses?

Yes   No   Comment


8. Does the agency provide notice of corrected information to all agencies that have received inaccurate records as required by the regulations?

Yes   No   Comment

9. Does the agency, upon request, provide the record subject with a list of noncriminal justice agencies to which the record has been disseminated?

Yes   No   Comment

10. Does the agency keep adequate records of access and review cases to facilitate audit of compliance with the requirements of the regulations?

Yes   No   Comment


Appendix IV

Sample Audit Notice Letter


FORM LETTER

PRE-AUDIT NOTICE TO SHERIFF'S OFFICE

Dear ________________:

We have tentatively scheduled an audit of your department for (time) on (date). The purpose of this letter is to confirm the scheduled date and time and to advise you of the nature and scope of the audit. We also are enclosing a brief Pre-Audit Questionnaire that we would appreciate your completing and returning.

If the date and time noted above are not agreeable to you, will you please let us know? Also, if scheduling conflicts arise later, please advise us promptly so that we can reschedule the audit.

The audits we are performing of criminal justice agencies throughout the State are pursuant to (statutory citation). That Act makes all conviction information maintained by the State central repository (including related arrest, sentence and custodial information) available upon request to any member of the public for any purpose. The Act also imposes upon the State central repository a duty to maintain complete and accurate criminal history record information and establishes judicial remedies for the negligent dissemination of inaccurate or incomplete conviction information, including actions for civil damages against State or local governmental agencies.

The Act also requires the State central repository to conduct audits of State and local criminal justice agencies to ensure compliance with the Act and with the law requiring such agencies to report arrest, disposition and custodial information to us (statutory citation). We are also auditing compliance with the provisions of the Interagency Agreements signed by all criminal justice agencies that receive criminal history record information from us. Those agreements incorporate provisions of State law and Federal regulations governing criminal history records (28 C.F.R. Part 20), dealing with limits on re-dissemination, security requirements, and requirements concerning the maintenance of records to facilitate audits, including dissemination logs. Finally, we will audit compliance with Administrative Rules promulgated pursuant to (statutory citation) authorizing record subjects to review and correct criminal history record information concerning them maintained by the State central repository. These rules require arresting agencies and correctional institutions to provide record subjects with facilities for reviewing their records and to assist them in filing challenges if they desire to do so.

The auditors will utilize a questionnaire designed to determine whether your agency is in compliance with the requirements summarized above. For this purpose, they will need to talk with appropriate agency officials and records personnel. The auditors also will wish to tour the areas of your agency where fingerprint files or criminal history files are located or where computer terminals that can access State central repository information are located. They may wish to observe the agency’s procedures for completing arrest and custodial fingerprint cards and submitting them to us.

Finally, the auditors will wish to validate the accuracy and completeness of the information on a randomly selected sample of arrest fingerprint cards, custodial fingerprint cards and status change forms submitted to the State central repository by your agency in recent months and will select from your agency's files a sample of recent reportable events of these types to verify that fingerprint cards and status change notices were submitted to the State central repository as required by the reporting law. They will need access to appropriate agency files for these purposes and will need to obtain copies of some source documents, probably not exceeding 15 to 20 pages. We would appreciate your cooperation in providing access to the needed files; a desk, table or other work space for the auditors; and access to a photocopying machine or assignment of someone to make photocopies as necessary.

As we have previously advised you, we anticipate that the audit will take no more than three to three-and-one-half hours, and we do not believe that agency officials will necessarily need to be available during all of that period. We will strive to ensure that your agency's activities are interrupted to the least possible extent.

When the audit has been completed, we will prepare a written audit report setting out findings and any appropriate recommendations. The report will be submitted to you for your comments before it is prepared in final form. Final audit reports are required by the Act to be available to the public upon request, and to be provided to the Governor and the General Assembly.

We would appreciate your completing and returning the enclosed Pre-Audit Questionnaire within 10 days, if possible. If you have any questions about the audit, please contact this office.

Sincerely,


CRIMINAL JUSTICE INFORMATION SYSTEM
PRE-AUDIT QUESTIONNAIRE FOR

SHERIFF'S OFFICE

I. AGENCY INFORMATION

1. Please indicate:

a) The population of the jurisdiction served by the agency _______ .
b) The number of arrests per month for fingerprintable offenses _______ .
c) The number of offenders received per month to serve sentences of imprisonment _______ .

2. Please provide an organizational chart for the agency, identifying thedivisions and officials responsible for:

a) the taking of arrest and custodial fingerprints and submission of fingerprint cards and status change forms to the State central repository

b) security and confidentiality of fingerprint files and criminal history record files

c) employee training regarding record handling policies
d) use and security of telecommunications terminals

3. Please provide copies of any written agency policies concerning:

a) the taking of fingerprints and reporting of information to the State central repository

b) security of records
c) access to and dissemination of criminal history records

II. AGENCY FILES/REPORTING TO STATE CENTRAL REPOSITORY

1. Please identify and describe any files that contain criminal history record information received from the State central repository, indicating how they are organized and numbered.

2. Please describe the procedures for taking fingerprints and completing the Arrest Fingerprint Card and Custodial Fingerprint Card.


3. Please describe the procedures for completing and submitting status change forms to the State central repository.

4. Is the offender identification information and other information entered on the Arrest Fingerprint Card and/or Custodial Fingerprint Card taken from some other source records (such as arrest reports, incident/offense reports, inmate reception reports or court commitment papers), or are fingerprint cards filled out as "original" records utilizing information provided by the offender and the arresting or receiving officer?

5. Please identify the types of arrested and incarcerated persons (by offense type and offender type) whose fingerprint cards are submitted to the State central repository.

6. Please describe the procedures for submitting Arrest and Custodial Fingerprint Cards to the State central repository, including times of submission.

7. Does the agency need additional copies of the State central repository's Instructions for completing and submitting Arrest Fingerprint Cards and Custodial Fingerprint Cards?

Yes _______ No _______

III. USE AND DISSEMINATION

1. Please provide the number and location of telecommunications terminals in the agency.

2. Who has access to the terminals and for what purposes?


3. Does the agency make criminal history record information received from the [State central repository] available to any noncriminal justice persons or agencies?

Yes _______ No _______

If so, describe the procedure and indicate who has access and for what purposes.

4. Please describe the logs kept of extra-agency disseminations of criminal history record information received from the State central repository, and indicate how long they are kept.

IV. SUBJECT ACCESS/REVIEW

1. Are agency officials familiar with the Administrative Regulations permitting record subjects to obtain and review their criminal history records and challenge the accuracy and completeness of the records?

Yes _______ No _______

2. Are agency officials familiar with the role of arresting and correctional agencies in the review/challenge process?

Yes _______ No _______

3. Does the agency have sufficient copies of the regulations and applicable forms for subject access/challenge?

Yes _______ No _______