Building a Practical Data Quality Management Solution

ABSTRACT

This presentation describes the components of an approach to identifying and resolving the "bad data" problems that inevitably emerge as hospitals strive to implement data-driven patient care performance improvement initiatives. Incomplete or inaccurate data can compromise any attempt to motivate clinicians to modify patterns of care as higher levels of quality and efficiency are sought. Hospital leaders need to be certain that the measures they use to manage clinical quality improvement are based upon reliable data. Inaccurate data sent by hospitals to external agencies for benchmarking purposes can seriously compromise the validity of such efforts. Analyzing the business processes behind "data supply chains" provides a starting point for understanding how to identify and fix the root causes of these problems. The Cedars-Sinai team will share the practical approach they have developed to identifying and prioritizing data quality problems for resolution, and then applying the methods and tools of performance improvement to fix them. Presenters will discuss building a data quality culture, implementing a systematic approach to data quality incident reporting and resolution, developing the Clinical Statistical Information initiative to proactively approve data prior to external submission, and the evolution of a data quality dashboard for electronic health record implementation.

BIOGRAPHY

Bruce N. Davidson
Director of Resource and Outcomes Management
Cedars-Sinai Health System

Bruce N. Davidson, Ph.D., M.P.H. is Director of Resource and Outcomes Management for Cedars-Sinai Health System, a position he has held since 1996. He leads a department of 23 in the development and implementation of initiatives to promote cost-effective, high-quality medical care.
He is also an Adjunct Assistant Professor in the Health Services Department at the UCLA School of Public Health, teaching Quality Improvement and Informatics for the Executive Masters Program. Dr. Davidson has 30 years of hands-on experience in leading, supporting, and evaluating patient care process improvement initiatives, as well as the delivery of patient care services in both inpatient and outpatient settings. He has published in the areas of medical treatment effectiveness, decision-making in health care, and measurement for quality improvement, with a recent focus on information management. His Ph.D. in Health Services Research and his Master's in Public Health are from UCLA, and his Bachelor's is from MIT.
The Fifth MIT Information Quality Industry Symposium, July 13-15, 2011
Alein T. Chun
Manager of the Data Quality Management Unit
Cedars-Sinai Health System

Alein T. Chun, Ph.D., M.S.P.H. is the Manager of the Data Quality Management Unit (DQMU) at Cedars-Sinai Health System. He is responsible for the day-to-day operation of the enterprise DQM function. He and his staff of four manage an assortment of activities related to both internal reporting and the release of clinical and administrative data to outside organizations. Essential data quality control activities include creating standard operating procedures for managing high-priority data elements, solving critical data problems, validating key data and reports, and assuring the quality of data released to outside entities. The DQMU also acts as facilitator and change agent in business process improvement across the data supply chain business units. Dr. Chun holds a Ph.D. in Health Services and a Master's degree in Public Health, both from UCLA.

Ann Chu
Team Lead, Data Quality Management Unit
Cedars-Sinai Health System

Ann Chu, M.H.A. is the Team Lead of the Data Quality Management Unit (DQMU) at Cedars-Sinai Health System. She works together with her team of analysts and programmers on activities related to improving data quality in the organization. Such activities include managing and resolving data quality incidents, validating major database enhancements and fixes, validating key reports that are submitted to outside agencies, educating users on data quality and the correct use of data, and improving data and reports through process improvement initiatives. Ann has 10 years of experience in IT and data quality management. She holds a Master's degree in Health Administration from the University of Southern California.

Thai Lam
Senior Analyst, Data Quality Management Unit
Cedars-Sinai Medical Center

Thai Lam is currently a Senior Analyst in the Data Quality Management Unit at Cedars-Sinai Medical Center, where he has worked since 1998.
He has worked in both business and IT roles focusing on data quality management, business intelligence, reporting, and financial modeling.
Building a Practical Data Quality Management Solution for an Academic Medical Center
Bruce Davidson, PhD, MPH, Director
Alein Chun, PhD, MSPH, Manager
Ann Chu, MHA, Team Lead
Thai Lam, Senior Analyst
Data Quality Management Unit, Resource & Outcomes Management
July 13, 2011
© 2011
The Context
Cedars-Sinai Medical Center
Academic Medical Center/Health System
Largest Non-Profit Hospital in the Western US
958 Beds, 10,000 Employees, 2100 MDs
Basic Annual Statistics
— 57,000 inpatients— 565,000 outpatients— 82,000 ER visits— 7,000 deliveries
Context: Academic Medical Center Data Management Implications
Complex, data-intensive organization
Distributed oversight responsibilities
Transactional data systems populated as byproduct of patient care
Data managed as departmental resource rather than as enterprise resource
Resources Available to Carry Out Mission
MISSION
Patient careTeachingResearchCommunity Service
RESOURCES
PeopleMoneyEquipment
Data
Patient Care, Controlling Costs & Quality Improvement ALL depend on DATA
DATA QUALITY MANAGEMENT
Putting in place the management functions that will proactively assure the availability of data that meets user-defined standards of completeness, accuracy, timeliness, and accessibility
The Road to Data Quality
Adopt a customer/consumer perspective
• High quality data = data that is fit for use
Manage data as a product
• Understand the consumer's data needs
• Manage data as the product of a well-defined production process
Define Data Quality beyond accuracy
Adopt and adapt classical PI/QI principles
Our Road to Data Quality at Cedars-Sinai

1997 - DPG convened by CEO
1998 - DQMWG started by DPG with ROM as Chair; IQ Survey #1
1999 - DQM Objectives in Annual Plan; IQ Survey #2
2000 - DQM Objectives in Annual Plan; OSHPD Submission DQ Improvement Project
2001 - DQM Objectives in Annual Plan; DPG & DQMWG Charters renewed; IQ Survey #3
2002 - CEO designates ROM as "clearinghouse" to approve all clinical statistics reported out of CSMC
2003 - DPG initiates Data Warehouse Improvement Project (DWIP) following no P&Ls for 8 months
2004 - DPG funds DQMU in ROM, DWIP implemented; IQ Survey #4
2005 - DWIP Objectives in Annual Plan; DQMU staffed and DQM Program formalized; DQ Incident Reporting & Resolution framework established
2006 - DQM Objectives in Annual Plan & linked to executive incentive compensation; High Priority Data Element Certification Program initiated
2007 - Internal Audit initiates focus on minimizing risk due to defective data; DPG sets "Task Force" model to resolve specific DQ issues; IQ Survey #5
2008 - DPG & IA direct DQMU to lead Clinical Statistical Information initiative
2009-2011 - 438 DQ Incidents logged as of May 2011, of which 401 have been closed; CS-Link Implementation presents new challenges
Data Quality Management Business Model
Components of Data Quality Management Program
Data Governance — cross-functional committee make-up, structure, policy-setting & problem resolution functions (DPG & DQMWG)
Data Quality Management Unit — funding and staffing to support the work
Data Quality Management Unit Activities — putting out fires and proactively controlling data flow
Data Certification — standardization of high priority data elements, stewardship
Communications — committees, website, training and certification, surveys
Sponsor for Data Quality Initiative
Data Provider Group
Convened by CEO as governance group to resolve ongoing data discrepancies
Executive Sponsors: Senior Vice Presidents
Cross-functional membership includes Vice Presidents and Directors of all departments that provide data to the organization
Chair - Resource & Outcomes Management
Task Force for Data Quality Issues
Data Quality Management Working Group

Spun off as middle management task force
Surfaces data quality issues; recommends solutions; and raises them up for prioritization
Representation from data producers, data custodians, and data users
Chair - Resource & Outcomes Management
Staff Support for Data Quality Management
The Data Quality Management Unit

Dedicated staff working with all affected areas to orchestrate necessary solutions
Staffing Plan: Data Quality Manager, Team Lead, 2 Sr. Data Quality Analysts, and Database Specialist
Resource & Outcomes Management oversees the DQMU
Pieces of the Puzzle
Implementing Corporate Data Quality Management
Creating core functions
Creating governance structures
Creating communication channels
Putting Out Fires
Laying Data Quality Foundation
Expanding Core Functions
External Data Risk Mitigation Function— Regulate release of clinical statistical information— Conduct clinical data quality audit
Proactive Quality Assurance Function— Monitoring data in corporate data warehouse— Validating data corrections in data warehouse— Validating management reports— Performing user acceptance testing on new applications
Data Quality Incident Reporting
[Flowchart: Data users or management report a suspected data error → DQMU investigates → error confirmed → source department develops solution]
Data Quality Incident Reporting
Standard Operating Procedure
Aims to standardize approach to reporting, handling, and addressing data quality issues
12-step process: from incident investigation, reporting & resolution, to monitoring
400+ incidents since DQMU inception in 2005
Cross disciplines and departments
Multiple ways to report
Data Quality Incident Reporting
Monitoring
Started Nov 2010
More proactive approach to deal with recurring data quality issues
About 15 monitoring routines in place
Errors sent directly to source department on a regular basis for resolution
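A monitoring routine of this kind can be sketched as a rule that scans records and groups any violations by source department, so each department receives only its own errors for resolution. The sketch below is a minimal illustration; the specific rule, fields, and department names are hypothetical, not CSMC's actual routines:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Encounter:
    record_id: int
    admit_date: date
    discharge_date: date
    source_dept: str

def monitor_discharge_before_admit(encounters):
    """Hypothetical rule: a discharge date must not precede the admit date.

    Returns violating record ids grouped by source department, ready to be
    sent to that department for resolution."""
    violations = {}
    for enc in encounters:
        if enc.discharge_date < enc.admit_date:
            violations.setdefault(enc.source_dept, []).append(enc.record_id)
    return violations

sample = [
    Encounter(1, date(2011, 5, 1), date(2011, 5, 4), "Registration"),
    Encounter(2, date(2011, 5, 3), date(2011, 5, 2), "Registration"),  # violates the rule
]
print(monitor_discharge_before_admit(sample))  # {'Registration': [2]}
```

Running a set of such rules on a schedule against the data warehouse is one way to realize the "ongoing, rule-based" monitoring described above.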
Data Quality Incident Reporting
What We Track
Incident name/description
Data category
Reported by
Error source department, error type
Assigned to, status
Number of affected records
Table and fields affected
Plus more…
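The tracked attributes amount to a simple incident record; a sketch of such a structure follows (field names are paraphrased from the list above, and the example values are invented, not an actual CSMC incident):

```python
from dataclasses import dataclass, field

@dataclass
class DataQualityIncident:
    name: str                  # incident name/description
    data_category: str
    reported_by: str
    error_source_dept: str
    error_type: str
    assigned_to: str
    status: str                # e.g. "open" or "closed"
    affected_records: int
    tables_fields_affected: list = field(default_factory=list)

# Invented example for illustration only
inc = DataQualityIncident(
    name="Missing discharge disposition on inpatient records",
    data_category="Encounter",
    reported_by="Report writer",
    error_source_dept="Registration",
    error_type="Missing value",
    assigned_to="DQ Analyst",
    status="open",
    affected_records=12,
    tables_fields_affected=["ENCOUNTER.DISCH_DISPOSITION"],
)
print(inc.status)  # open
```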
Data Quality Incident Reporting
Metrics
Number of new incidents
Number of open incidents
Number of closed incidents
— By month
— By category
— By error source department
— New/recurring
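Each of these metrics is a straightforward aggregation over the incident log; a sketch with invented entries (the log tuples are illustrative, not the actual schema):

```python
from collections import Counter

# (month, category, source_dept, status) — invented log entries for illustration
incidents = [
    ("2011-03", "Coding", "HIM", "closed"),
    ("2011-03", "Interface", "IT", "open"),
    ("2011-04", "Coding", "HIM", "closed"),
]

# Number of new incidents by month
new_by_month = Counter(month for month, *_ in incidents)
# Number of closed incidents by category
closed_by_category = Counter(cat for _, cat, _, status in incidents if status == "closed")
# Number of currently open incidents
open_count = sum(1 for *_, status in incidents if status == "open")

print(new_by_month)        # Counter({'2011-03': 2, '2011-04': 1})
print(closed_by_category)  # Counter({'Coding': 2})
print(open_count)          # 1
```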
Data Quality Incident Reporting
Data Quality Incident Reporting
The Technical Details
Started with MS Access database in 2005, converted to Oracle in 2009
Kept MS Access front-end
About 10 tables
The DQMU Info Center
Purpose
Provide a platform to keep data users informed on data quality
Provide knowledge tools to report writers and management to help them become proficient in using the data
Allow users to conveniently report new data quality incidents
The DQMU Info Center
Information Products
Incidents: data quality issues reported by data users
Data Dictionary: information about data such as meaning, origin, usage, and format
Reporting Tips: hints for accurate report writing
Crosswalks: information on mapping of values between systems
The DQMU Info Center
Evolution: Before
Incidents - data quality issues already being tracked in a local database; new incidents published monthly via email
Data Dictionary and Crosswalks - available on intranet site
Report new incident - via email
The DQMU Info Center
Evolution: After
“One Stop Shop” - all information products available in one location
Web-based - accessible to anyone
Searchable
Ability to view incident status and findings
New component - Reporting Tips
Report New Incident - available online
New Incident Alert
The DQMU Info Center
Info Center Main Screen
The DQMU Info Center
Info Center Search Results - Incidents
The DQMU Info Center
Info Center Search Results – Reporting Tips
The DQMU Info Center
Info Center Search Results – Data Dictionary
The DQMU Info Center
Info Center Search Results – Crosswalk
The DQMU Info Center
Info Center Report New Incident
The DQMU Info Center
Info Center New Incident Alert
The DQMU Info Center
The DQMU Info Center
The Technical Details
Oracle Database: made use of the department’s existing database platform
ASP.Net: web framework used to build the site. Made use of the department’s existing web server
The Clinical Statistical Information (CSI) Initiative
Defining Clinical Statistical Information
Clinically-oriented data or statistics about specific clinically-defined patient populations or individual patient records containing clinical data, AND
The above data or statistics become available in the public domain or for use by an outside organization
This definition does not apply to the following: (a) billing data sent to payers; (b) research data under IRB; and (c) financial, operational, or administrative data that are sent externally without any Clinical Statistical Information.
The Clinical Statistical Information (CSI) Initiative
Policy Evolution

In 2002, a memorandum from the Medical Center's President and CEO specifically required all Medical Center departments to forward requests for data for public reporting of CSI to the Resource and Outcomes Management (ROM) Department for review and approval. In 2008, Internal Audit identified improvement opportunities in policies and procedures for accumulating and submitting CSI to outside organizations. ROM was charged to follow up with Medical Center departments that had not obtained proper approval for CSI submissions to external organizations.
The Clinical Statistical Information (CSI) Initiative
Policy Evolution (cont’d)
In 2008, a corporate policy and procedure was implemented requiring that all CSI released to outside organizations be systematically prepared and formally approved prior to release.
— The overarching purpose of this policy and procedure is to minimize any such risks, by minimizing the possibility that inaccurate or incomplete Clinical Statistical Information is released to outside organizations.
In 2010, ROM initiated a Clinical Data Quality Audit for key data elements.
— The data quality audit aims to ensure data integrity in the input, process, and output data prior to data submission to external organizations.
The Clinical Statistical Information (CSI) Initiative
Clinical Data Production Process
[Flow diagram: Interaction between Patient & Clinical Provider → Documented Clinical Data (paper or electronic medium) → Abstracted Data for External Use → Submitted Data. Roles: Provider (document), Abstractor (verify), Data Coordinator]
The Clinical Statistical Information (CSI) Initiative
Basic Evaluation of Business Process for Data Production
Create detailed business/data process flow diagram from initial data collection to final approved data for external release
Identify key requirements in the process:
— Data elements collected?
— Patient selection criteria, inclusions and exclusions?
— Primary data collector?
— Who produces the approved final data?
— When is the data collected, reviewed, edited and produced?
— How is the data collected? Paper, database, web-based, etc.
The Clinical Statistical Information (CSI) Initiative
CSI Clinical Data Quality Audit Scope
The Clinical Statistical Information (CSI) Initiative
The Clinical Statistical Information (CSI) Initiative
Data Quality Audit
Notify clinical data submission client of data quality audit
Develop data quality audit strategy and data control plan:
— Input testing: identify and document data input methods
— Process testing: track flow of specific data element and/or record from source through submission
— Output testing: validate representative sample of submitted data against source data
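Output testing, for example, reduces to comparing a random sample of submitted records back against their source values and reporting mismatches. The sketch below is a minimal illustration; the record layout, field names, and sample values are assumptions, not the actual audit tooling:

```python
import random

def output_test(submitted, source, fields, sample_size=50, seed=0):
    """Validate a random sample of submitted records against source data.

    submitted, source: dicts keyed by record id, each mapping to a dict of
    field values. Returns (record_id, field, submitted_value, source_value)
    tuples for every mismatch found in the sample."""
    rng = random.Random(seed)  # fixed seed so the audit sample is reproducible
    ids = list(submitted)
    sample = rng.sample(ids, min(sample_size, len(ids)))
    mismatches = []
    for rid in sample:
        for f in fields:
            sub_val = submitted[rid].get(f)
            src_val = source.get(rid, {}).get(f)
            if sub_val != src_val:
                mismatches.append((rid, f, sub_val, src_val))
    return mismatches

# Invented example: a trailing space in the submitted value is caught
submitted = {1: {"dx": "410.1"}, 2: {"dx": "428.0 "}}
source = {1: {"dx": "410.1"}, 2: {"dx": "428.0"}}
print(output_test(submitted, source, ["dx"]))  # [(2, 'dx', '428.0 ', '428.0')]
```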
The Clinical Statistical Information (CSI) Initiative
Data Quality Audit (cont’d)
Present plan to client to establish agreement
Execute testing plan with client
Compile test results
Report test results and significance of findings to client
Recommend, if necessary, opportunities for improvement to client
Follow up with client on improvement plan
Quality Assurance Activities
CSI submission: UHC clinical database data extracts
User Acceptance Testing: Physician profiling application
Rule-based data monitoring: Corporate Data Warehouse
Report validation: ROM external reporting
EHR validation: CS-Link Data Quality Dashboard
CS-Link Data Quality Dashboard Timeline
CS-Link Quality and Regulatory Data Committee
— Chartered June 2009
— Broad membership
CS-Link Data Quality Dashboard Working Group
— Initiated December 2010
— Membership includes representatives from CS-Link Clinical Documentation Build Team and CS Data Quality Management Team
Development Currently Underway
— Pilot Dashboard
— Scalable Approach
CS-Link Quality & Regulatory Data Committee
Committee Objectives
Identify data elements necessary for Medical Center functions that should be available in CS-Link
Provide a multi-disciplinary forum for review of CS-Link clinical content for purposes of Quality/Safety, resource management and Regulatory guidelines
Help ensure that CS-Link designed clinician documentation will support abstracting and coding
Ensure Quality and Core Measure reporting needs can be supported with CS-Link documentation tools
Ensure all licensing requirements are maintained with CS-Link
CS-Link Quality & Regulatory Data Committee
Key Activities
Inventory of currently reported and anticipated data related to clinical care needed for public reporting or internal quality management
Review CS-Link clinical content to ensure it supports the needs identified above
Note where changes/modifications are needed, provide feedback to the application team, and ensure changes are made
CS-Link Quality & Regulatory Data Committee
Key Activities (cont’d)
Provide a consistent data quality perspective for content review and oversight
Own the migration of current data abstracting related to quality, safety, regulatory and core measure reporting to CS-Link
Develop and update dashboard of measures to reflect progress
Charge for CS-Link Data Quality Dashboard Working Team
For each key Quality Council Dashboard Measure, develop a method for evaluating data quality at three points of the data lifecycle:
— ensuring input data quality
— ensuring internal logic data quality
— ensuring extract data quality
If all three are “green,” then leadership can be confident that the measure is an accurate representation of performance
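The decision rule above is a simple conjunction across the three checkpoints; a sketch follows (the "green"/"yellow" status strings are an assumed convention, not the dashboard's actual values):

```python
def measure_is_trustworthy(input_status, internal_logic_status, extract_status):
    """Leadership can rely on a measure only when data quality is 'green'
    at the input, internal logic, and extract stages alike."""
    return all(
        status == "green"
        for status in (input_status, internal_logic_status, extract_status)
    )

print(measure_is_trustworthy("green", "green", "green"))   # True
print(measure_is_trustworthy("green", "yellow", "green"))  # False
```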
CS-Link Data Quality Dashboard: Progress
Focus of Pilot Determined
— VTE Prophylaxis for ICU patients that are part of the VAP Bundle
— Will be used to develop a standardized process for evaluating data quality of other key Quality Council measures derived from CS-Link.
The build team had completed, but not released, a new build for VAP Bundle, including VTE Prophylaxis, due to perceived data quality problems with initial build released with IP2.
Process for Evaluating Input Data Quality
Identify data elements needed to operationalize VAP ICU VTE Prophylaxis measure
Develop Data Acquisition Workflow to document how required data elements are input into CS-Link
Ensure new build will cover “gaps” by comparing:
— Original CareVue data flow (believed to be correct)
— Current CS-Link build (believed to be problematic)
— Redesigned CS-Link build (believed to be correct)
Develop and test reports to allow ongoing assurance of continuing data input integrity
Process for Evaluating Internal Logic Data Quality
Evaluate how each data element needed to operationalize VAP ICU VTE Prophylaxis measure:
— flows from point of entry into CS-Link through the various internal CS-Link environments
— until it reaches the CS-Link Clarity data base (from which it will be extracted for the Quality Council Dashboard)
Ensuring integrity by highlighting any decision-points, programmed transformations, or calculations implemented during this process
Develop and test reports to allow ongoing assurance of continuing internal logic data integrity
Process for Evaluating Extract Data Quality
Evaluate how each data element needed to operationalize VAP ICU VTE Prophylaxis measure:
— is extracted from CS-Link Clarity data base for use by the Business Objects team to construct the measure in the Quality Council Dashboard
Ensuring integrity by highlighting any decision-points, programmed transformations, or calculations implemented during this process
Develop and test reports to allow ongoing assurance of continuing data extract integrity
CS-Link Data Quality Dashboard: Process Flowchart
Data Validation & Risk Management
Prepared by: Data Quality Management Unit, Resource & Outcomes Management, 4/20/11
Level / Approach | Method | Risk Management Effectiveness* | Data Access Requirement | Currently Applied
1. Sample-based | Spot check few cases | Low | Application, front-end | Yes
2. Population-based | Aggregated, ad hoc query | Medium | Database, back-end | No
3. Proactive monitoring | Rule-based, ongoing | High | Continuous; Database, back-end | No
Special case: Report validation | Report specs verification & content validation | High | Report specs; Database, back-end | No (specs & spot check only)

*Effectiveness is based on low to zero risk tolerance for error.

Prepared by: Data Quality Management Unit, Resource & Outcomes Management, 4/26/11
Concluding Thoughts on EHR Validation
We believe we have a good working model
We have established communication between the build team and the data quality management team
There are limitations to the degree that this process can mitigate risks
We will be challenged to implement this process for multiple measures simultaneously
LESSONS LEARNED
Simplify the message and repeat it constantly
Usable data = Meaningful data (the user has the final say)
Start with small projects and try to build on existing efforts to get early wins
DQM work is inherently collaborative: governance, working groups, and problem resolution must include sufficient cross-functional representation
Create a sustainable program, not just a short-lived project, by establishing and communicating standard routines
Longstanding data problems usually result from divergent business objectives among data producing departments; the technical solution is usually the smaller hurdle.