13ehrchallenges Report


8/12/2019

A Study of the Impact of Meaningful Use Clinical Quality Measures

Floyd Eisenberg, MD, MPH, FACP (iParsimony LLC)

    Caterina Lasome, PhD, MBA, RN, CPHIMS, LTC(Ret), USA (iON Informatics)

    Aneel Advani, MD, MPH (everis USA)

Rute Martins, MS, Certified Lean-Six Sigma Greenbelt (The Joint Commission)

    Patricia A. Craig, MS MIS (The Joint Commission)

    Sharon Sprenger, RHIA, CPHQ, MPA (The Joint Commission)


The findings of this report reflect the views of the site visit participants and not necessarily the views of the authors or their respective organizations.


Executive Summary
Background on MU eCQM Measures and Program
Methodology
    Site selection
    Data collection
Description of the Sites
    Commitment to health IT as infrastructure to enable change
    A culture of empowerment and continuous improvement
    Strong leadership driving health IT-enabled quality improvement
Common eCQM Implementation Process Used Across Sites
    Gap analysis
    Data capture and workflow redesign
    Data extraction and eCQM calculation
    Validation
Findings
    Program design challenges
    Technology challenges
    Clinical challenges
    Strategic challenges
    Summary table of findings
Takeaways and Recommendations
Conclusion
References



    Executive Summary

In adopting electronic health records (EHRs) to improve clinical care and patient health outcomes, American hospitals have invested tremendous financial and human resources. Hospitals have undertaken this task in the belief that the technology would support the development of automated clinical quality reporting and assist their local quality improvement initiatives. Based on the experience of the four hospitals in this case study, the current approach to automated quality reporting does not yet deliver on the promise of feasibility, validity and reliability of measures or the reduction in reporting burden placed on hospitals.

Measuring the ability to provide high-quality health care provides hospitals with a baseline from which they can improve their performance.

However, the accelerating volume of hospital quality measures, and the linkage of quality measurement to payment, underscores the need for quality measure reporting solutions that balance the value of quality measurement and reporting and the burden associated with multiple reporting requirements. HITECH authorized the creation of the Medicare EHR Incentive Program to advance the use of automation in health care, including the use of EHRs to report quality measures. EHR-based automated quality measure reporting has the potential, if implemented correctly, to ease the burden of quality reporting, while increasing access to real-time information to support quality improvement in patient care. To do so, the quality measure results generated from EHR data must be based upon information that is feasible to collect in an automated fashion, generate valid and reliable results, and thereby demonstrate a benefit that outweighs the costs.

The American Hospital Association (AHA) commissioned iParsimony to conduct a study to investigate hospital experiences with implementation of the Medicare Electronic Health Record (EHR) Incentive Program's Meaningful Use (MU) Stage 1 electronic clinical quality measures (eCQMs). This study describes the experience with and impact of eCQM implementation in four hospitals, large and small, urban and non-metropolitan, each of which has significant experience with EHRs and uses a different EHR from a different vendor company.

The findings described in these case studies are based on interviews conducted with key leaders and operational staff directly involved in the oversight and management of eCQMs. The MU Stage 1 program for Eligible Hospitals and Critical Access Hospitals (CAHs) included 15 eCQMs that reflect care for patients with stroke or venous thromboembolism (VTE), or blood clots, as well as those visiting the emergency department (ED). Across the 15 measures, more than 180 individual data elements are needed.

    CASE STUDY SITES

Each of the organizations visited was well situated for success in eCQM adoption. Each showed a strong commitment to health information technology (IT) as a means to enhance its ability to provide safe, effective, efficient, timely, patient-centered and equitable care. Each was externally recognized by national programs for its level of EHR adoption, with significant efforts preceding the MU program by five to 10 years. All exhibited a culture of clinician empowerment with strong leadership for using health IT to enable quality improvement. There was a common initial belief among the organizations that their significant EHR efforts should lead to a relatively straightforward process to implement the eCQMs.

    FINDINGS

The case study hospitals and health systems were committed to the implementation of eCQMs as part of their overall quality improvement goals. They expected to use eCQMs and MU implementation as key tools to achieve their broader quality goals. Specifically, they expected to:

1. Generate quality data from the EHR. Each hospital sought to use certified vendor software to capture required data elements for reporting quality metrics as part of an organization-wide commitment to high-level use of EHRs.

2. Use all of their quality data, whether generated through their eCQM tools or other mechanisms, to improve care by sharing the data with physicians and other clinicians, and empowering them to continuously improve the efficiency and effectiveness of care.


3. Use the EHR for clinical decision support related to eCQMs. Each hospital planned to incorporate clinical decision support to encourage clinicians to deliver care consistent with guidelines (as captured in the measures).

Through the implementation process, each hospital and health system encountered significant challenges to meeting those goals and had to create many adaptive workarounds. These challenges are outlined below.


Program Design Challenges: The eCQM specifications were difficult to access, complex, contained inaccuracies and were not maintained over time, creating confusion and additional work:

Unclear specifications and lack of policy infrastructure caused multiple iterative updates of vendor eCQM reporting tools (specialized software or vendor-supplied queries that are certified for the MU program to capture information and report the eCQM results).

Iterative changes in regulatory guidance led to extensive re-work to update tools.

The program required use of new and unfamiliar vocabularies to define required data.


Technology Challenges: The eCQM tools from vendors did not work as expected, and could not efficiently generate accurate measure results:

Hospitals experienced significant difficulty implementing eCQM tools in their EHRs.

The EHR could not draw relevant data from other systems.


Clinical Challenges: The eCQM implementation process negatively affected clinicians, adding to their workload with no perceived benefit to patient care, as it duplicated information already entered in narrative text. The process also failed to generate usable data to support improvement efforts:

eCQM reporting tools were poorly aligned with clinical workflow, necessitating the redesign of the patient care systems or the re-tooling of the reporting tools.

Validation efforts were extensive, but not successful.

Clinical staff did not trust the data.

Rigid regulatory requirements caused the eCQMs to be out-of-date and out of step with advances in care; updates were available late in the process but were difficult to find and optional for vendors to incorporate.


Strategic Challenges: Hospitals expended excessive effort on the eCQMs that negatively affected other strategic priorities:

Hospitals saw little to no return on investment.

Extensive efforts delayed other planned projects.

POLICY IMPLICATIONS AND RECOMMENDATIONS

Specific policy changes are needed to address the challenges hospitals face as they move to electronic quality reporting:

1. Slow the pace of the transition to electronic quality reporting with fewer but better-tested measures, starting with Stage 2. The additional time would allow:

Policymakers to create a reliable policy process for eCQM implementation, a mechanism to provide eCQM updates, and a robust EHR testing/certification program;

Vendors to develop tools that support logical workflows, produce accurate measures and leverage all data already in the EHR; and

Hospitals to implement the tools in a way that supports their quality goals without excessive burden or risk to patients.

2. Make EHRs and eCQM reporting tools more flexible so that data capture can be aligned with workflow, and interoperable so that data can be shared across hospital department systems.

3. Improve health IT standards for EHRs and eCQM reporting tools to address usability and data management to achieve MU program expectations.

4. Carefully test eCQMs for reliability and validity before adopting them in national programs. Implement eCQMs within hospitals as part of testing to ensure information flow is accurate and there is no adverse impact on quality and patient safety.

5. Provide clear guidance and tested tools to support successful hospital transition to increased electronic quality reporting requirements. For example, develop and disseminate an accurate, complete and validated crosswalk from SNOMED-CT, a vocabulary required by the 2014 eCQMs and MU Stage 2, to ICD-10-CM for conditions and ICD-10-PCS for procedures.
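To illustrate what such a crosswalk could look like in practice, the sketch below maps SNOMED-CT concept codes to ICD-10-CM codes via a lookup table and flags unmapped concepts for manual review. The table contents and function name are illustrative assumptions, not part of any published or validated crosswalk.

```python
# Minimal sketch of a SNOMED-CT -> ICD-10-CM crosswalk lookup.
# The entries below are illustrative placeholders, not a validated mapping.
CROSSWALK = {
    "230690007": "I63.9",    # illustrative: a stroke concept -> a cerebral infarction code
    "128053003": "I82.409",  # illustrative: a deep vein thrombosis concept -> a DVT code
}

def map_condition(snomed_code):
    """Return (icd10cm_code, mapped). Unmapped concepts are flagged so they
    can be routed to manual chart review instead of being silently dropped."""
    icd = CROSSWALK.get(snomed_code)
    return icd, icd is not None

code, mapped = map_condition("230690007")      # known concept
unknown, found = map_condition("999999999")    # unmapped concept -> flagged
```

The explicit "mapped" flag matters because a crosswalk that quietly drops unmapped concepts would understate measure populations, which is exactly the kind of silent data loss the validation problems described in this report stem from.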

Background on MU eCQM Measures and Program

In the last 10 years, there has been an unprecedented expansion in the number and type of quality measures on which hospitals are required to report for federal programs. The Centers for Medicare & Medicaid Services (CMS) Hospital Inpatient Quality Reporting (IQR) and Value-Based Purchasing (VBP) programs are two of the drivers of this development. The Hospital IQR program was created by the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA). The Deficit Reduction Act of 2005 (DRA) expanded quality reporting requirements for hospitals and provided the Department of Health and Human Services (HHS) with the discretion to add and replace quality measures. In the hospital VBP program, included in the Affordable Care Act of 2010 (ACA), CMS expanded the use of the IQR quality measures by making value-based incentive payments to acute care hospitals, based either on how well the hospitals perform on certain quality measures or how much the hospitals' performance results improve over time. The hospital VBP program is applicable to payments for discharges beginning October 1, 2012.1 Measure requirements for VBP include: defining the end goal rather than processes, aligning provider incentives, enabling rapid-cycle measure development with attention to patient-centered measures, and support for quality improvement.2 In 2013, the number of measures required for the IQR Program increased to 54 (from 10 in 2004), and non-reporting hospitals will have their Medicare inpatient payments reduced by 2 percent.3

Currently, hospitals report nearly 90 measures across all hospital quality reporting programs, including hospital inpatient and outpatient reporting. The measures are generally based on chart-abstracted data, supported by varying levels of automation. Some data required for measure reporting are straightforward, such as patient age, while others are challenging to collect, such as structured documentation that specific actions were taken (e.g., documenting that compression stockings were placed on the patient).

Many have considered EHRs as a rich source of information to improve the care patients receive, and also to measure the quality of the care provided. EHRs have been shown to enhance the efficiency of care and reduce errors by improving access to information such as laboratory results and medications administered, and by enhancing patient safety related to medication orders.4 Computerized provider order-entry (CPOE) also has reduced prescribing errors, and using clinical decision support (CDS) has been shown to contribute to a decrease in adverse drug events of 41 percent.5

EHR-based automated quality measure reporting has the potential, if implemented correctly, to ease the burden of quality reporting, while increasing access to real-time information to support quality improvement. To do so, the quality measure results generated from EHR data must be based upon information that is feasible to collect in an automated fashion, generate valid and reliable results, and thereby demonstrate a benefit that outweighs the costs. The Medicare and Medicaid EHR Incentive Programs, created in the Health Information Technology for Economic and Clinical Health Act of 2009 (HITECH), authorized HHS to provide financial incentives to hospitals and eligible professionals for the meaningful use (MU) of certified EHR technology to improve patient care.6

The successful demonstration of meaningful use includes using a certified EHR to meet specified thresholds for a number of objectives and attesting that the certified EHR is capable of electronically generating and reporting clinical quality measures (CQMs).

The 15 CQMs selected for hospital reporting in the MU program Stage 1 are a subset of CQMs included in the hospital IQR program: seven measures that reflect care for patients with stroke, six for venous thromboembolism (VTE, or blood clots), and two measures on emergency department (ED) throughput. This reporting is in addition to other Medicare quality reporting requirements. The MU CQMs did not replace the original chart-abstracted CQMs from which they were re-tooled. The CQMs chosen for the MU Stage 1 program were originally designed for collection by staff trained to review patient records and identify the data needed to generate the measure reports. The process is generally called chart abstraction, and measures designed for such a process are called chart-abstracted measures. The CQMs in meaningful use were modified to extract data directly from EHRs rather than require chart abstraction. Measures modified in this way, called re-tooled measures or eMeasures, were subsequently renamed eCQMs to indicate the electronic nature of data collection.

The measures were re-tooled using the first, untested draft version of the Health Quality Measure Format (HQMF) standard developed by the standards organization Health Level Seven (HL7) in 2009. In addition, to provide consistency, the re-tooling process used a common set of definitions, or terms, called the Quality Data Model (QDM), developed by the National Quality Forum (NQF),7 to help describe the data needed for the measures. The vocabularies for the measures were defined by the organization that performed the re-tooling, the Health Information Technology Standards Panel (HITSP). Vocabularies for all new eCQMs (including those in MU Stage 2) are nearly the same as designated by the Health IT Standards Committee and are listed in the data element catalogue of the National Library of Medicine (NLM).8 [Refer to the Appendix for detailed explanation of the purpose and background of the QDM and vocabulary usage.]

Across the 15 measures, more than 180 data elements are required. However, the eCQMs were not robustly tested to determine if all of the data needed were available in existing EHRs. Access to the eCQM specifications and associated vocabulary codes was difficult. A clear method to obtain clarification and provide feedback to CMS was slow to develop, and advice for implementation was not readily available.

The rapid meaningful use policy development and EHR and eCQM certification process included a number of factors that likely have affected the accuracy of the eCQM measure results. For example, the eCQMs provided by CMS contained known errors and were never implemented within EHRs to test the feasibility of data collection or validation of results. Further, the EHR and eCQM reporting tool certification process was limited to whether the product could calculate the measures. It did not address whether the expected data were actually present in EHRs or that the calculation was accurate. Moreover, the EHR certification process does not ensure EHR vendors accommodate clinical workflow processes in eCQM data capture.

    Methodology

The project team consisted of six experts in quality measurement and health IT with experience in CQM measure development and/or implementation of measures.

    Site selection

The four hospitals selected for site visits met the following criteria:

    already attested to meaningful use;9

    accredited by The Joint Commission; and

not classified as critical access hospitals.

Additional variables used to select sites and ensure diversity among sites included:

representation of a distinct EHR vendor at each site;

affiliation (inclusion in a health system vs. stand-alone facility);

    geographic location; and

    bed size and setting (urban vs. rural).

Some contingency factors that contributed to the final roster included willingness of the organization to participate in the study, as well as availability of key staff for on-site interviews. The study team selected sites in collaboration with the American Hospital Association (AHA).

    Data collection

Once the sites were identified and hospital leadership agreed to participate, the study team coordinated three main phases of data collection with each organization, including: 1) a pre-site visit demographic survey; 2) a site visit with interviews of key personnel involved with the MU program and the implementation of eCQMs; and 3) a post-site visit data validation teleconference.

Consistent with qualitative data collection methods, the study team created interview guides for each of the primary study participant groups to be interviewed (facility leadership, informatics, quality, technology, finance and operational staff). Interviews were conducted in both individual sessions and groups (two to three staff members from like departments) over a period of one to two days. Study team members had pre-defined roles (facilitator, note taker, etc.) to ensure consistency in the data collection process. For the facilities that belonged to a larger system, both hospital staff and system-level informants were interviewed.

Upon completion of each site visit, the study team validated and documented all descriptive data for the site and performed a content analysis on the interview notes. Post-visit conference calls were held with leaders interviewed in each organization in order to validate data and findings in each respective case study. A case study for each site visit was completed and informed the contents of this final report. The findings in this report draw on the experience of these four hospitals, and while their experience is most likely shared by other hospitals, this sample is not representative of the field as a whole.

    Description of the Sites

The four hospitals have strong but varied experience with EHR implementation. In addition, they all have strong leadership and organizational cultures tightly coupled with the MU program vision of supporting and improving patient care and safety with health IT. These qualities positioned each of the four organizations for success in eCQM reporting.

Commitment to health IT as infrastructure to enable change

Each hospital had a history of robust EHR use with a vision to support quality, safety and efficiency that predated the MU program. In fact, all organizations had started their journey toward EHR adoption more than five to 10 years before the MU program began. The high level of EHR maturity in each site visit hospital was nationally recognized.

Activities supported by the respective EHRs included:

1. CPOE (all four hospitals): entry of medication and other orders for patients by a physician directly into the EHR using a computer or mobile device;

2. Nursing documentation (all four hospitals): entry of patient assessments and findings by nurses directly into the EHR; and

3. Bar-coded medication administration documentation (three of the four hospitals): using a bar-code reader to scan bar codes on the medication, the nurse's badge and the patient's identification bracelet to automatically record in the EHR which medication was administered, at what time, by whom and to whom.

Site demographics

Meaningful use attestation: Three out of the four sites completed MU attestation in 2012 alone, while the remaining site attested in both 2011 and 2012.

Affiliation: Two stand-alone hospitals and two health-system affiliated sites.

Geographic location: North East, South Central, Midwest, South Atlantic.

Bed size: Two of the facilities in the 100-199 bed size range, and two in the 300-399 bed size range.

Setting: Urban and non-metropolitan.

Financial classification: All of the sites are not-for-profit.

EHR: Each organization used a different EHR vendor.


Physician documentation, the ability of physicians to enter structured and narrative notes directly into the EHR, was available from the product vendor in most cases, but was considered one of the more challenging EHR components to implement. For all but one hospital, it remained in the planning stage at the time of this report.

Additional value to eCQM implementation was derived from a hospital and/or enterprise-wide data warehouse to store data from multiple sources within the organizational health IT ecosystem, including the EHR, departmental systems (e.g., pharmacy, laboratory, emergency department, operating room and anesthesia applications), as well as administrative and financial systems. While organizations were still in the early stages of planning or deploying data warehouses, all had the vision of leveraging such technology as a business intelligence tool to support operations, including clinical decision support and quality improvement.

A culture of empowerment and continuous improvement

Each hospital has a culture of encouraging physician and nurse participation in its quality improvement programs. Three organizations provide physicians and nurses direct access to performance results; the fourth is still developing the feedback mechanism. Presenting information with sufficient detail allows physicians to know what changes they can make in their practices to improve overall performance. Three of the hospitals provided dashboard reports for physicians and nurses to understand how their performance compared to peers with respect to quality measures in required external reporting programs, in addition to whether it met internal organizational objectives.

Improving staff performance was a focus at each hospital. However, each site takes a different approach to include and empower staff (physicians, nurses and other staff) to improve quality. Three of the organizations provide quality report information to physicians to help them perform better and to help fellow staff enhance performance. Leadership at one of the organizations referred to the feedback process as "activation": providing sufficient detail for the staff to actively improve care for the individual patient, thus encouraging organizational change.

Most of the organizations empowered staff by encouraging them to participate directly in projects that improve patient care and outcomes. One organization pairs clinical staff with IT department counterparts, enabling them to resolve issues related to clinical workflow, an important factor in the data capture process needed for eCQM reporting. Another example of organizational support to address clinical workflow is the use of agile LEAN approaches, such as Kaizen events,1 to generate change. In one such effort, nurses evaluated how they manage patient care plans and whether clinical patient care goals are met. The Kaizen evaluated the use of the EHR based on 18 months of previous work and led to a decision to return to a manual paper process to successfully handle patient care plans. Senior leadership accepted the proposal. There is now effort to determine how the EHR can support the successful manual process rather than change the workflow to fit the EHR's structure.

Strong leadership driving health IT-enabled quality improvement

All organizations demonstrated a vision for the EHR as an essential tool to provide real-time information access for clinical decision-making and to improve patient safety and clinical outcomes. This commitment was present at the board and senior administrative leadership level, including the Chief Executive Officer (CEO), the Chief Operating Officer (COO) and the Chief Financial Officer (CFO).

The support continues through the eCQM implementation process, including realigning staff and projects, and rapidly removing roadblocks to assure success in reporting and managing patient care processes and outcomes. One or two champions emerged at each site as the focal points for success in EHR use, generally the Chief Medical Information Officer (CMIO) or the Chief Medical Officer (CMO), although organizational structure, titles and responsibilities varied.

1. Kaizen is Japanese for "good change." It is used in LEAN business process engineering to describe a three- to five-day exercise to evaluate an existing process, identify wasteful or duplicate steps and develop a new, streamlined process that can begin at the end of the exercise.


In addition, each organization has a strong quality department and IT staff leadership. The departments are tightly coupled. At each organization, quality department nurses work directly in the IT area to explain to the IT staff what information is required for measurement and for routine clinical care activities. The IT staff translates how the EHR and its components can manage clinical details required by the eCQMs. One of the hospitals has formalized a position of Nursing Informatics Officer. In another hospital, quality staff report directly to the CMIO. The quality staff is generally the group that coordinates discussions between the clinical staff and the IT staff. In three of the organizations, the quality staff performs concurrent review, a process by which quality department staff works directly with primary care nurses during each patient's stay to support appropriate documentation for quality reporting, including but not limited to eCQM reporting. One hospital initiated the concurrent review process to establish new, positive working relationships between the quality department staff and the clinicians providing direct care. None of the organizations considered the staff-intensive concurrent review process to be sustainable, however, as government programs require more and increasingly complex measures.

Common eCQM Implementation Process Used Across Sites

As previously noted, all organizations adopted EHRs prior to the MU program, and each used a different, nationally known EHR vendor.

While the organizations approached eCQM implementation differently, the eCQM implementation required iterations of various steps, as depicted in Figure 1 below.

A gap analysis compared measure requirements against data captured in the EHR as part of clinical documentation (their workflow). While the next step included the extraction of the data from the EHR to calculate the measure, the eCQM reporting tools did not necessarily look for the data based on the clinical workflows the organizations designed. In the following step, the hospitals worked to validate the measures reported. The eCQM workflow process was not linear, but was an iterative process in which data capture and clinical workflow redesign, data extraction, and eCQM reporting were repeated frequently and influenced by the validation process.

A detailed explanation of how organizations approached the individual steps of the eCQM implementation process is provided below.

    Gap analysis

All organizations conducted a gap analysis, comparing measure requirements against data captured in the EHR as part of the clinical workflow (the routine patterns of actions, including documentation, that clinicians use during patient care). The original expectation was that EHRs would collect data through the routine care process and that the eCQM reporting tool would accurately extract and report measurements from the existing data, thereby reducing provider burden. The gap analysis was based on the data requirements set forth by the certified eCQM reporting tool; however, at least one organization conducted the analysis against the requirements for the chart-abstracted version of the measures. Regardless of the approach, this step enabled the organizations to identify gaps in discrete documentation, specifically the gap between the structured, encoded data required by the measure and the data documented as narrative text.

Figure 1. High-level steps in the eCQM implementation process, illustrating an iterative process where data capture and workflow redesign, data extraction and eCQM reporting are interdependent steps, and are influenced by the validation results.
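Conceptually, a gap analysis of this kind compares what a measure requires as structured, coded data against how the EHR actually captures each element. The sketch below is purely illustrative: the element names and capture categories are invented for the example, not drawn from any site's EHR or a real eCQM specification.

```python
# Illustrative sketch only: field names and sample data are hypothetical,
# not taken from any site's EHR or an actual eCQM specification.

# Data elements a measure requires, with the capture form it expects.
measure_requirements = {
    "discharge_medication": "structured",
    "surgical_incision_time": "structured",
    "reason_med_not_given": "structured",
}

# How each element is actually captured in the EHR today.
ehr_capture = {
    "discharge_medication": "structured",
    "surgical_incision_time": "departmental_system",
    "reason_med_not_given": "narrative_text",
}

def gap_analysis(requirements, capture):
    """Return the elements the eCQM tool needs as structured data
    but that the EHR does not capture that way."""
    return {
        element: capture.get(element, "missing")
        for element, needed in requirements.items()
        if capture.get(element) != needed
    }

print(gap_analysis(measure_requirements, ehr_capture))
# {'surgical_incision_time': 'departmental_system',
#  'reason_med_not_given': 'narrative_text'}
```

In practice the comparison is far harder than this sketch suggests, since "structured" capture also depends on which specific EHR field the certified reporting tool is programmed to read.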

Data capture and workflow redesign

The data capture and workflow redesign step was perceived as the most involved and onerous step in the expected eCQM implementation workflow process across all sites, regardless of EHR vendor. In order to accommodate the findings of the gap analysis, including measure requirements for documentation of specific clinical information in a discrete format, the organizations identified the need to modify their EHRs to support data capture of information required by the measure. The modification often included the addition of structured fields in the EHR (such as checkboxes and pre-defined lists of values). This represented an additional effort on the part of the sites, in addition to updates to the EHR infrastructure.

The previous clinical workflow included both narrative (free-text) documentation and structured entry of data. The additional fields were added to new or existing forms in the EHR to assure all measure data was entered in a structured form.

The new or modified forms were disruptive to clinicians' workflow and increased time spent on documentation, rather than patient care. Moreover, the automatic extraction of the data for the eCQMs required defining a specific location where the data would be automatically recognized by the EHR, in order to be used for eCQM calculation. The organizations addressed placement of these structured fields differently. Some created specific forms in the EHR to capture the required data. Others embedded such fields throughout existing EHR documents, conducting extensive analytical work to minimize disruption to the workflow, while allowing documentation in the required encoded format. In one hospital, the EHR-designed workflow staff documented discharge medication for stroke as part of the discharge instruction form. The hospital had designed an efficient workflow where documentation was part of medication reconciliation at discharge, but the eCQM reporting tool was not programmed to find the information in that location. The solution required redesigning the programming; otherwise, the physicians would have had to document the medication twice.

Due to the inflexibility of the certified eCQM reporting tools, each hospital identified a need for workflow modifications to capture required data in specific locations. In addition, the organizations that conducted a gap analysis and modified their EHR data capture prior to receiving their eCQM reporting tools had to revise their prior workflow modifications yet again so that the data in the EHR-specific forms would align with the certified software.

One organization used the version of the measures designed for manual chart abstraction and used for reporting under the hospital IQR program (CQMs rather than the eCQMs) to evaluate clinical workflow, and incorporated those versions of the measures into their EHR. As an advanced EHR user, this site generally worked to gather as much data as possible. Citing the difficulty of finding the eCQM specifications on the CMS website as the reason for this approach, this organization changed workflow processes to accommodate its own electronic interpretation of the e-specifications of the measures.

    Data extraction and eCQM calculation

The hospitals and health system sites studied expected to rely on the eCQM reporting tools to perform the eCQM calculation. Organizations with integrated EHRs (those that include departmental systems in the same database used to store the data captured during clinical care in the EHR) had fewer challenges than organizations with unique departmental systems, regardless of whether these systems were from the same or different vendors. Organizations with unique departmental systems reported absent or limited interoperability among the multiple systems, some of which contain information needed for eCQMs. Hence, duplicative workarounds using manual data entry for data already present in departmental systems have been implemented.

"It's not good enough that you [the clinician] document it; we need you to document someplace where we can capture it for reporting."
-eCQM implementation team member, on the constraints in data extraction for eCQM calculation and the impact on clinical workflow

One organization invested significant time and resources perfecting the data crosswalks to maximize the data the eCQM reporting tool could find automatically and to limit duplicative documentation by physicians and nurses. Yet, even minor changes the hospital may have made to address local clinical workflow and preferences for routine care delivery in the EHR product as delivered by the vendor limited the ability of the eCQM reporting tool to find needed data.
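In broad terms, an eCQM is computed as a rate over an eligible patient population. The sketch below is a deliberately simplified illustration of that arithmetic; the patient records and criteria are invented, and real eCQM logic is far more complex (value sets, timing windows, multiple exclusion categories).

```python
# Simplified illustration of how an eCQM rate is typically computed.
# The records below are invented for the example; this is not the logic
# of any actual measure specification.

patients = [
    {"id": 1, "in_population": True,  "excluded": False, "met_numerator": True},
    {"id": 2, "in_population": True,  "excluded": False, "met_numerator": False},
    {"id": 3, "in_population": True,  "excluded": True,  "met_numerator": False},
    {"id": 4, "in_population": False, "excluded": False, "met_numerator": False},
]

def ecqm_rate(records):
    """Rate = numerator cases / (initial population minus exclusions)."""
    denominator = [p for p in records if p["in_population"] and not p["excluded"]]
    numerator = [p for p in denominator if p["met_numerator"]]
    return len(numerator) / len(denominator) if denominator else None

print(ecqm_rate(patients))  # 0.5: one of the two eligible patients qualifies
```

The validation struggles the sites describe arise one step earlier than this arithmetic: deciding whether each flag is true for a given patient depends on finding structured data in the exact EHR field the reporting tool reads.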

    Validation

The organizations expended significant effort to validate eCQM results derived from their respective eCQM reporting tools, but were ultimately unsuccessful. Validation included technical validation and clinical validation:

Technical validation included: 1) verifying that all data required by the eCQM reporting tool could be captured in a discrete format in the EHR; and 2) verifying that the specific locations of the data were visible to the eCQM reporting tool for automated data extraction.

Clinical validation included: 1) verifying the extent to which clinicians entered the discrete data used by the eCQM reporting tool to accurately represent clinical care; and 2) verifying the validity of the eCQM calculation based on the captured data.

Organizations reported that the technical validation was an involved and iterative process. While working with its vendor during the development of the eCQM reporting tool, one organization struggled to achieve technical validation of its eCQM results and did not achieve clinical validation. Two hospitals were able to validate their technical ability to capture the necessary data; however, the use of these data fields was inconsistent and they did not achieve clinical validation. One hospital achieved technical validation but did not directly compare the results of the eCQMs and the corresponding chart-abstracted measures from which the eCQMs were derived.

As an ongoing step in the eCQM validation process, three organizations developed a staff-intensive and unsustainable concurrent review process to encourage documentation directly by nurses or order entry by physicians. Assigned staff was charged with continuously reviewing EHR documentation to identify missing data in the EHR that could impact eCQM results, making the completeness and accuracy of the data used for eCQM calculation dependent on staff review and manual data input. As a result of the concurrent review process, these organizations became more comfortable that the data the eCQMs required were present in the required EHR fields, but it required more ongoing staff work than expected. One hospital found that the concurrent review process promoted better working relationships between the quality department staff and the clinicians providing direct care to patients.

    Findings

    Site visit experience

The hospitals and health systems were committed to the implementation of eCQMs as part of their overall quality improvement goals. They expected to use their eCQMs and their meaningful use implementation as key tools to achieve their broader quality goals. Specifically, they expected to:

1. Generate quality data from the EHR. Each hospital sought to use certified vendor software to capture required data elements for reporting quality metrics as part of an organization-wide commitment to high-level use of EHRs.

2. Use all of their quality data, whether generated through their eCQM tools or other mechanisms, to improve care by sharing the data with physicians and other clinicians, and empowering them to continuously improve the efficiency and effectiveness of care.

3. Use the EHR for clinical decision support related to eCQMs. Each hospital planned to incorporate clinical decision support to encourage clinicians to deliver care consistent with guidelines (as captured in the measures).

Despite extensive EHR implementation experience, each organization quickly learned that the eCQM implementation would be more challenging than anticipated. The actual eCQM implementation required multiple iterations of workflow redesign, data capture, eCQM calculation and validation.


Figure 2 depicts the actual eCQM implementation experience.

eCQM implementation followed the same process at each site, progressing from gap analysis to workflow to implementation and validation. All of the hospitals expended an extensive amount of effort to modify workflow, create new data entry screens and use those screens. The majority of that work (80 percent) was performed by the hospitals; EHR vendors were able to support only 20 percent of the effort by updating their certified products. Only one of the four sites was able to validate that all patients who should be captured in the eCQM results are included and that all relevant interventions that occur are included in the measure calculations. Despite their efforts, the other three hospitals have not been able to achieve accurate measure calculations from their eCQM reporting tools.

The findings from the hospitals and health systems studied highlight the following challenges:

1. Program Design Challenges: The eCQM specifications were complex, difficult to access, contained inaccuracies and were not maintained over time, creating confusion and additional work.

2. Technology Challenges: The eCQM tools from vendors did not work as expected and could not efficiently generate accurate measure results.

3. Clinical Challenges: The eCQM implementation process negatively affected clinicians, adding to their workload with no perceived benefit to patient care as it duplicated information already entered in narrative text. The process also failed to generate usable data to support quality improvement efforts.

4. Strategic Challenges: Hospitals expended excessive effort on the eCQMs, which negatively affected other strategic priorities.

Figure 2. High-level steps in the actual eCQM implementation experience, illustrating the iterative process and challenges encountered within each step.


    Program design challenges

Unclear specifications and lack of policy infrastructure caused confusion and re-work.

The eCQM specifications were highly complex and contained errors. Vendors and hospitals were challenged by the content. CMS infrastructure to support vendors and hospitals was lacking; all organizations struggled with how to interpret the measures' intent in order to provide consistent results.

One organization used the version of the measures designed for manual chart abstraction (the CQMs rather than the eCQMs) to evaluate clinical workflow and implement the measures into their EHRs. This is the same process the organization follows for other measures, including those in the hospital IQR program. Citing the difficulty of finding the eCQM specifications on the CMS website as the reason for this approach, this organization had already changed workflow processes for its own implementation of the measures. They therefore tried to validate their results against those provided by the eCQM reporting tool, but did not make any changes to workflow or training specifically for the eCQM specifications.

The eCQMs are included in government regulation. However, if the eCQMs are updated, the EHR certification rules do not require vendors to include such updates in their certified modules.

The MU Stage 1 eCQM specifications were difficult to find. Hospitals and vendors had to search through the links provided on the CMS website, eventually finding a 429-page technical note written from the perspective of a technical engineer. The note was out of date, and was only updated long after the program began. Hospitals and vendors had to go to another website, the US Health Information Knowledge Base (USHIK), to access the codes used by the eCQMs.

Iterative changes in regulatory guidance led to extensive re-work

To comply with the MU program, each hospital received updates to multiple EHR components from its vendor, often tied to changes in regulatory guidance or new interpretation of the rules. Each updated component had to be connected to the main EHR. Each time a new update was received, all prior connections had to be rechecked. For the eCQM reporting tool, any change to the EHR components could affect the ability to pull the required information.

Although CMS originally had required hospitals to attest that the eCQM data were accurate, the agency changed policy in October 2011 based on early hospital experience showing that the certified EHRs did not generate accurate data. CMS used sub-regulatory guidance to clarify that the eCQM reporting tools were expected to report data and were not required to validate the accuracy of the data reported. This addressed the problem with the eCQMs and the eCQM reporting tools, but undermined the process and the program. In addition, providers were worried that attesting to the accuracy of data that they did not consider to be correct would create a compliance issue.

The program required use of new and unfamiliar vocabularies to define required data

All of the organizations were challenged by requirements to move from ICD-9-CM to ICD-10-CM for conditions and ICD-10-PCS for procedures at the same time they were asked to enter clinical information in their problem lists using SNOMED-CT. For the most part, hospitals relied on their eCQM reporting tools to manage the crosswalks between the new vocabularies in the eCQMs and the terms used locally for clinical data. However, one organization purchased additional software and services to help with this crosswalk to the eCQM reporting tool for the MU Stage 1 eCQMs. At least one of the other organizations will be purchasing such software and services to support MU Stage 2 work. This same organization also will change from using the certified EHR vendor's eCQM reporting mechanism to one that also supports reporting of hospital IQR measures and works closely with its EHR software. The leadership noted a preference for the information provided by the current IQR reporting software because it has better support for benchmarking with other hospitals regionally and nationally. Thus, the organization believes it will lose valuable management information by switching eCQM reporting tools specifically for the MU program.
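A vocabulary crosswalk of this kind can be pictured as a mapping table from one code system to another, with anything unmapped falling out for manual review (the staff-intensive step the sites describe). The sketch below is a toy illustration: the two sample entries pair common SNOMED-CT and ICD-10-CM codes for familiar conditions, but the table is invented for the example and is not a validated crosswalk.

```python
# Toy example of a SNOMED-CT to ICD-10-CM crosswalk; the mapping table
# below is illustrative only, not a validated clinical terminology map.
snomed_to_icd10 = {
    "44054006": "E11.9",  # type 2 diabetes mellitus (sample entry)
    "38341003": "I10",    # hypertensive disorder (sample entry)
}

def crosswalk(snomed_codes, mapping):
    """Translate locally documented SNOMED-CT codes into the ICD-10-CM
    codes a reporting tool expects; collect unmapped codes for review."""
    mapped, unmapped = [], []
    for code in snomed_codes:
        if code in mapping:
            mapped.append(mapping[code])
        else:
            unmapped.append(code)
    return mapped, unmapped

mapped, unmapped = crosswalk(["44054006", "99999999"], snomed_to_icd10)
print(mapped)    # ['E11.9']
print(unmapped)  # ['99999999']
```

The hard part in practice is not the lookup but building and maintaining the table itself, which is why the sites either leaned on their reporting tools or bought third-party software and services to manage it.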


    Technology challenges

Hospitals experienced significant rework to create and revamp clinical workflows to meet eCQM tool requirements

Some of the information needed was relatively straightforward, such as diagnoses that made the patient eligible for inclusion in the reported measure result. Such diagnoses are found directly on the discharge sheet and are appropriately coded so the hospital can bill for the hospitalization. Other information is documented only in narrative text (a typed or handwritten note) and not in a structured way (i.e., from a drop-down list of predetermined choices, each of which is identified by the computer with a specific code).

In order to accommodate the measure requirements for documentation of specific clinical information in a discrete format, the organizations had to modify their EHRs to support data capture of the measure-required elements, often resulting in the addition of structured fields (such as checkboxes and pre-defined lists of values). This represented an additional effort on the part of the sites in addition to updates to the EHR infrastructure.

The additional fields, as well as their placement within the EHR forms and navigation, created disconnects from the clinical workflow, which relied on narrative (free-text) documentation in addition to structured data, and increased time spent on documentation rather than patient care.

Each organization reports that its vendor is barely staying a step ahead of its customers, given rushed timelines and immature specifications. EHR and eCQM reporting tool vendors have insufficient infrastructure to support the eCQM requirements, and they are adjusting with rapidly designed workarounds that meet certification requirements but are not really usable solutions. EHR usability and the user perspective are sacrificed in favor of being able to state that functionality exists. Given the time constraints of the MU program, there is insufficient opportunity to test and develop usable systems with careful customer workflow analysis and to develop smarter ways to capture data.

The EHR could not draw relevant data from other systems

Hospitals with departmental systems from multiple vendors reported absent or limited connectivity and sharing of information among the different vendor products (intra-hospital interoperability). Such information may be essential to accurate reporting of the eCQMs. All organizations described the tool used to calculate and report the eCQMs as inflexible and as the most significant challenge that disrupted workflow:

An example from one organization is the inability to retrieve basic information about admissions, discharges and transfers (ADT) from the administrative system, where the ADT product is from a different vendor than the EHR. All of the organizations reported waiting months for resolutions to this type of problem, if one was available at all.

Another example is the lack of shared information about surgical incision times from the Operating Room (OR) or anesthesia departmental systems, regardless of vendor. One of the measures included in the MU Stage 2 eCQMs is derived from the Surgical Care Improvement Project (SCIP) and evaluates appropriate antibiotic use to prevent surgical infection (prophylaxis). The measure requires the surgical incision time for comparison with the time the antibiotic is administered to the patient. The surgical incision time is usually entered in the OR or the anesthesia departmental system. The antibiotic timing may be documented in the EHR or the anesthesia system.

"Make eCQMs easier to understand. Just because vendors are certified doesn't mean it really works with the workflow."
-Quality Department Director

"We recognize the sense of urgency, but sometimes you need to slow down to go faster…the time is not for the technology, it is the people aspect."
-Hospital CEO
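The SCIP-derived check described above is, at its core, a comparison of two timestamps that live in different systems. The sketch below illustrates only the timing logic; the one-hour window is the commonly cited prophylaxis interval and is used here as an assumption, the timestamps are invented, and the actual measure specification remains the authoritative source.

```python
# Illustrative sketch of the timing comparison behind the SCIP-derived
# antibiotic prophylaxis check. The one-hour window is an assumption for
# the example; consult the measure specification for authoritative logic.
from datetime import datetime, timedelta

def prophylaxis_timely(antibiotic_time, incision_time,
                       window=timedelta(hours=1)):
    """True if the antibiotic was given within `window` before incision."""
    return timedelta(0) <= incision_time - antibiotic_time <= window

# In practice these two values come from different systems:
incision = datetime(2013, 5, 1, 9, 30)    # OR/anesthesia departmental system
antibiotic = datetime(2013, 5, 1, 8, 45)  # EHR medication record

print(prophylaxis_timely(antibiotic, incision))  # True: 45 minutes before
```

The comparison itself is trivial; the challenge the sites describe is that the reporting tool cannot perform it at all when one of the timestamps sits in a departmental system it cannot read.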

Hospitals devoted quality management staff and other clinicians to abstract the information from other department systems and enter it into the fields in the EHR required to report the eCQMs. The limited number of EHR data fields was cross-walked to the locations in the eCQM reporting tool that were recognized as data sources for eCQM calculation.

    Clinical challenges

eCQM reporting tools were poorly aligned with clinical workflow, necessitating the redesign of the patient care systems or the re-tooling of the reporting tools

How physicians and nurses work to best care for their patients can be dependent on local culture and circumstances. Changes to clinical workflow require careful analysis and readjustment to assure there is benefit to: (a) decrease the overall work for the physician or nurse; and/or (b) improve the speed of patient care; and (c) improve the safety and quality of care patients receive.

The eCQM reporting tools were not designed to support physician and nurse decision-making and documentation naturally. Therefore, up to two-thirds of the necessary information for eCQM reporting had to be specifically identified in the documentation process (also known as "hard-wiring" the data elements into the EHR) in order to capture it for eCQM reporting.

However, the physicians and nurses did not routinely use the EHR fields recognized by the eCQM reporting tools in the course of patient care because the design conflicted with their workflow. In an attempt to capture the needed data, physician and nursing leaders spent considerable time carefully analyzing and making iterative adjustments to clinical workflow so as not to diminish the safety, quality and speed of patient care. The time involved ranged from two to three months to 18 months per measure.

For example, to capture information required by eCQMs, physicians and nurses documented justifications for actions taken or avoided in the EHR fields the eCQM reporting tool required. However, the EHRs did not save the justification as part of the information shared for clinical care. Unless the physician also documented the rationale for not ordering medication to break up a clot in a narrative note, it was not available for viewing by other clinicians. Therefore, clinical documentation supporting eCQM reporting had to be duplicated to assure patient safety (i.e., so that other clinicians understood the clinical justification). The duplication is a significant efficiency issue.

The workflow changes to meet the eCQM reporting tool requirements added to physician and nursing workload, and provided no perceived benefit to patient care.

Validation efforts were extensive, but not successful

The organizations noted that although the eCQM reporting tools were technically valid, because the EHR contained specific fields to store the information needed for the eCQMs, they were generally not clinically valid, despite the organizations' extensive efforts to change workflows to capture needed data.

"Don't make someone document something just for the sake of reporting… The information should be in the workflow of health care professionals, with value to the patient's care, if it is used in a measure."
-Hospital Quality Management Director

"The more exercises and hoops we have to jump through, the less time and effort we are spending caring for patients, teaching patients and families how to engage and participate in their care, getting docs to adopt, etc."
-Hospital Chief Nursing Officer


Organizations suggested three reasons for the conflicts with clinical workflow that limited the ability to achieve accurate measure data:

The rapid pace of the meaningful use program and the certification process did not give vendors sufficient time to perform careful user workflow analysis before developing the software;

Software usability is not a criterion for EHR certification; and

Some information required for eCQM reporting is entered in EHRs as narrative text because it is efficient to do so within the course of patient care; eCQM reporting tools cannot directly use narrative text.

As described above, concurrent review was an eCQM validation process used by three of the hospitals. In these hospitals, designated staff ensured that data not automatically entered as structured data in the EHR were included as such in EHR screens that the eCQM reporting tool was designed to recognize. As a result, the accuracy of the data used for eCQM calculation is dependent on staff review of entries in the EHR and manual data input.

Clinical staff did not trust the data to support quality improvement

Organizations either spent considerable time in re-work to revise and validate the eCQM measurement process with the eCQM reporting tool, or chose to ignore the results in favor of those derived from the chart-abstracted versions of the measures. Despite their extensive validation efforts, only one of the hospitals could successfully generate clinically valid, accurate data that it was confident could support quality improvement efforts. The fact that some of the organizations had corresponding trusted results from the chart-abstracted versions of the measures (CQMs) allowed them to compare actual clinical results with the eCQM reports.

Two organizations were able to provide some level of detail regarding the wide differences between the reported rates from the certified eCQM reporting tools and the actual clinical care provided. These discrepancies varied by measure. The organizations reported that the eCQMs underestimate actual performance.

Only one organization reported the results of eCQMs to physicians, but indicated they were a work in progress and not intended for any action. Two other organizations have delayed including the eCQM data in the quality reports they provide to physicians and staff until they can successfully validate that the results are correct. One organization has not yet provided physicians access to the information. None of the organizations are using the eCQM results for quality improvement efforts.

Rigid regulatory requirements caused the eCQMs to be out-of-date and out of step with advances in care; updates were available late in the process but were difficult to find and optional for vendors to incorporate

The eCQM measure process is new, and a standard update process had not been implemented prior to the start of the MU Stage 1 program. Updates continue to be delayed, and some of the clinical information in the eCQMs is out of date. By contrast, the chart-abstracted measures from which the eCQMs were derived are updated twice a year.

The eCQMs were not updated to reflect the most recent developments in clinical care until the program was in full force, and then the updates were out of synch with the respective chart-abstracted measures. Two of the hospitals purchased products from other vendors that provide up-to-date information about clinical treatments that improve how often and how fast patients get better. These products are kept current as new studies are published, and they address each of the conditions measured by the eCQMs. The hospitals' physicians use these products to create sets of orders in their respective hospitals' EHRs. Thus, their order sets are consistent with the latest evidence, and maintained as new evidence becomes available. Physicians who use up-to-date sets of orders may cause the hospital to have poorer performance as measured by compliance with the eCQMs.

"The CMS message that reporting was more important than accurate results undermined physicians' faith in quality measurement."
-Health System Chief Medical Information Officer

Each hospital's leadership supports physician and clinical staff efforts to do the right thing and treat patients based on the most current knowledge. However, the lack of timeliness in updating the eCQMs causes confusion among physicians. Additionally, results from the eCQM reporting modules can misrepresent the quality of care delivered to patients.

    Strategic challenges

    Hospitals saw no return on investment

None of the organizations specifically itemized the costs attributable to implementing the eCQM measures, but all indicated spending an unexpectedly large and unsustainable amount of staff time on the effort. This included the very real costs of the nursing and physician time spent on activities to design the EHR screens and to perform extra work to enter data required for the eCQMs.

Some of the capital costs for the EHR implementations were expended prior to the institution of the CMS MU program, but eCQM-specific costs included reassignment of approximately three FTE quality analysts for each of the organizations, with up to 15 FTEs reassigned for Meaningful Use in general. These FTEs were distributed among IT staff, clinical staff assigned to the eCQM effort, and additional quality staff for the concurrent review process (in the three hospitals that performed concurrent review to enhance electronic quality reporting). On top of these direct technical staff resources, hospitals expended substantial nursing and physician time on activities to design the EHR screens and to enter data required for the eCQMs. That human resource effort was not quantified by any of the organizations.

One organization cited additional capital costs of $122K to purchase an additional certified eCQM reporting tool. That organization estimated an annual expense of approximately $1M to $1.5M per year on MU and the eCQMs. Over the life of the MU program, the incentives of $10M may cover about 50 percent of the specific incremental expenditures for the MU and eCQM programs, but will not cover the sizable historic capital investment. Each of the hospitals studied reported similar financial experiences. In addition, one hospital plans to purchase a new eCQM reporting tool and additional software and services for vocabulary management, which will add significant expense.

    Extensive efforts delayed other planned projects

Hospitals delayed other projects important to their strategic goals to accommodate MU, as the overall budget for health IT among the organizations did not change during the MU program implementation. The time and cost of eCQM implementation was regarded as an integral part of the MU effort, and was not tracked separately.

Administrative and clinical leadership alike were challenged by the resources consumed by eCQM implementation and were concerned about the need to delay other important initiatives, including:

Strategic initiatives to improve patient safety and clinical outcomes.

Initiatives to improve patient engagement and shared decision-making to improve outcomes and decrease unnecessary resource use.

    Readmission reduction initiatives.

Initiatives to improve the coordination of care within their communities and regions.

Some specific examples of delayed projects important to hospital strategic goals include:

Creating a function known as single sign-on to allow physicians and nurses to sign on to the hospital's EHR once and have access to all of the information available from different departmental systems (e.g., radiology or scheduling systems) without entering a name and password for each of these systems.

Streamlining EHR nursing documentation to be less cumbersome and more standardized.

"My nurse is nursing the chart now instead of nursing the patient."
-Physician Champion


Automating the documentation of medications administered by using bar codes to electronically identify the patient and the medication, and to assure the dose and time are correct.

Completing the use of all EHR components by implementing physician documentation, including narrative notes and some structured data, to significantly reduce or eliminate the paper record.

Leadership at all sites expressed concern that the eCQMs focus on internal hospital processes that generally work well already. The significant work effort required to document the information to support eCQM reporting is a distraction. All of the sites' leaders noted that the transition-of-care documents that are the focus of the MU programs are only minor steps in the process.

Hospitals developed workarounds to address the eCQM implementation challenges. The following table elaborates on these challenges, then describes adaptive workarounds hospitals created to manage them. The table also includes policy recommendations to address the cited challenges.

Program Design Challenges

Challenge: eCQMs were introduced before robust testing for validity, accuracy and feasibility.
Provider implications and adaptive workarounds: Modifications led to multiple iterations of tools and associated workflow redesign. Measure results were frequently inaccurate. Costs to implement were much higher than expected.
Policy recommendation: Reduce the pace of rollout, with fewer but more well-tested measures.

Challenge: Specifications were hard to find, lengthy and frequently modified to correct errors.
Provider implications and adaptive workarounds: Hospitals spent excessive time searching for correct versions or used specifications for chart-abstracted measures. These problems contributed to inaccurate measure results.
Policy recommendation: Provide clear guidance and a consistent, reliable process for eCQM development, updates, availability, access and implementation.

Challenge: Meaningful Use eCQMs require unfamiliar vocabularies for data elements (such as LOINC and SNOMED-CT).
Provider implications and adaptive workarounds: Hospitals struggled with unfamiliar vocabularies. They relied on eCQM reporting tools to manage the crosswalks between the new vocabularies in the eCQMs and the terms used locally, or purchased another vendor's service to support the new vocabularies. Hospitals incurred additional costs and voiced concerns about potential errors in coding or billing and associated risks of subsequent audits.
Policy recommendation: Support the development of an accurate, complete and validated crosswalk from SNOMED-CT to ICD-10-CM and ICD-10-PCS. Provide for adequate training and education.

Challenge: Sub-regulatory guidance to ignore data accuracy conflicts with both hospital goals for quality improvement and other program policy to report accurate quality data.
Provider implications and adaptive workarounds: Hospitals and clinicians saw no benefit from generating inaccurate data. Hospitals were worried that reporting data that they did not consider to be accurate would create a compliance issue.
Policy recommendation: Create an eCQM development, testing and certification program that supports accurate measurement.

Table 1. Summary of findings: challenges in eCQM program design and from technology, clinical and strategic perspectives; provider implications and adaptive workarounds; and corresponding policy recommendations addressing the cited challenges.


Technology Challenges

Challenge: EHRs are not designed to capture and enable re-use of information captured during the course of care for later eCQM reporting.
Provider implications and adaptive workarounds: Hospital clinical staff enter information in multiple places in EHRs to ensure data availability for eCQM reporting. Staff time devoted to manual re-entry of information that already exists elsewhere in the EHR erases efficiencies gained from the use of EHRs and undermines the presumed value of automation for quality reporting and improvement.
Policy recommendation: Improve health IT standards for EHRs and eCQM reporting tools to address usability and data management. Improve vendor tools to include workflow design flexibility.

Challenge: EHRs are not designed to capture information from other department information systems at the level of detail needed for eCQM reporting.
Provider implications and adaptive workarounds: Quality or other staff abstract information from other department information systems and enter it into the fields in the EHR required to report the eCQMs.
Policy recommendation: Improve EHRs and reporting tools to support intra-hospital interoperability.

Challenge: EHR vendors update and separately deliver individual EHR components for Meaningful Use.
Provider implications and adaptive workarounds: Hospitals conducted multiple updates and iterative testing.
Policy recommendation: Establish a predictable update process and schedule for eCQMs, with easy access and notification of updates. Require vendors to support the latest update on a specified schedule.

Clinical Challenges

Challenge: EHRs and certification requirements are not designed to support effective and efficient patient care workflows or draw data from them.
Provider implications and adaptive workarounds: Hospitals modified workflows solely to support adequate data capture, working iteratively with their vendors. Ultimately, hospitals substantively altered clinical workflow solely to accommodate the data needed for the eCQMs, with no benefit for patient care.
Policy implication: Give vendors more time to develop useful and accurate tools that support logical workflows and leverage data already in the EHR.

Challenge: Hospitals were unable to validate the eCQM results.
Provider implications and adaptive workarounds: Hospitals either reported the results of eCQMs as inaccurate but a work in progress, or did not report the eCQM results directly to physicians and nurses. Inaccurate results from the eCQM reporting tool, combined with increased workflow requirements, led to clinicians mistrusting the data and not using it for care improvement.
Policy implication: Create an eCQM development, testing and certification program that supports accurate measurement.

Challenge: Meaningful Use Stage 1 eCQM specifications are out-of-date and sometimes inconsistent with current care recommendations.
Provider implications and adaptive workarounds: Physicians who use up-to-date sets of orders may cause the hospital to have poorer performance as measured by the eCQMs.
Policy implication: Create a mechanism to update eCQMs to reflect new state-of-the-art clinical practice and to match updates to corresponding chart-abstracted measures.


Strategic Challenges

Challenge: Time and personnel requirements to implement eCQMs were far beyond expectations and excessive.
Provider implications and adaptive workarounds: Hospitals added tasks to existing IT and/or quality management staff responsibilities and delayed projects. Clinical staff expended considerable time documenting for eCQMs, with no perceived value for patient care. Excessive staff time spent on eCQMs delayed focus on other priorities such as reducing readmissions, improving patient safety or advancing care coordination.
Policy implication: Consider the effort required in future policy for eCQMs.

Challenge: The combination of time and effort involved and the inability to validate results meant hospitals saw no return on investment.
Provider implications and adaptive workarounds: Results damaged the credibility of hospital leadership and the Meaningful Use program as a whole.
Policy implication: Reduce the pace of rollout, with fewer but more well-tested measures that can be generated by tools that support logical workflows and leverage data already in EHRs.

    Recommendations

Based on the experience of these four advanced and committed hospitals and health system sites, we make the following recommendations:

1. Slow the pace of the transition to electronic quality reporting with fewer but better-tested measures, starting with Stage 2.

    The additional time would allow:

Policymakers to create a reliable policy process for eCQM implementation, a mechanism to provide eCQM updates, and a robust EHR testing/certification program;

Vendors to develop tools that support logical workflows, produce accurate measures and leverage all data already in the EHR; and

Hospitals to implement the tools in a way that supports their quality goals without excessive burden or risk to patients.

HHS should clearly articulate a long-term program vision for the development, testing and use of accurate and reliable eCQMs, and articulate a stable program roadmap. The timeline for implementing the eCQMs is unrealistic, as it emphasizes inclusion of regulatory requirements in advance of adequate development, vetting and testing of eCQM specifications for feasibility and clinical validity.

2. Make EHRs and eCQM reporting tools more flexible, so that data capture can be aligned with workflow, and interoperable, so that data can be shared across hospital department systems.

Encourage the ability of EHRs to reuse captured data, so that EHRs can support flexibility in workflow design and reduce the need to create multiple places in the EHR to repeat the same data.

3. Improve health IT standards for EHRs and eCQM reporting tools to address usability and data management to achieve MU program expectations.

The crux of the issues identified by implementation of the eCQMs is the need to add data fields in the EHR to capture the needed information. This "hard-wiring" approach is a tactical response to quickly address specific measurement needs, but it risks perpetuating the current process of implementing each measure directly in the EHR. All of the organizations noted that their current process primarily hard-wires measure data into the EHR for MU Stage 1 eCQMs and also for those included in the 2014 edition of eCQMs.

A standard set of terms for data used by eCQMs can be cross-walked by vendors to specific sections of their respective EHRs, so that any new data request is handled as a query to existing data in the EHR. Such a set of terms helps eCQM developers describe what they need and lets EHRs find it


without hard-wiring each measure. The appendix elaborates on these technical and data issues.
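As a rough sketch of this query-based approach, the example below maps standard eCQM data-element terms to the places a local EHR already stores that information. Every name in it (the terms, tables and fields) is a hypothetical illustration, not a real vendor schema or measure specification.

```python
# Sketch of the query-based alternative to hard-wiring measure fields:
# a vendor-maintained crosswalk from standard eCQM data-element terms to
# the locations where the local EHR already stores that information.
# All table, field and term names are hypothetical illustrations.
TERM_TO_LOCATION = {
    "Diagnosis, Active": ("problem_list", "snomed_code"),
    "Laboratory Test, Result": ("lab_results", "loinc_code"),
    "Medication, Administered": ("mar", "administered_med_code"),
}

def resolve_request(term):
    """Translate a measure's standard term into a (table, field) query target."""
    location = TERM_TO_LOCATION.get(term)
    if location is None:
        # An unmapped term is the signal that new data capture (and workflow
        # change) would be required -- the costly hard-wiring case.
        raise LookupError("No local source mapped for eCQM term: " + term)
    return location

table, field = resolve_request("Diagnosis, Active")
print(table, field)  # -> problem_list snomed_code
```

In this design, an unmapped term becomes an explicit signal that a measure would require new data capture rather than a query of existing data, which is exactly the costly case the hospitals described.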

Current EHR certification does not address EHR usability, which makes certified EHRs inefficient and counterproductive for providers. Usability standards should be applied to the certified EHR.

CMS should be the source that provides education and all answers regarding the eCQMs in its programs. The QualityNet website was helpful for questions, but a real-time help desk is needed to address obstacles.

4. Carefully test eCQMs for reliability and validity before adopting them in national programs. Implement eCQMs within hospitals as part of testing to ensure information flow is accurate and there is no adverse impact on quality and patient safety.

5. Provide clear guidance and tested tools to support successful hospital transition to increased electronic quality reporting requirements. For example, develop and disseminate an accurate, complete and validated crosswalk from SNOMED-CT, a vocabulary required by the 2014 eCQMs and MU Stage 2, to ICD-10-CM for conditions and ICD-10-PCS for procedures.

The effort to move to new vocabulary standards is very challenging. Although some of the data elements in the eCQMs use only SNOMED-CT (e.g., to define actions that are not directly billable), it is not currently in active use by clinicians as they directly care for patients. SNOMED-CT adds a vocabulary rather than replacing an existing one, as ICD-10 replaced ICD-9.

To be successful, crosswalks from SNOMED-CT to ICD-10-CM and ICD-10-PCS are essential, and they must be tested to show they are valid. All providers will also need a method to manage the crosswalk from the SNOMED-CT terms provided with the eCQMs.
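To make the idea concrete, the lookup each provider would need to manage can be as simple as the following sketch. The two code pairs are shown for illustration only and are not an endorsed or validated mapping.

```python
# Minimal sketch of a SNOMED-CT to ICD-10-CM crosswalk lookup of the kind
# providers would need to manage. The two mappings below are illustrative
# examples only; a real, validated crosswalk must cover the full terminology
# and flag one-to-many and unmappable cases for human review.
SNOMED_TO_ICD10CM = {
    "44054006": "E11.9",  # diabetes mellitus type 2
    "38341003": "I10",    # hypertensive disorder
}

def crosswalk(snomed_code):
    """Return the mapped ICD-10-CM code, or None if no validated mapping exists.

    Unmapped codes should be routed to human review rather than guessed, to
    avoid the coding/billing errors and audit risk the hospitals cited.
    """
    return SNOMED_TO_ICD10CM.get(snomed_code)

print(crosswalk("44054006"))  # -> E11.9
print(crosswalk("99999999"))  # -> None (route to review)
```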

The background and technical issues regarding eCQMs and health IT standards programs are discussed in the Appendix.

    Conclusion

This study demonstrates that future policy requirements for eCQMs must consider the effort required and the capability of reporting specific eCQMs from certified EHRs. Specifically, the challenges identified in this study must be used to inform future plans and efforts to ensure eCQMs provide meaningful performance results that lead to improved patient outcomes.

Many hospitals, including the four facilities studied for this report, have shown a commitment to using their EHRs and eCQMs as a method to automate the quality measurement process. Even the most advanced hospitals have been unable to use their EHRs to efficiently implement the eCQMs and achieve accurate results. Hospitals have expended large amounts of financial and personnel resources in their efforts to make the process work.

Champions of EHR adoption within the hospitals have had to work closely with their physician and nursing staffs to maintain their engagement in the use of EHRs and the credibility of the organizations' quality programs. They have been significantly challenged by Meaningful Use Program eCQMs that are complex, inaccurate and outdated, and that require incredible detail to be documented (often in duplicative ways) in a structured form in the EHR, with no perceived additional value to patient care.

The standards for eCQM development and usage are new and evolving, and need to be tested and improved to resolve the barriers and challenges experienced by providers in implementing the Meaningful Use Stage 1 eCQMs. The use of eCQMs derived from existing measures originally designed for data collection by chart abstraction highlights the challenges in seeking identical information from different data sources (e.g., structured EHR fields rather than narrative notes). The eCQM-reported measure rates are substantially different from the chart-abstracted measure rates, and the results cannot be compared. The current process is not sustainable. Feasibility, reliability, validity and usability must be addressed throughout the eCQM development and testing process at the data element level to truly address clinical workflow


challenges such as those experienced by the facilities examined for this report. Unless eCQMs are more robustly tested before incorporation in widespread national programs such as Meaningful Use, the reality is that hospitals will not be able to meet the new standards, and certified EHRs will not deliver on the promise of eCQMs to improve patient care, provide comparative quality information to consumers and reduce the burden of quality reporting.
