Transcript of "Getting to Scale: Spread," IA Graduate Seminar, May 18, 2010

Page 1

Getting to Scale: Spread

IA Graduate Seminar, May 18, 2010

Lisa Schilling, RN, MPH, VP, Healthcare Performance Improvement

Jim Bellows, PhD, Senior Director, Evaluation and Innovation

Page 2

Objectives for today

• Discuss models and thinking about what “spread” means and considerations for effective application

• Consider how to apply models in your area

• Access tools to help local sites assess readiness to spread and adopt practices

Page 3

What you have already learned

The Sequence for Improvement:

• Develop a change (theory and prediction)

• Test a change (Plan-Do-Study-Act cycles, under a variety of conditions)

• Implement a change (make part of routine operations)

• Spread the change to other locations

Don’t go from here … to here! (That is, don’t jump straight from developing a change to spreading it.)

Source: Bob Lloyd, IHI 2009

Page 4

More you already learned…

A x Q = E

A = strategies to build acceptance and commitment (culture, accountability)

Q = quality of the technical solution (both the change and the reliable application of the change)

Source: Jack Welch
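As a minimal, illustrative sketch (made-up 0-10 scores, and reading E as the overall effectiveness of the change), the multiplicative form is what makes low acceptance so costly:

```python
def effectiveness(acceptance, quality):
    """A x Q = E: overall effectiveness of a change, on illustrative 0-10 scales."""
    return acceptance * quality

# A technically excellent solution that nobody accepts...
print(effectiveness(acceptance=2, quality=9))  # 18 out of a possible 100
# ...is outperformed by a decent solution that people actually adopt.
print(effectiveness(acceptance=7, quality=6))  # 42 out of a possible 100
```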

Page 5

Influencers of Implementation and Spread

• Will: values alignment/prioritization, relationships, communication, goals/measures

• Ideas: change package, effective practices, learning

• Execution: infrastructure and resources, method, monitoring/feedback

Source: IHI 2009

Page 6

Conceptual Models for Spread

• Psychological: diffusion; transtheoretical model (readiness for change)

• Infrastructure: Breakthrough Series Collaborative model; IHI Framework for Spread; campaign model; multiplicative spread

• Other: hybrid models

Page 7

Many good recipes…

(Slide graphic comparing several improvement "recipes." Elements shown include: Ownership, Uniformity, Reliability, Sustainability, Cycles of Scrutiny; 4WD; a Compelling Need to Move; Destination; the 3 H's; What Gets Us There; and, under "OURS": Leadership alignment, Standardization/Systemization, Project Management, and Data that drives.)

Page 8

Elements Important for the Rate of Adoption

• Relative Advantage

• Compatibility

• Complexity

• Trialability

• Observability

Source: Everett Rogers

Page 9

Mental Model for Spread

Source: Institute for Healthcare Improvement, 2006.

Page 10

Applying this in Operations

Innovate

Test and Replicate: Waves

Just Do It

Test and Replicate: Collaboratives

Test and Replicate: Diffusion

(2x2 matrix: vertical axis = Organizational Alignment, High to Low; horizontal axis = Transferability, High to Low.)

Source: Stacey 2002

Page 11

Definitions

Just Do It

• Use project management to implement

• Go fast; replicate with little variation

Test and Replicate: Diffusion

• Implement in a few sites to increase level of agreement among stakeholders

• Encourage spread; go slow, with minimal centralized or highly coordinated effort

Test and Replicate: Collaboratives

• Use IHI’s Breakthrough Series Collaborative model

• Focused infrastructure, accountability, learning and sharing to create change package

Test and Replicate: Waves

• Pilot in 1-3 sites first, then spread to 5-10 sites, then to all the rest of the sites

• Drive spread through a highly coordinated and planned progression, with testing especially in the first two sites that implement the practice, to build will and establish the transferability of the practice

Innovate

• Use innovation methods such as IDEO

• Go slow, prototype, replicate, refine and spread

• High failure rate on the way to an effective practice

Page 12

More Tools to Apply in Operations

Readiness to Spread and Receive

Methods for Monitoring Spread

Supporting a Learning Culture

(The 2x2 matrix from Page 10 is repeated here: vertical axis = Organizational Alignment, High to Low; horizontal axis = Transferability, High to Low; quadrants labeled Innovate; Test and Replicate: Waves; Just Do It; Test and Replicate: Collaboratives; Test and Replicate: Diffusion. Numbers 1-4 mark the quadrants on the slide.)

Source: Stacey 2002

Page 13

A Tool to Lead Spread in 9 Steps

Determine organizational readiness for spread:
1. Start with the end in mind
2. Determine whether linked to strategic objectives of the organization
3. Assess readiness to spread (using tool)
4. Assess readiness to receive (using tool)

Develop a plan:
5. Choose spread approach
6. Develop a plan for spread

Execute on the plan:
7. Prepare for testing and implementation
8. Gather information over time to allow adjustment of the spread plan
9. Identify sites in need of support

Page 14

Spread Tool (steps 1-4): Determine organizational readiness for spread

Step 1. Start with the end in mind. How:
• Determine what is being spread
• Define target population and end state
• Establish timeframe to achieve scale
• Identify system-level metrics and outcomes
• Define "sites" participating in the effort

Step 2. Link to strategic objectives. How:
• Determine whether linked to strategic goal, align incentives
• Craft a compelling message and cascade
• Charter team

Step 3. Assess readiness to spread. How:
• Complete readiness-to-spread assessment with team
• Plan for sites based on learning
• Revisit scale, scope and speed

Step 4. Assess readiness to receive. How:
• Complete readiness-to-receive assessment with team
• Plan for sequencing based on learning
• Create monitoring and review plan

Page 15

Spread Tool (steps 5-6): Develop a plan

Step 5. Choose spread approach. How:
• Use results from steps 3 and 4 to determine alignment/transferability
• Choose spread approach
• Plan resources
• Create full description of change package
• Create a measurement plan, including impact on system performance
• Plan to monitor extent of spread: both the change package and the scale achieved

Step 6. Develop a plan for spread. How:
• Plan infrastructure and resources: elements to scale, new role requirements, technology
• Identify experts who will teach others the practice
• Determine physical and relationship linkages/proximity

Page 16

Spread Tool (steps 7-9): Execute on the plan

Step 7. Feedback to adopters. How:
• Implement practices to share learning and progress
• Monitor rate of adoption and determine adjustments needed: messages, capable messengers, transition issues

Step 8. Gather info and adjust plan. How:
• Manager support
• Sufficient time to test and implement
• Adopters understand methods
• Technical support

Step 9. Identify sites in need of support. How:
• Ensure middle management (or process owners) engaged throughout
• Determine sustainability metrics and thresholds that trigger specified remedial actions
• Plan content, technical and implementation support

Page 17

Tools to Plan and Lead Spread

Jim Bellows

Page 18

Topics

• Specify your goal(s) in spreading a successful practice

• Be clear about your role

• Assess practice readiness for export

• Assess site readiness to import

Page 19

What is your spread goal? Spread what? From where to where?

State your Project Goal here. Remember your goal should be S.M.A.R.T. (Specific, Measurable, Agreed Upon, Realistic, Time-based)

Objectives

List measures to support the Project Goal and Objectives.

Outcome Measure(s):

Process Measure(s):

Page 20

Typical spread goals

• Bring <practice> to our medical center from <Region>

• Help other medical centers adopt our successful practice

• Get all the units in our medical center to adopt <practice> that has been so successful in <pilot unit>

• Bring <practice> from <Region> to all the units in our medical center, beginning with <demo unit>

• Program Office says we all need to do <practice>, so let’s do it

Page 21

Your spread goal defines your role in supporting spread

Typical goal / Pattern / Role:

• Bring <practice> to our medical center from <Region>. Pattern: External → 1. Role: Importer.

• Help other medical centers adopt our successful practice. Pattern: 1 → External. Role: Exporter.

• Get all the units in our medical center to adopt <practice> that has been so successful in <pilot unit>. Pattern: 1 → Many. Role: Distributor.

• Bring <practice> from <Region> to all the units in our medical center, beginning with <demo unit>. Pattern: External → 1 → Many. Role: Importer-Distributor.

Page 22

Tasks will depend on your role in spread

Importer (External → 1)

Confirm practice readiness for export

Assess your site readiness for import

Choose an import model

Import!

Exporter (1 → External)

Confirm practice readiness for export

Market the practice; find a distributor

Assess import site readiness for import

Choose an import model

Export!

Distributor (1 → Many)

Confirm practice readiness for export

Assess alignment and readiness across all sites

Choose a distribution model

Distribute!

Importer-Distributor (External → 1 → Many)

Confirm practice readiness for export

Assess alignment and readiness across all sites

Choose a model for import and distribution

Choose a demonstration site

Import! (and evaluate)

Distribute!

Page 23

Practice Readiness-for-Export Assessment

Page 24

Why assess Readiness for Export?

Have you ever…

… tried to import a practice that was successful for the innovator, but you just couldn’t make it work?

… tried unsuccessfully to interest others in a practice that seemed great to you?

… had a senior leader ask you to import/distribute a practice that was: Too complicated? Expensive, with little return? Not the best way to get the job done?

Solution? Due diligence – don’t conclude too quickly that a practice is ready for export

Page 25

Readiness for Export Assessment

KP Readiness-for-Spread Assessment

About This Tool
The purpose of this tool is to help KP succeed in spreading successful practices widely. One key factor is picking the ripest opportunities – some practices aren’t really ready to be spread widely. This tool can help program champions and KP leadership understand whether a promising practice is ripe for successful spread across KP. Using it can prevent wasting energy on trying to spread a practice that has not yet been developed sufficiently. The tool can highlight the aspects of a practice or its documentation that might need to be strengthened to support wide-scale spread. It is meant as a discussion tool to support informed decision making and to help set realistic expectations. It is not intended to create “hoops to jump through,” or to interfere with spread efforts that enjoy strong support.

Who To Involve in the Assessment Process
The assessment can be used in two distinct settings, described below along with ideal participants.

1. Push – Program champions can use the tool to address the question: “Could my program or practice be spread widely from its current demonstration site(s)?” Facilitator: A KP Improvement Advisor or other person, not directly responsible for the program, who is knowledgeable about practice transfer. Participants: Program champion, implementation lead, front-line staff.

2. Pull – Senior leaders can use the tool to address the question: “Is this program or practice ripe for transfer into my area or Programwide?” Facilitator: Leader or staff of a Program Office or Regional unit responsible for supporting spread of successful practices. Participants: Program champion, implementation lead, Improvement Advisor, and two or more “peer reviewers” who can provide an independent perspective.

Instructions
1. Scan through the four main sections to get an overview of the main areas for assessment.
2. The rows within each section present key elements of readiness for successful spread. For each element, simple statements illustrate different levels of readiness, from Start-Up to Well Established.
3. For each row: First, each participant rates the practice on their own. Circle all the statements that describe the practice. Be realistic – assess the practice as it is, not how you hope it will be. Use judgment in deciding which statements to circle – do your best to capture the spirit of the assessment, not details of the wording. Then the facilitator leads a brief discussion to produce a “sense of the group.” Record the consensus on a master copy of the assessment tool. Don’t get hung up on unanimity. It’s OK to record a range of responses.
4. For each section: First, each participant assigns an Overall score on their own, using the 1-10 scale. Circle the score. Use judgment, considering all the elements in the section. The Overall score needn’t be an average of scores representing each element. In some cases it might make sense for the Overall score to be based on the lowest score for any element. Then the facilitator leads a brief discussion to produce a “sense of the group.” Circle the consensus score on a master copy of the assessment tool. If some participants dissent from the consensus, note the range of outliers.
5. When scores are completed for all four sections, go to the Scoring and Summary page and follow the instructions. The Scoring and Summary page also includes simple recommendations about where to focus energy in strengthening readiness for spread.
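Step 4 leaves the Overall score to judgment. A small sketch of two of the options it mentions (the function name and example values are illustrative only):

```python
def overall_score(element_scores, conservative=True):
    """Suggest a section Overall score (1-10) from element-level judgments.

    The tool leaves this to judgment; this sketch shows two options it mentions:
    take the lowest element score (conservative) or use a rounded average.
    """
    if conservative:
        return min(element_scores)
    return round(sum(element_scores) / len(element_scores))

print(overall_score([7, 4, 8]))                      # 4 (lowest element)
print(overall_score([7, 4, 8], conservative=False))  # 6 (rounded average)
```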

Facilitator – Please complete the following information on the master copy.

Facilitator (name, position): ____________________

Date: ________/________/________

Practice Assessed (title or description): ____________________

Participants (name, position): 1. ____ 2. ____ 3. ____ 4. ____ 5. ____ 6. ____

Regional/Medical Center affiliations: ____________________

For more information about this tool or to provide feedback on the tool, please contact either: [email protected] – Senior Director, Center for Evaluation and Innovation, Care Management Institute, or [email protected] – VP for Health Care Performance Improvement and Execution Strategy. We welcome feedback and suggestions!

1. Impact on Primary Objective

The first criteria for a promising practice relate to impact on the primary objective addressed. What is the one primary objective of the practice or intervention assessed? (Patient Safety, Physician/Staff Work Experience, Effectiveness of Care, Equity, Patient Experience, Efficiency.) What is the primary measure of impact? __________________________________________________

Element Start-Up Well-Established

Magnitude Number of potentially affected members is unknown or is less than 0.1% of total membership

No impact has yet been observed, or relative impact is less than 5%

Potentially affects 0.1%-1% of members

Relative impact on primary performance measure(s) is 5-10% (e.g. improvement from 40% to 43%)

Potentially affects 1-10% of members

Relative impact on primary performance measure(s) is 11-20% (e.g. improvement from 40% to 46%)

Potentially affects all members, or a subpopulation of 10% or more (i.e. all older adults, all members with cardiovascular disease, all members with an inpatient stay or surgical procedure, etc.)

Relative impact on primary performance measure(s) is more than 20% (e.g. improvement from 40% to 50%)

Confidence Impact has not been assessed

Compelling anecdotal information

OR… Measured improvement in processes or factors of interest, but measurement is less than robust (e.g. possible confounding, no trending, no comparison group)

Robustly measured improvement in processes or factors that are plausibly related to downstream outcomes, but the causal relationship has not been well established (e.g. process reliability, improved follow-up after discharge, increased use of KP.org)

OR… Measured improvement in downstream outcomes or well-established risk factors, but measurement is less than robust (e.g. possible confounding, no trending, no comparison group)

Robustly measured improvement in real, “downstream” outcomes:

Downstream outcomes: Fewer never events, reduced complications of chronic disease, improved satisfaction, etc.

Robustly measured: Trended annotated run charts show significant improvement OR pre/post analysis with comparison group

OR… Robustly measured improvement in risk factors that have a clear, strongly established, causal relationship to downstream outcomes (e.g. improved hand hygiene, greater use of medications that reduce heart attack risk, reduced waiting times)

Improvement has been robustly measured in more than 1 site and has been sustained over time.

Overall (use judgment, based on all above)

1 2 3 4 5 6 7 8 9 10

Comments (Record here the biggest gaps to address and the greatest strengths to build on.)
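As a quick worked check of the relative-impact examples above (assuming, as the examples imply, relative impact = (new - old) / old):

```python
def relative_impact(baseline, new):
    """Relative impact on a performance measure, e.g. 0.40 -> 0.46 is a 15% relative improvement."""
    return (new - baseline) / baseline

print(relative_impact(0.40, 0.43))  # 0.075 -> 7.5%, in the 5-10% band
print(relative_impact(0.40, 0.46))  # 0.15  -> 15%, in the 11-20% band
print(relative_impact(0.40, 0.50))  # 0.25  -> 25%, in the "more than 20%" band
```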

2. Impact on Other Aspects of Care

Successful spread of promising practices is affected not only by their impact on the primary objective but also by intended or unintended impact on other aspects of care delivery.

Do not rate here the impact on primary objective rated in Section 1.

Element Start-Up Well-Established

Patient Safety (consider factors including process reliability and safety culture)

Potential for adverse impact has not been assessed

OR… Potential issues have been identified but not addressed

Potential issues have been identified and mitigation measures have been implemented

Risks have been assessed by one or more subject matter experts (SMEs) and are believed to be absent or negligible

SME: _____________________

Data demonstrate positive impact or no adverse impact

Effectiveness of Care (consider factors including delivery of evidence-based care and addressing patient needs)

Potential for adverse impact has not been assessed

OR… Potential issues have been identified but not addressed

Potential issues have been identified and mitigation measures have been implemented

Risks have been assessed by one or more subject matter experts (SMEs) and are believed to be absent or negligible

SME: _____________________

Data demonstrate positive impact or no adverse impact

Patient Experience (consider factors including service, clinician-patient relationships, and personalization)

Potential for adverse impact has not been assessed

OR… Potential issues have been identified but not addressed

Potential issues have been identified and mitigation measures have been implemented

Risks have been assessed by one or more subject matter experts (SMEs) and are believed to be absent or negligible

SME: _____________________

Data demonstrate positive impact or no adverse impact

Physician/Staff Work Experience (consider factors including simplicity and fit with existing processes)

Potential for adverse impact has not been assessed

OR… Potential issues have been identified but not addressed

Potential issues have been identified and mitigation measures have been implemented

Risks have been assessed by one or more subject matter experts (SMEs) and are believed to be absent or negligible

SME: _____________________

Data demonstrate positive impact or no adverse impact

Equity (consider equity across groups defined by health literacy, gender, race/ethnicity, and/or sexual orientation)

Potential for adverse impact has not been assessed

OR… Potential issues have been identified but not addressed

Potential issues have been identified and mitigation measures have been implemented

Risks have been assessed by one or more subject matter experts (SMEs) and are believed to be absent or negligible

SME: _____________________

Data demonstrate positive impact or no adverse impact

Overall (use judgment, based on all above)

1 2 3 4 5 6 7 8 9 10

Comments (Record here the biggest gaps to address and the greatest strengths to build on.)

3. Business Case

Promising practices are unlikely to spread without a clear understanding of their business case. All other factors being equal, practices with positive business cases are more likely to spread successfully. A positive business case means a positive return on investment – not only that the financial benefits (cost savings, cost avoidance, or revenue enhancement) exceed the costs, but that the benefits accrue to the entity that bears the costs, the benefits are as certain as the costs, the benefits develop in a reasonable time frame, and the potential benefits can be harvested into real “hard green” dollars (e.g. reduced admissions translated into decreased hospital costs per member).

Element Start-Up Well-Established

Costs (operating costs and start-up costs)

Substantial operating cost would require significant reallocation of resources

Modest operating costs can be covered within existing operations budgets, but start-up would require investment from other sources

Modest operating and start-up costs can be covered within existing operations budgets

No costs for implementation – changes work of existing staff rather than adding staff

No startup investment

Savings (cost reduction or cost avoidance)

No savings anticipated

Modest savings are projected but have not been demonstrated

Substantial savings have been projected but not documented

Measurement of savings is less than robust (e.g. possible confounding, no trending, no comparison group, etc.)

Substantial savings have been documented

Robust measurement of savings, e.g. trended annotated run charts show significant improvement OR pre/post analysis with comparison group

Revenue (increased total revenue or revenue per member)

No revenue enhancement anticipated

Modest revenue enhancement is projected but has not been demonstrated

Substantial revenue enhancement is projected but not yet documented

Data on revenue enhancement is not robust (e.g. possible confounding, no trending, no comparison group, etc.)

Substantial revenue enhancement has been documented

Robust measurement of revenue enhancement, e.g. trended run charts OR pre/post analysis with comparison group

Return on Investment

Financial costs exceed financial benefits (the practice may still be justified on the basis of other benefits, e.g. compliance)

Financial costs are roughly equal to financial benefits

Financial benefits substantially exceed costs, but transfers would be needed to return the benefits to the entity that bore the costs

Financial benefits substantially exceed costs, and accrue to the entity that bears the costs

Certainty and Timing

Costs are certain but benefits are less certain

Benefits have been demonstrated as robustly as costs, but will accrue 3 or more years later

Benefits have been demonstrated as robustly as costs, but will accrue 1-2 years later

Benefits have been demonstrated as robustly as costs and will occur during the same budget year

Harvestability Harvesting potential benefits could require painful measures, such as closing facilities or eliminating positions

Translating potential benefits into real dollars would require no more than routine management efficiencies

Benefits could translate directly into real dollars, but other actors could undermine (i.e. contract hospitals could raise prices if KP utilization decreases)

Benefits would translate directly into real dollars (e.g. reduced drug costs)

Overall (use judgment, based on all above)

1 2 3 4 5 6 7 8 9 10

Comments (Record here the biggest gaps to address and the greatest strengths to build on.)

4. Transferability

Research in and beyond health care has shown that promising practices are most likely to spread if they can be readily observed at a demonstration site and then piloted locally, are simple, can be adapted to local needs, fit with existing work culture and norms, and align well with leadership goals and strategies. Practices are more likely to spread further as/if they mature – being adopted and sustained by multiple sites and attaining reliable implementation among earlier adopters. Support structures and tools help accelerate transfer from site to site.

Element Start-Up Well-Established

Observability No pilot sites are available to observe OR benefits are not readily observable

Processes and benefits can be observed by potential adopters at 1 pilot site

Processes and benefits are readily observable at 2-4 pilot sites

Processes and benefits can be readily observed at scale in 2+ KP Regions

Simplicity Requires participation by 4+ units or functions (e.g. primary care, ER, and laboratory)

Requires participation by 2-3 units or functions; interactions must be negotiated and tested

Requires participation by 2-3 units or functions, but handoffs and accountabilities are clear and simple

Can be implemented within a single organizational unit and without broader modification of current delivery system

Adaptability Adaptations have resulted in failure to achieve results anticipated

Adaptation has occurred over time at 1 pilot site without compromising results

Adaptation has occurred, without compromising results, at 2+ diverse sites that adopted the practice

Key components are known and simple; the range of acceptable variation has been identified and communicated

Cultural Fit Implementation requires changing significant aspects of work culture and roles

Implementation requires some adjustment of work culture or roles, but no fundamental changes

Fits smoothly with existing work culture and norms

Fits smoothly with existing work cultures, and goes beyond to fit with staff hopes and desires

Goal Alignment Not clearly aligned with KP goals/strategies at national or local level

Directly supportive of lower-tier but not top-tier KP goals/strategies

Cascading Program/Regional/local alignment is missing or weak

Arguably aligned with top-tier KP goals/strategies, but impact is less than direct and substantial

Direct, measurable, substantial impact on one of KP’s top 10 goals/strategies

Leadership has provided an unambiguous message that the status quo is unacceptable, with clear Program/Regional/local alignment

Sustainability Not yet sustained for 6+ months at any KP site

Reliability and performance data are not available

Implementation sustained 6-12 months at 1+ KP sites

Performance is measured, but no control charts show reliability

Performance has been sustained for 1+ year at one site

80-90% reliability has been documented in control chart(s)

Data demonstrates sustained performance for 1+ year

95%+ reliability has been documented in control chart(s), with balancing measure(s)

Implementation Support

No change package is available

Pilot site champions are not readily available for consultation

No comprehensive change package, but sample tools and resources are shared

Pilot site champion(s) are available for consultation by phone

Change package is available, with tools, metrics, case studies, etc.

Pilot site champion(s) are available for on-site troubleshooting

IT tools are built but not transferable

Active knowledge management supports ongoing improvement; tacit knowledge transfer is underway among adopters

Decision support and work flow tools are available in KPHC or other systems

Overall (use judgment, based on all above)

1 2 3 4 5 6 7 8 9 10

Comments (Record here the biggest gaps to address and the greatest strengths to build on.)

Page 26

Readiness for Export covers four areas

1. Impact on Primary Objective

• Magnitude

• Confidence

2. Impact on Other Aspects of Care

• Patient Safety

• Effectiveness of Care

• Patient Experience

• Physician/Staff Work Experience

• Equity

3. Business Case

• Costs

• Savings

• Revenue

• Return on Investment

• Certainty and Timing

• Harvestability

4. Transferability

• Observability

• Simplicity

• Adaptability

• Cultural Fit

• Goal Alignment

• Sustainability

• Implementation Support

Page 27

1. Impact on Primary Objective

Element Start-Up Well-Established

Magnitude No impact has yet been observed, or <5%

Impact on primary performance measure(s) is 5-10%

Relative impact on primary performance metric(s) is 11-20%

Relative impact on primary performance measure(s) is more than 20%

Confidence Impact has not been assessed

Compelling anecdotes OR… weakly measured improvement in processes

Robustly measured improvement in processes

Robustly measured improvement in real, “downstream” outcomes (e.g. fewer never events, improved satisfaction, etc.)

Overall (based on all above)

1 2 3 4 5 6 7 8 9 10

Using the Readiness for Export tool – Section 1

Page 28

2. Impact on Other Aspects of Care

Element Start-Up Well-Established

Patient Experience

Potential impact has not been assessed

Potential issues have been identified and mitigated

Risks have been assessed by SME and are believed to be negligible

Data demonstrate positive impact or no adverse impact

Physician/Staff Work Experience

Potential impact has not been assessed

Potential issues have been identified and mitigated

Risks have been assessed by SME and are believed to be negligible

Data demonstrate positive impact or no adverse impact

Overall (based on all above)

1 2 3 4 5 6 7 8 9 10

Using the Readiness for Export tool – Section 2

Page 29

3. Business Case

Element Start-Up Well-Established

Savings No savings anticipated

Modest savings are projected but not demonstrated

Substantial savings have been projected but not documented

Substantial savings have been documented

Certainty and Timing

Costs are certain but benefits are less certain

Benefits have been demonstrated as robustly as costs, but will accrue 3+ years later

Benefits have been demonstrated as robustly as costs, but will accrue 1-2 years later

Benefits have been demonstrated as robustly as costs and will occur during the same budget year

Harvestability

Harvesting potential benefits could require painful measures

Translating benefits into real dollars would require only routine efficiencies

Benefits could translate directly into real dollars, but might not (i.e. contract hospitals could raise prices)

Benefits would translate directly into real dollars (e.g. reduced drug costs)

Using the Readiness for Export tool – Section 3

Page 30

4. Transferability

Element Start-Up Well-Established

Simplicity Requires participation by 4+ units or functions

Requires participation by 2-3 units; interactions must be tested

Requires participation by 2-3 units; accountabilities are clear and simple

Requires no modification of the current delivery system

Adaptability Adaptations have resulted in failure

Adaptation has occurred at 1 pilot site with good results

Adaptation has occurred, without compromising results, at 2+ diverse sites that adopted the practice

Key components are known; acceptable variation is known

Cultural Fit Requires significant changes in work culture and roles

Implementation requires some adjustment, but no fundamental changes

Fits smoothly with existing work culture and norms

Fits smoothly with staff hopes and desires

Using the Readiness for Export tool – Section 4

Page 31

Readiness for Export Assessment

(The full KP Readiness-for-Spread Assessment tool shown on Page 25 is reproduced on this slide.)

Page 32

Try using the Readiness for Export tool – Scoring

Section / Recommendations by Score:

1. Impact on Primary Objective (Overall score: ____ Weakest element(s): ____)

1-4: Focus on improving performance and measurement at the pilot site

5-7: Begin assessing impact on other aspects of care delivery while continuing to improve performance and documentation

8-10: Focus your energy elsewhere (but sustain the gains; don’t let performance slip)

2. Impact on Other Aspects of Care (Overall score: ____ Weakest element(s): ____)

1-4: It’s time to look beyond your primary objective; bring in others with responsibilities for aspects of care that might be affected

5-7: Strengthen documentation and/or measurement of impacts on other aspects of care

8-10: Focus your energy elsewhere (but keep looking for synergies)

(The scoring page continues with Sections 3 and 4. Slide callouts: “Impact hasn’t been measured well enough”; “Is there any impact on Patient Experience?”)
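A small sketch of how a section’s Overall score maps to the recommendation bands above, using Section 1 as the example (the band text is taken from the table; the function name is illustrative):

```python
def section1_recommendation(overall_score):
    """Map a Section 1 Overall score (1-10) to the recommendation bands above."""
    if not 1 <= overall_score <= 10:
        raise ValueError("Overall score must be between 1 and 10")
    if overall_score <= 4:
        return "Focus on improving performance and measurement at the pilot site"
    if overall_score <= 7:
        return ("Begin assessing impact on other aspects of care delivery "
                "while continuing to improve performance and documentation")
    return "Focus your energy elsewhere (but sustain the gains; don't let performance slip)"

print(section1_recommendation(6))
```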

Page 33

Interpreting the Readiness for Export scores

• This isn’t a pass/fail test. Low ratings in some areas are an alert to challenges you may face.

• What you do with the scores depends on your role:

Importer: Consider a different practice? Or proceed with your eyes wide open.

Exporter: Keep developing your practice; consider partnering with others.

Distributor: Review your goals carefully; if you proceed, consider spreading slowly and embracing variation.

Page 34

Embrace the “funnel” – Some innovations should spread (…some shouldn’t)

Keep Perspective

Be realistic about readiness for spread, and promote an innovation only when its value and transferability have been demonstrated

Assess transferability rigorously: trialability, simplicity, fit with KP culture, etc.

Evaluate!

(Funnel diagram labels: Great idea, Collaborative, Action Plans, True success.)

Page 35

Site Readiness-to-Import Assessment

Jim Bellows

Page 36

Why assess Readiness to Import?

Have you ever…

… tried to import a practice that was successful for the innovator, but you just couldn’t make it work at your site?

… tried unsuccessfully to interest others in a practice that seemed great to you?

… had a senior leader ask you to import/distribute a practice when: Your organization was focused on other goals? Leadership was not aligned, giving conflicting direction? People were dealing with significant changes or disruptions?

Solution? Due diligence – don’t conclude too quickly that your organization is ready to receive a practice from elsewhere, no matter how good it seems

Page 37

Try it for your project!

Page 38

11 Key Components of Readiness-to-Import

Organization:
• Sponsorship & leadership
• Oversight infrastructure
• Strategic alignment with the organization’s goals & priorities
• Cultural readiness

Resources:
• Staff
• Identified project management & championship
• Training requirements
• Space
• Technology requirements
• Operations infrastructure
• Measurement & monitoring

Page 39

Sponsorship and Leadership

Key Component: Sponsorship & leadership

Definition:
• Establish genuine commitment and support for changes, rather than simple compliance
• Get involved in the change, understand it, and promote it (Express, Model, & Reinforce)
• Take personal responsibility and allocate sufficient time and resources to ensure the change is sustained
• Trustworthy, influential, respected and believable

Rating Scale (0-4): Consider the targeted sponsors for this initiative.
0 = No evidence that sponsor behaviors have been exhibited; no desire to sponsor this initiative
1 = Limited evidence of sponsor behaviors; limited desire
2 = General evidence of sponsor behaviors, with inconsistent performance; some desire
3 = Evidence of sponsor behaviors; desire to sponsor this initiative
4 = Evidence of sponsor behaviors sustained over time; strong desire to sponsor this initiative

Page 40: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

40

Strategic Alignment with Goals and Priorities

Key Component: Strategic Alignment with Goals & Priorities

Definition:
• Change aligns with strategic priorities and the organizational goals
• The specifics of what is being asked are clear, the benefits (including ROI) apparent, and the impact on affected department(s)/functional units defined

Rating Scale (0-4), considering the alignment of this initiative with goals and priorities, as well as the impact on those affected:
0 = No alignment with priorities; impact on affected unit(s) is unclear
1 = Some alignment with priorities OR goals; impact on the affected unit(s) is substantial given benefits
2 = Some alignment with priorities AND goals; impact on affected unit(s) is justifiable
3 = Adequate alignment with priorities and goals
4 = Complete alignment with priorities and goals; impact on affected unit(s) is minimal

Page 41: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

41

Technology Requirements

Key Component: Technology Requirements

Definition:
• There is enough technology of the right type to support the change
• There is a commitment to budget for long-term maintenance and sustainability of the technology

Rating Scale (0-4), considering technology implementation and sustainability requirements:
0 = Requirements have not been adequately defined
1 = Requirements have been adequately defined, but there are significant budget gaps
2 = Requirements adequately defined; some budget gaps
3 = Requirements adequately defined; no budget gaps
4 = Requirements adequately defined; no budget gaps; sponsor commitment to maintaining technology over time

Page 42: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

42

Try it for your project!

Page 43: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

43

Scoring the tool provides general guidance

Guidance on interpreting scores

Any score of 0 or 1 on a single “readiness” attribute: strong consideration should be given to addressing that attribute before implementation, unless there is a clear rationale for why it is not important to the success of the project

Total score of 22 or less: strong consideration should be given to not proceeding until the main drivers of the low score are adequately addressed

Total score of 23 to 33: it makes sense to proceed with caution, addressing the trouble spots identified in this assessment

Total score of 34 or more: indicates a high likelihood of successful implementation, with appropriate attention to any single low score as described in the first item above

Use judgment in interpreting the scores and deciding how to proceed
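To make the guidance concrete, here is a minimal sketch of the scoring logic described above. It is not the official tool; the component list mirrors the 11 key components, but the function name and data structure are illustrative.

```python
# Hypothetical sketch of the Readiness-to-Import score interpretation; illustrative only.

READINESS_COMPONENTS = [
    "Sponsorship & leadership", "Oversight Infrastructure",
    "Strategic Alignment with Goals & Priorities", "Cultural Readiness",
    "Staff", "Identified Project Management & Championship",
    "Training requirements", "Space", "Technology Requirements",
    "Operations Infrastructure", "Measurement & Monitoring",
]

def interpret_readiness(scores):
    """scores: dict mapping each of the 11 components to a 0-4 rating."""
    missing = set(READINESS_COMPONENTS) - set(scores)
    if missing:
        raise ValueError(f"Rate every component; missing: {sorted(missing)}")
    low = [c for c in READINESS_COMPONENTS if scores[c] <= 1]   # attributes to address first
    total = sum(scores[c] for c in READINESS_COMPONENTS)        # maximum possible is 44
    if total <= 22:
        advice = "Consider not proceeding until the main drivers of the low score are addressed."
    elif total <= 33:
        advice = "Proceed with caution, addressing the trouble spots identified."
    else:
        advice = "High likelihood of successful implementation."
    if low:
        advice += " First address (or justify ignoring) low-rated components: " + ", ".join(low)
    return f"Total score {total}/44. {advice}"
```

For example, rating all 11 components a 3 gives a total of 33, which falls in the “proceed with caution” band.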

Page 44: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

44

The Readiness Assessments can guide your decisions about spread

[2x2 diagram, adapted from Stacey 2002: the axes are Readiness for Export / Transferability (high to low) and Readiness to Import / Alignment (high to low). The regions map combinations of the two to spread approaches: Just Do It, Test and Replicate (Waves, Collaboratives, or Diffusion), and Innovate.]

Source: Stacey 2002

Page 45: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

45

How much variation? Adapt locally vs. copy exactly

Adapt locally
Theory (Paul Plsek): health care is a Complex Adaptive System; find local attractors; use only simple rules
Strength: spread is more likely to occur if importers can adapt to their needs

Copy exactly
Theory (Gabriel Szulanski): we’re not as smart as we think; experience beats cleverness; first import, then improve
Strength: spread is more likely to get results if importers work with exporters to learn a proven model

Page 46: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

Measurement and Feedback for Spread

Lisa Schilling

Page 47: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

47

Measuring Spread

• Rate of adoption

• Practice reliability map across sites

• “Energy map” of initiatives across sites

• Outcomes

Page 48: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

48

Rate of Adoption – Sustainability and Penetrance

Page 49: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

49

Rate of Adoption: Multiple Ideas

[Four run charts, Iowa Health System (10 hospitals in Iowa and Illinois), showing system-wide diffusion of four practices: Executive Walk Arounds, Hazard Areas (at least 1 per facility), Unit Briefings, and Medication FMEA. Each chart plots the number of facilities participating (0-10) by month from Jun-01 through Nov-02, with learning sessions LS1 and LS2 marked on the timeline.]

Source: IHI, Iowa Health System 2010
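The underlying measure is simple: for each practice, count how many facilities are using it as of each month. A minimal sketch follows; the facility names, dates, and counts are hypothetical, not the Iowa data.

```python
# Sketch of a rate-of-adoption run chart: cumulative count of facilities that have
# adopted a practice by each month. All names and dates are made up for illustration.
from datetime import date

adoption_dates = {           # facility -> month the practice entered routine use
    "Facility A": date(2001, 7, 1),
    "Facility B": date(2001, 9, 1),
    "Facility C": date(2002, 1, 1),
    "Facility D": date(2002, 4, 1),
}
total_facilities = 10

months = [date(2001, m, 1) for m in range(6, 13)] + [date(2002, m, 1) for m in range(1, 12)]
for month in months:
    adopters = sum(1 for d in adoption_dates.values() if d <= month)
    print(f"{month:%b-%y}: {adopters} of {total_facilities} facilities")
```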

Page 50: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

50

Before: Monitoring Reliable Practice Across Sites

Date: 4/18/2006   Hospital: Moa   Shift: (not specified)

Units audited, by wave: 1 East, 1 West, 2 East, 2 West (wave 1); 3 East-Tele, 3 West-Tele, 4 East, M/B, Malama W (wave 2); Malama E, Peds (wave 3). Each row shows the % Yes values per unit as reported on the slide.

SP 1CN | Was the staffing assignment complete before your arrival on shift? | 100%, 100%, 67%, 100%, 100%, 100%, 67%, 100%, 0%, 100%
SP 2CN | Was patient care information printed/prepared before you came on shift? | 67%, 100%, 67%, 67%, 100%, 100%, 100%, 100%, 100%, 67%
SP 1 | Do you know the name of the nurse who took care of your patients on the previous shift? | 93%, 100%, 86%, 100%, 70%, 83%, 92%, 78%, 100%, 82%
SP 2 | Was the patient care information report printed prior to your arrival? | 93%, 100%, 71%, 92%, 70%, 83%, 92%, 78%, 92%, 55%
SP 3 | Was the information in the kardex and neuron in agreement at the beginning of the shift? | 93%, 93%, 50%, 75%, 70%, 58%, 92%, 72%, 62%, 27%
G 4 | Is a patient care board available in your room? | 100%, 100%, 100%, 100%, 80%, 83%, 92%, 78%, 62%, 45%
G 5 | Was the plan of care written on the board from the prior shift? | 100%, 79%, 21%, 42%, 10%, 17%, 58%, 61%, 8%, 0%
B 6 | Did shift change happen face-to-face? | 100%, 100%, 100%, 100%, 80%, 92%, 100%, 94%, 77%, 27%
B 7 | Did shift change happen at the bedside? | 100%, 93%, 79%, 83%, 30%, 67%, 67%, 83%, 0%, 0%
B 8 | Did you receive report in ISBAR format? | 86%, 50%, 71%, 92%, 40%, 75%, 58%, 44%, 0%, 9%
B 9 | For patients who could have teach-back, did you do patient teach-back during the oncoming report? | 50%, 29%, 21%, 42%, 20%, 42%, 25%, 33%, 0%, 0%
B 10 | Was the patient's understanding of the plan similar to your plan of care? | 86%, 86%, 93%, 92%, 80%, 75%, 75%, 83%, 31%, 0%
B 11 | Was the goal for plan of care achieved from the previous shift? | 93%, 93%, 71%, 75%, 40%, 83%, 75%, 83%, 69%, 0%
B 14 | Do you plan on giving shift change report in ISBAR format? | 79%, 64%, 86%, 92%, 40%, 67%, 58%, 67%, 15%, 9%

Legend (row shading): Pink = shift preparation; Orange = goal board; Yellow = bedside round with patient teach-back use

Source: KP Hawaii NKE 2007

Page 51: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

51

After: Monitoring Reliable Practice Across Sites

Date: 8/22/2006   Hospital: Moa   Shift: All Shifts

Units audited, by wave: 1 East, 1 West, 2 East, 2 West (wave 1); 3 East-Tele, 3 West-Tele, 4 East, M/B, Malama W (wave 2); Malama E, Peds (wave 3). Each row shows the % Yes values per unit as reported on the slide.

SP 1CN | Was the staffing assignment complete before your arrival on shift? | 100%, 100%, 33%, 100%, 100%, 100%, 100%, 100%, 100%, 33%
SP 2CN | Was patient care information printed/prepared before you came on shift? | 100%, 100%, 33%, 100%, 0%, 100%, 100%, 100%, 100%, 33%
SP 1 | Do you know the name of the nurse who took care of your patients on the previous shift? | 100%, 89%, 91%, 100%, 100%, 67%, 100%, 82%, 100%, 100%
SP 2 | Was the patient care information report printed prior to your arrival? | 92%, 100%, 73%, 100%, 100%, 100%, 100%, 64%, 70%, 55%
SP 3 | Was the information in the kardex and neuron in agreement at the beginning of the shift? | 77%, 100%, 64%, 100%, 100%, 67%, 70%, 91%, 90%, 91%
G 4 | Is a patient care board available in your room? | 100%, 100%, 100%, 100%, 100%, 100%, 100%, 91%, 90%, 100%
G 5 | Was the plan of care written on the board from the prior shift? | 69%, 100%, 82%, 77%, 100%, 100%, 80%, 64%, 30%, 45%
B 6 | Did shift change happen face-to-face? | 100%, 100%, 100%, 100%, 100%, 100%, 100%, 100%, 100%, 100%
B 7 | Did shift change happen at the bedside? | 92%, 67%, 73%, 85%, 86%, 67%, 20%, 82%, 90%, 73%
B 8 | Did you receive report in ISBAR format? | 54%, 100%, 73%, 92%, 100%, 100%, 70%, 91%, 60%, 45%
B 9 | For patients who could have teach-back, did you do patient teach-back during the oncoming report? | 23%, 100%, 55%, 77%, 86%, 100%, 10%, 55%, 0%, 27%
B 10 | Was the patient's understanding of the plan similar to your plan of care? | 100%, 100%, 100%, 92%, 100%, 100%, 40%, 91%, 60%, 82%
B 11 | Was the goal for plan of care achieved from the previous shift? | 92%, 100%, 82%, 100%, 100%, 67%, 50%, 91%, 70%, 82%
B 14 | Do you plan on giving shift change report in ISBAR format? | 77%, 100%, 100%, 85%, 100%, 100%, 80%, 91%, 60%, 64%

Legend (row shading): Pink = shift preparation; Orange = goal board; Yellow = bedside round with patient teach-back use

Source: KP Hawaii NKE 2007
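A reliability map like the tables above can be generated from raw audit records. Below is a minimal sketch using pandas; the column names and sample rows are hypothetical, not the KP Hawaii data.

```python
# Sketch of a practice-reliability map: percent "Yes" per audit question per unit.
import pandas as pd

# Hypothetical raw audit data: one row per question asked of one nurse on one unit
audits = pd.DataFrame({
    "unit":     ["1 East", "1 East", "1 West", "1 West", "Peds", "Peds"],
    "question": ["B 6 face-to-face shift change?", "B 8 report in ISBAR format?",
                 "B 6 face-to-face shift change?", "B 8 report in ISBAR format?",
                 "B 6 face-to-face shift change?", "B 8 report in ISBAR format?"],
    "answer":   ["Yes", "No", "Yes", "Yes", "No", "No"],
})

reliability = (
    audits.assign(yes=audits["answer"].eq("Yes"))
          .pivot_table(index="question", columns="unit", values="yes", aggfunc="mean")
          .mul(100)
          .round(0)
)
print(reliability)   # rows = audit questions, columns = units, cells = % Yes
```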

Page 52: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

52

“Energy Map” Sacramento/Roseville

9-Apr-10

[Performance Improvement Portfolio Management Grid, Roseville Medical Center: roughly twenty initiatives (e.g., Resource Management, OR Throughput, Falls, NKE/Service/Rounding, Workplace Safety, Eliminate Infection/C. diff, Medication Errors, Diversity Project, ErgoNurse, Sepsis, Lean), each with a lead IA and a status of Planned, Active, or Sustain across the medical center's units and departments (ED, ICU, OR, L&D, PACU, Pharmacy, EVS, and others). The grid also tracks UBT launches, training (PRIM+, PIL 1 & 2, UBT consultants), Surgical Safety Summits, mentoring of part-time IAs, advising/teaching committees, the NCAL IA peer group, and national/quality conference activity.]

Source: Ryan Darke 2010
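An energy map is essentially a status matrix: initiatives on one axis, units on the other, each cell showing whether the work is planned, active, or being sustained. A minimal sketch follows; the projects, units, and statuses are a hypothetical subset, not the Roseville data.

```python
# Sketch of an "energy map": each initiative's status by unit, so leaders can see
# where improvement energy is concentrated. P = Planned, A = Active, S = Sustain.
energy_map = {
    "Falls":             {"2N": "S", "ICU": "S"},
    "Workplace Safety":  {"2N": "A", "ICU": "A", "OR": "A"},
    "Sepsis PI Project": {"ICU": "A"},
}
units = ["2N", "ICU", "OR"]

print("Project".ljust(20) + "".join(u.ljust(5) for u in units))
for project, status_by_unit in energy_map.items():
    row = "".join(status_by_unit.get(u, "-").ljust(5) for u in units)
    print(project.ljust(20) + row)   # "-" = no current work on that unit
```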

Page 53: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

53

Outcomes: Adverse Drug Event Rates

Iowa Health System Adverse Drug Events: % of Sampled Charts with Harm Level E-I ADEs

Targets: 2002 = 10%, 2003 = 4%

[Run chart: % of sampled charts with a harm level E-I ADE, monthly from Nov-01 through Jun-04 (y-axis 0-30%). Monthly rates fall from about 20% at the start of the series to mostly single digits (roughly 2-11%), against the 2003 target of 4%; a note marks a reduced sample size later in the series.]

Aim: 50% Reduction in ADEs System-wide in 2002

Source: IHI, Iowa Health System 2010

Page 54: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

54

Outcomes: Mortality Rates

Page 55: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

55

Exercise for your portfolio

• In planning spread, what variables do you need to monitor over time?

• How would you monitor and report the progress of your spread effort?

Page 56: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

Support a Learning Culture

Jim Bellows

Page 57: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

57

Source: Institute for Healthcare Improvement, 2006.

What is the biggest part of this model?

Page 58: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

58

Practices spread best through personal contact

Page 59: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

59

Who do you go to when you need information or support?

[Social network diagram: numbered nodes representing individual staff members, linked according to who they named.]

People’s answers define a social network map

Key nodes are not necessarily formal leaders
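A simple way to build and analyze such a map is to treat each answer as a directed edge and rank nodes by how often they are named. The sketch below uses the networkx package; the respondent IDs and answers are hypothetical.

```python
# Sketch: turn "Who do you go to?" survey answers into a network and rank likely key nodes.
import networkx as nx

answers = [                      # (respondent, person they named)
    ("038", "105"), ("044", "105"), ("068", "233"),
    ("090", "105"), ("105", "233"), ("233", "105"),
]

g = nx.DiGraph()
g.add_edges_from(answers)

betweenness = nx.betweenness_centrality(g)
key_nodes = sorted(g.nodes, key=lambda n: (g.in_degree(n), betweenness[n]), reverse=True)
print(key_nodes[:3])   # the people many others turn to, regardless of formal role
```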

Page 60: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

60

Social networks take work

• Communicate 6 times x 6 ways

• Foster relationships: get people together

• Send importers to meet with exporters

Page 61: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

61

Knowledge Management – Moving learnings through social networks

• Content: case studies (especially patient cases); stories of what seemed to work and what didn’t; evaluation results

• Structure and process: informal exchange; face-to-face visits and meetings; wikis, IdeaBook, SmartBook, …

Page 62: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

62

Rapid spread of complex change: A case study in inpatient palliative care

BMC Health Services Research 2009, 9:245

Della Penna R, Martel H, Neuwirth EB, Rice J, Filipski MI, Green J, Bellows J

Results: Compelling evidence of impacts on patient satisfaction and quality of care generated ‘pull’ among adopters, expressed as a remarkably high degree of conviction about the value of the model. Broad leadership agreement gave rise to sponsorship and support that permeated the organization. A robust social network promoted knowledge exchange and built on an existing network with a strong interest in palliative care. Resource constraints, pre-existing programs of a different model, and ambiguous accountability for implementation impeded spread.

Conclusions: A complex, hospital-based, interdisciplinary intervention in a large health care organization spread rapidly due to a synergy between organizational ‘push’ strategies and grassroots-level pull. The combination of push and pull may be especially important when the organizational context or the practice to be spread is complex.

Page 63: Getting to Scale: Spread IA Graduate Seminar, May 18, 2010 Lisa Schilling RN MPH VP, Healthcare Performance Improvement Jim Bellows, PhD Senior Director.

63

How can you reach your spread goal?

• Identifying social networks and communicating through them

• Establishing channels for knowledge management and creating relevant content