Where next for improving and evaluating patient safety?
Dr. Dale Webb, Director of Evaluation & Strategy, The Health Foundation
Future challenges for patient safety
Loss of prominence in the NHS?
A ‘loss of momentum’ and ‘vacuum’ following the demise of the NPSA, Patient Safety First and changes at the NHS Institute
“There is a lot of fog out there … I don’t hear clear messages … there’s no sensible narrative for people in the service” (Medical champion for patient safety at national level)
“I’m concerned that patient safety will be lost off the agenda in the NHS reorganisation. If it doesn’t drop off consciously it will be compromised in the way that people cut costs” (Medical champion for safety)
“We are lucky in Wales: our annual operating framework keeps patient safety on the agenda” (Medical Director, Acute Trust)
The Scottish Patient Safety Programme had “helped on a number of levels – both at clinical and board level” (Medical Director, Acute Trust)
Other challenges for patient safety
Strengthen the evidence base and build the link between safety and productivity:
“What is the common currency for everyone at the moment? It’s finance. That safe care means cheap care. We need evidence to support the ‘invest to save’ argument” (Medical Director, Acute Trust)
“Let’s not over-exaggerate the case for quality improvement” (Medical champion for safety)
Strengthen organisational capacity for continuous quality improvement
Spreading and embedding safety improvements
Exploring the role of patients and the public in creating safer healthcare
Implications for the Health Foundation’s safety work
Sustaining a safety focus in hard times
The Safer Patients Network, launched in June 2009, aimed to be a self-sustaining, member-driven group of organisations. It is no longer realistic to continue with this ambition in the current financial climate.
We will commit to supporting a broader community of safety-minded organisations to maintain and develop their focus on safety during the challenging times ahead:
– Establish a managed virtual network
– Open to membership across the UK
– A go-to place for resources, access to expertise and connections to peers
– Provide the interface between regional and national initiatives, promoting shared learning
– Clearly signpost specific areas of interest (communities of practice)
– Develop a wide range of active support for leaders, managers and frontline staff
– Focus on spreading and developing innovative approaches
Sustaining a safety focus in hard times
Rolling programme to introduce new ideas and maintain momentum:
– Interest-group based webinars and podcasts
– ‘Expert in residence’ programme
– ‘Local team in spotlight’ programme
– Annual learning event
– Community of practice events
– Skills marketplace/‘time bank’
– Explore involvement of patients and the public
Safer Clinical Systems
A clinical system that delivers value to the patient, is demonstrably free from unacceptable levels of risk and has the resilience to withstand normal and unexpected variations and fluctuations
Our research shows poor reliability
• Failures in reliability pose a real risk to patient safety: 15% of outpatient appointments were affected by missing clinical information
• Important clinical systems and processes are unreliable: the four clinical systems measured had failure rates of 13%–19%
• There are wide variations in reliability between organisations
• Unreliability is the result of common factors: lack of feedback mechanisms and poor communication
• It is possible to create highly reliable systems
(The Health Foundation, May 2010)
Building Safer Clinical Systems
Defining and describing the system and its context
Assessing and diagnosing hazard and risk
Testing and implementing solutions
Key Features
Award holders are taken through a systematic approach which will involve:
• A tailored learning and development programme
• Expert help
• On-site support facilitated by a named person
• Peer review
• Opportunity to review progress at key intervals
• Central learning events
Phase 2
A systems approach allows teams to identify and address those parts of a patient pathway that expose patients to potential harm. It explores factors outside the clinical pathway that affect the care provided, including organisational context.
In Phase 2 we will be working within patient pathways, focusing on these supporting processes and systems:
Safe, reliable prescribing in patient pathways (e.g. prescribing by staff throughout the pathway, together with upstream processes such as information transfer and downstream administration)
Safe, reliable clinical handovers in patient pathways (e.g. transfer of clinical information, tasks, responsibility and authority)
Other areas of work
The role of corporate services in closing the gap and improving quality and safety
Scoping work in primary care
Role of patients
1) Do these challenges resonate with you?
2) What do you think about our network plans?
3) What’s missing?
Challenges in evaluating efforts to improve safety and quality
Tensions between evaluation & quality improvement
The ‘treatment’ is social change
The theoretical and empirical foundations of QI
Data for improvement v. judgement
Flexible v. null hypothesis
Real-time v. static data
The RCT faultline
A faultline that appears elsewhere:
Critique of positivism in the 1970s and 1980s
Health promotion’s response to evidence-based medicine in the early 1990s
New evaluation methods or new evaluative mindsets?
Evaluation is a fractured discipline
Schools include:
Experimental
Hermeneutic
Fourth generation
Realistic
Complexity theory-based
Utilisation-driven
Stakeholder-based
Values-based
… and a dozen others!
Pretenders to the throne
Realistic Evaluation
Theory of change
Complexity Theory
Normalisation Process Theory
Incommensurable paradigms?
Dominant tendency in the literature for plurality/non-integration
Small number argue for a more cumulative approach to scholarship
Boundary conditions of individual theories and methods
A metatheory of evaluation?
New methods are not what we need most
We do need more synoptic, integrated approaches
To do this, we need a different mindset – a different perspective
Data for judgement and improvement
Measurement for improvement:
• provides visually compelling real-time data and, critically, variation around the mean over time
• used well (and carefully annotated to note when changes to care take place), this can help to understand and explain the transformative characteristics of adaptive systems
Measurement for judgement:
• the counterfactual question ‘what would have happened anyway if the programme hadn’t existed?’ is important if we are to determine the ‘additive effect’ of a programme above and beyond other things going on in the system of care, but only when conditions permit
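The counterfactual ‘additive effect’ above can be illustrated with a toy difference-in-differences calculation. This is a minimal sketch with invented numbers, not a method the presentation prescribes:

```python
# Toy difference-in-differences: the programme's "additive effect" is the
# change in the intervention group minus the change that would have happened
# anyway (as measured in a comparison group). All numbers are illustrative.

def additive_effect(intervention_before, intervention_after,
                    comparison_before, comparison_after):
    """Difference-in-differences estimate of a programme's additive effect."""
    change_with_programme = intervention_after - intervention_before
    change_anyway = comparison_after - comparison_before  # secular trend
    return change_with_programme - change_anyway

# Hypothetical example: harm events per 1,000 bed days, before and after.
effect = additive_effect(12.0, 8.0, 11.5, 10.5)
print(effect)  # -3.0: three fewer events credited to the programme,
               # beyond the one-event background improvement
```

Note the dependence on the ‘only when conditions permit’ caveat: the comparison group must be a credible stand-in for what would have happened without the programme.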
Build intervention theory
• Funders, QI technical providers and evaluators working together
• Dosage, time frame, tempo, locus of activity
• Stretch goals v. evaluative goals
Benefits:
• Makes clear whether we are truly in innovation, demonstration or scale-up mode
• Ensures we have a realistic expectation at the outset of likely impact relative to the dose, and that measurements are taken at the specific places where we expect to see an effect
• Builds consensus on the levels of evidence and likely impact of the changes being proposed
Recover ontological depth
We need to understand how institutional and other contexts frame decisions and actions:
– Real-time knowledge capture about the implementation process
– Evaluation ‘getting under the skin’ of the intervention
Causal description: a description of the consequences attributable to deliberately varying a treatment.
Causal explanation: the mechanisms through which, and the conditions under which, that causal relationship holds (Shadish et al., 2002: 9)
Opportunities for collaboration?
‘Think pieces’ to identify points of connection
Develop new thinking about the design and evaluation of QI interventions
Journal supplement on the science of evaluating improvement
Let’s not throw the baby out …
1) Do you agree with me?!
2) How mature are the approaches used to evaluate safety interventions?
3) How should the evaluation community be positioning itself in relation to safety and quality?