
Evaluation

This PPT and other resources from: http://homepage.mac.com/johnovr/FileSharing2.html

John Øvretveit, Director of Research, Professor of Health Innovation and Evaluation, Karolinska Institutet, Stockholm, Sweden

POINT 1: Evaluation means different things to different people


Evaluation definition (Øvretveit 1998, 2002)

• judging the value of something
• by gathering information about it
• in a systematic way
• and by making a comparison,
• for the purpose of making a better informed decision.


What I will cover

• Who is the evaluation for, and what are their questions?
• Three approaches
• Their answers to different questions
• Their ways of maximising validity
• Theory-driven case evaluation
• When and how to document and study context


Examples

• Standardisable treatment or change to an organisation
• New chronic care model
• Breakthrough collaborative
• Joint Commission accreditation


Who is the evaluation for, and what are their questions? User-focused evaluation

• Who is the main customer for the evaluation, and what are their questions?
• Design your evaluation to give the information they need to make the decisions they need to make

Vs a literature-based focus
• The evaluation is to fill gaps in scientific knowledge


Key questions for evaluations

• Does it work? (outcomes caused by the intervention)
• Would it work here locally?
• How did they implement this change? (description)
• Which context factors helped and hindered implementation? (attribution)
• In which range of settings and conditions did it work? (generalisation certainty)
• How do I adapt it, or the context, to implement it? (adaptation)


Three approaches answer different questions

• Controlled experimental
  • Does it work?
• Quasi-experimental
  • Does it work? (less certain, but easier and less costly)
• Theory-informed case study evaluation
  • Does it have effects which may lead to patient outcomes?
  • How does it work?


3) RCT (experimental)

[Diagram: randomised controlled trial]
• Intervention arm: number of patients assigned to the intervention
• Comparison arm: number of patients assigned to placebo
• Before measures and after measures: which and when?
• Length of time of the intervention
• How people were selected (before random allocation)
• What effect, compared to the control group? (see the sketch below)
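As a minimal sketch (not from the slides) of how this design answers "does it work?", the Python fragment below compares after-measures between the two arms; all data and variable names are hypothetical.

```python
# Minimal sketch of an RCT analysis: estimate the effect as the difference
# in mean after-measures between arms, then test it against chance.
# All data and names here are hypothetical.
from scipy import stats

# After-measures for each arm (e.g. symptom scores; lower is better)
intervention_after = [4.1, 3.8, 5.0, 4.4, 3.9, 4.7, 4.2, 3.5]
placebo_after = [5.6, 5.9, 5.2, 6.1, 5.4, 5.8, 6.0, 5.5]

# Effect estimate: difference in group means
effect = (sum(intervention_after) / len(intervention_after)
          - sum(placebo_after) / len(placebo_after))

# Two-sample t-test: how confident can we be that the difference is real?
t_stat, p_value = stats.ttest_ind(intervention_after, placebo_after)

print(f"Estimated effect (intervention - placebo): {effect:.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

Randomisation is what licenses the attribution here: because assignment is random, other explanations are, on average, balanced between the two arms.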


3) Experimental intervention: before-after single case

[Diagram: single-group before-after design]
• People or organisation before → intervention → people or organisation after
• Data: which and when? (before and after)
• Length of time of the intervention
• How people were selected
• What effect? (see the sketch below)
• Confounding variables and controls: what, apart from the intervention, could have produced the change in the measures?
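A minimal sketch (hypothetical data, not from the slides) of the before-after comparison: the same subjects are measured twice, so a paired test is used. The confounding question above still applies in full.

```python
# Minimal sketch of a before-after (single case) analysis with a paired
# t-test. Hypothetical data. Unlike the RCT, nothing here rules out
# confounders such as secular trends or other concurrent changes.
from scipy import stats

before = [7.2, 6.8, 7.5, 6.9, 7.1, 7.4]  # measures before the intervention
after = [6.1, 6.0, 6.6, 5.9, 6.3, 6.5]   # same subjects, after

mean_change = sum(a - b for a, b in zip(after, before)) / len(before)
t_stat, p_value = stats.ttest_rel(after, before)

print(f"Mean change: {mean_change:.2f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```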

Time series (multiple before/after)

Stepped wedge design
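Both designs strengthen the single before-after case by adding measurement points or a staggered rollout. As an illustrative sketch (not from the slides), a stepped wedge crosses clusters over from control to intervention at successive steps, so every cluster eventually receives it:

```python
# Illustrative sketch of a stepped wedge rollout schedule: each cluster
# switches from control (0) to intervention (1) one period later than the
# previous one. Cluster and period counts are hypothetical.
n_clusters, n_periods = 4, 5

schedule = [
    [1 if period >= cluster + 1 else 0 for period in range(n_periods)]
    for cluster in range(n_clusters)
]

for i, row in enumerate(schedule, start=1):
    print(f"Cluster {i}: {row}")
# Cluster 1: [0, 1, 1, 1, 1]
# Cluster 2: [0, 0, 1, 1, 1]
# Cluster 3: [0, 0, 0, 1, 1]
# Cluster 4: [0, 0, 0, 0, 1]
```

Each cluster acts as its own control, and the staggered start helps separate the intervention's effect from time trends.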


Case study uses a model of a chain of effects

Programme theory: ideas about the sequence of actions and situation factors leading to intermediate changes and the ultimate outcome

Cretin et al 2004 model

VHA advanced access – evaluation model by COLMR


Naturalistic methods PSI study (UK 100k collaborative)


Their ways of maximising validity: internal validity of the evaluation

• = how certain are we that the outcomes are due to the intervention and not something else? (attribution)
• Experimental: control for other explanations
  • Comparison with no-intervention patients or providers
  • Time trends
• Case study: causal chain, multiple data sources


External validity of the evaluation

• = with the intervention applied to other patients or providers, how certain are we that we would find the same outcomes? (generalisation)
• Experimental: repeat the evaluation with other targets, or use "representative" targets
• Case study: the causal chain diagram is a theory which lets decision makers think through whether it would work in their setting, from ONE CASE
  • Analytic generalisation (not statistical)


The importance of context


Can you grow pineapples in Sweden?

[Diagram: gardening analogy]
• Seed = the change idea: evidence 0-5?
• Gardener / planting & nurture = implementation actions: 0-5?
• Climate / soil = context (local and wider): 0-5?

Your change?

Distinguish

INTERVENTION (seed):
• Effective treatment
• Effective organisation, e.g. team organisation, care manager
• Or a change idea

IMPLEMENTATION STRATEGIES (planting):
• Education, guidelines, audit and feedback, academic detailing
• Breakthrough collaborative, implementation network

CONTEXT (soil and climate):
• Organisational structure, culture, systems, financial system?
• Organisational change support, culture, financial incentives

• Which strategies are effective for which intervention? Classification of strategies?
• Which context features help and hinder which strategies / support which interventions?

French 2009 review – factors affecting implementation success

Categories and measures:
• Climate, e.g. openness, respect, trust
• OL (organisational learning) culture
• Vision
• Leadership
• Knowledge need
• Acquisition of new knowledge
• Knowledge sharing
• Knowledge use


How do you take account of context?

• Experimental:
  • You don't. You get rid of its interference
  • You should still describe the intervention, implementation and settings
• Case study:
  • Before: define the context factors which might influence implementation (example: in a collaborative?)
  • Collect data on these and assess their influence
  • Build a model/theory with context, not just a causal chain


Summary

• Who is the evaluation for, and what are their questions?
• Three approaches
• Their answers to different questions
• Their ways of maximising validity
• Theory-driven case evaluation
• When and how to document and study context


Your reactions and questions

1. Any surprises…

2. Not certain about…

3. This could be useful…


DETAILS


Questions and criteria


Efficacy – Does it work (anywhere)? Better than…?
• Certainty about effects (internal validity). Key issue = attribution

Effectiveness – What are the effects in typical settings?
• Certainty about effects on intermediate and ultimate outcomes (external validity). Key issue = generalisation

Implementation – Sell, start, sustain, spread externally
• What were the actions taken to make the changes? Which conditions helped and hindered? Key issue = description and explanation


Pain medication
• Does it work? Standardise the treatment to each patient. Compare to no treatment/placebo. Exclude other factors. Measure outcomes – are changes to these measures associated with the presence/absence of the treatment?

Pain medication improvement
• Does it work? Are systems and supports changed? Do providers change their behaviour? Are patient outcomes better?
• Describe the intervention. Measure stages in the causal pathway.

Implementing a change
• What did they do? Was this strategy effective in getting the change?
• Details of the actions taken at different times (the method/strategy for the change)

Conditions which help and hinder
• Could we do this? Which influences help and hinder ordinary services in making the change? (patients similar, organisations different)


What I will cover

• How evaluation is similar to and different from research and monitoring
• Challenges all research faces, and how different evaluation designs address these
• Naturalistic, non-experimental evaluation
• Programme evaluation
• Case study evaluation
• Realist evaluation
• Example: evaluation of an HIV/AIDS programme in Zambia


EVALUATION

This PPT and other resources from: http://homepage.mac.com/johnovr/FileSharing2.html

John Øvretveit, Director of Research, Professor of Health Innovation and Evaluation, Karolinska Institutet, Stockholm, Sweden

All research has these five challenges

• What users want vs the evaluator's views about what is important
  • My focus is on user-driven but theory-informed research – the data driven by the users' decisions
• Data validity
  • Are the data we collect valid and reliable? Reducing data bias. Replicability.
• Cost of data gathering, and its value for the evaluation users
  • How much extra value for these extra data?


All research has these challenges

• Attribution
  • How do we know the outcomes were due to the intervention and not something else?
• Generalisation
  • For which other patients, organisations or settings do we have confidence that the same findings might be observed?


Three types of evaluation

• Experimental, controlled – outcome
  • Compare those getting the intervention with another group
• Experimental, no controls – outcome
  • Only look at those getting the intervention, before and after
• Naturalistic – describe and document different impacts


3) Experimental intervention: comparative case

[Diagram: two-arm comparison of interventions A and B]
• Intervention A: number of patients receiving intervention A
• Intervention B: number of patients receiving intervention B
• Before measures and after measures: which and when?
• Length of time of the interventions
• How people were selected for each intervention
• What effect, compared to the similar intervention?


3) RCT (experimental)

[Diagram: randomised controlled trial]
• Intervention arm: number of patients assigned to the intervention
• Comparison arm: number of patients assigned to placebo
• Before measures and after measures: which and when?
• Length of time of the intervention
• How people were selected (before random allocation)
• What effect, compared to the control group?


Next – experimental, no control group: just before-after outcome

• Single before/after intervention to patients
• Single before/after intervention to a provider


3) Experimental intervention: before-after single case

[Diagram: single-group before-after design]
• People or organisation before → intervention → people or organisation after
• Data: which and when? (before and after)
• Length of time of the intervention
• How people were selected
• What effect?
• Confounding variables and controls: what, apart from the intervention, could have produced the change in the measures?


3) Evaluation of an intervention to a service: impact on providers

[Diagram: service providers before → intervention to the service → service providers after]
• Number before, number after
• Measures: which and when? (before and after)
• Length of time of the intervention
• Confounding variables and controls: what, apart from the intervention, could have produced the change in the measures?


3) Evaluation of an intervention to a service: impact on patients

[Diagram: an intervention to a health organisation (e.g. training, or a new computer system) changes the organisation or personnel. The service's impact on patients is measured before the intervention (patient group 1, measure 1) and after (patient group 2, measure 2), with the changed service described in between.]


Next – non-experimental process or naturalistic designs

• Describe the intervention
  • E.g. a new service for people with chronic disease – multiple components
• How the service evolves and why
• Some effects (in the pathway towards outcomes)
  • E.g. changes in staff practice, work organisation and attitudes


Change chain or influence pathway: "programme theory" or "logic model"

1) Intervention – training on baby health care
   Short-term result: changes nurses' knowledge, skills, motivation
>>> 2) Nurses then train mothers
   Medium-term result: changes mothers' knowledge, skills, motivation
>>> 3) Mothers then behave differently
   Long-term result: baby health better

• What was the intervention? (three)
• Which intervention should you evaluate? How?
• What is the outcome of the intervention? (three)
• Point – find out whether each intervention was carried out fully, and with what results (see the sketch below)
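As an illustrative sketch (not from the slides), the logic model can be written down as an ordered chain of steps, each with its expected change and a candidate indicator; the step descriptions follow the baby health care example above, while the indicators are hypothetical.

```python
# Illustrative sketch: a logic model as an ordered chain of steps, each
# with the expected change and a candidate indicator. Indicators here
# are hypothetical.
from dataclasses import dataclass

@dataclass
class Step:
    action: str           # intervention or behaviour at this link
    expected_change: str  # short-, medium- or long-term result
    indicator: str        # data you would collect to check this link

logic_model = [
    Step("Training on baby health care",
         "Nurses' knowledge, skills, motivation improve",
         "Pre/post test scores for nurses"),
    Step("Nurses train mothers",
         "Mothers' knowledge, skills, motivation improve",
         "Survey of mothers after the sessions"),
    Step("Mothers behave differently",
         "Baby health better",
         "Infant health records over 12 months"),
]

# Evaluate each link in turn: if an early link fails, later outcomes
# cannot credibly be attributed to the intervention.
for i, step in enumerate(logic_model, start=1):
    print(f"{i}. {step.action} -> {step.expected_change} [{step.indicator}]")
```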

Model

Helping: personnel are given time for the education
Hindering: shortage of personnel

Action >>>> Change 1 >>>> Change 2
(e.g. education about how to assess dementia) → (personnel do better dementia assessment) → (dementia onset slower)

Indicator of each change?


Point

Many things we evaluate have change chains: one thing changes another, and then this change changes another thing. Not just a drug treatment (one intervention), but many interventions, sometimes in sequence.


Case studies: programme theory and concepts for describing change

Programme theory: ideas about the sequence of actions and situation factors leading to intermediate changes and to output and outcome changes (their theory, our theory)

Cretin et al 2004 model


MRC CSI safety research model (Brown 2008)


3) Deductive hypothesis-testing (non-intervention)

[Diagram, timeline 2003-2004: theory → specific hypotheses → researcher gathers data to test the hypotheses, often with a survey → raw data → analyse data → revise theory. The box is the subject area or sample of people, from study start to study finish.]


Four key questions to plan an evaluation

• What is the intervention?
• Who is the evaluation for?
• Which data do they need about the intervention, its effects and the situation?
• How do you know the effects are due to the intervention and not something else?


Does your research study an intervention?

• What is the intervention?
• What are the different implementation actions you (or others) are taking, at different times?


The intervention

• Intervention to patients (already studied and evaluated)
• Intervention to providers (professionals and organisations)
• What was done to make a change which would not otherwise have been done (or planned)?
• Define it by describing 2-8 component actions, and when each was started and finished (or is ongoing)


Next – clarify the "user" of the evaluation

• Who is the evaluation for, and to inform which decisions?
  = the evaluation criteria = what you measure/collect data about to inform the user
• The more users you try to satisfy, the less you will satisfy any
• 1-4 outcome measures maximum
  • This decides your outcome measures/data


"BIDE"

[Diagram: beneficiary (April 2006, data) >>>> beneficiary (September 2006, data). Difference? Comparison? Explanation? Intervention?]


Reduce other explanations (confounders)

• By design (comparison site/group)
• By listing the other explanations and discussing the evidence for and against their influence, or the plausibility of their influence (see the sketch below)
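As an illustrative sketch (not from the slides) of the second approach, the alternative explanations can be tabulated with the evidence for and against each, making the attribution argument explicit and checkable. All entries below are hypothetical.

```python
# Illustrative sketch: list alternative explanations with the evidence
# for and against each, then record a verdict. Entries are hypothetical.
alternative_explanations = [
    {"explanation": "Secular trend: errors were already falling",
     "evidence_for": "National error rates fell over the period",
     "evidence_against": "Comparison site showed no fall",
     "plausible": False},
    {"explanation": "Staffing change, not the intervention",
     "evidence_for": "Two new pharmacists hired mid-study",
     "evidence_against": "Improvement began before the hires",
     "plausible": False},
]

for alt in alternative_explanations:
    verdict = "cannot rule out" if alt["plausible"] else "discount"
    print(f"- {alt['explanation']}: {verdict}")
    print(f"    for: {alt['evidence_for']}")
    print(f"    against: {alt['evidence_against']}")
```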


Summary

• What is the intervention?
• Who is the evaluation for?
• Which data do they need about the intervention, its effects and the situation?
• How do you know the effects are due to the intervention and not something else?

Reference: Øvretveit, J (2002) Action Evaluation of Health Programmes and Change: A handbook for a user focused approach. Radcliffe Medical Press, Oxford. (Publication of the year award 2002, EHMA and Baxter Healthcare, and BMJ medical books award.)


Conclusions

Each person write down, and then share in the group:
1. This was new or surprising for me…
2. The most useful idea for my work and organisation was…
3. What I would like to find out more about…


DETAILS


Evaluating CSIs, not standardisable SPs

• Not beta-blocker effects on patients, but RRT or HIT effects on organisation and behaviour (which may then affect patients)
• Or 5 Million Lives Campaign effects on organisation and behaviour
• Different target (participants)
• Intermediate outcomes and causal chain (influences)
• Evolving intervention
• Separate "the change" from the implementation actions

The "before-after" change (after implementation) vs making the change (the implementation actions)

1. Clinical safety/quality practices – e.g. does medications reconciliation reduce medication errors?
2. Implementing practices – e.g. is training effective for establishing medications reconciliation?
3. Generic changes to the organisation, to solve multiple problems – e.g. EMR/CPOE/CDS, read-back, transfer, safety walk rounds
4. Implementing generic changes to the organisation – e.g. training and protocols for safety walk rounds
5. Impact of system-wide or policy changes – specific subject: does a financial incentive for low infection rates reduce rates? Generic: is accreditation effective for reducing avoidable patient harm?
6. Implementing system-wide or policy changes

Main research challenges

• Which data? – objectives and questions
• Programme theory and causal chain
• Describing the programme actions and changes at different levels – programme, service organisation, project team
• Not one action but a series over time – phases in an evolving programme
• Discovering intermediate changes and outcomes, and the degree of certainty
• Discovering the factors helping and hindering the actions taken at each level: national, service organisation, project team/unit. Different factors at the start, during, and to sustain.

Main research challenges

Because it is a complex, dynamic social programme, evolving over time in interaction with its context, with many levels of action and feedback loops with delayed reactions.

Choice – force-fit it into a linear cause-effect study design, or try to capture and describe the complexity but still provide useful lessons for the future?

Research issues

1) Cost-effectiveness is different if, 3 years later:
   1) the results are back to where they were before the project,
   2) people do not use what they learned.

2) Sustainability is a measure of how deeply the programme has penetrated the receiving organisations – the extent to which quality improvement has been embedded.

Later – some relevant previous research to decide which data to collect to assess likely sustainability.

So, these cannot be evaluated? Two alternative research designs

1) Process evaluation, parallel to a controlled trial
• A second, descriptive study of the intervention implementation, the intermediate organisation and behaviour changes, and the context
• Check the intervention protocol was followed, and what helped and hindered
• ("Light version": add SQUIRE reporting data to the report)

Strengths: helps the Iowa manager see what was implemented in the study context, and how. Helps develop an explanation and a theory of the causal chain.
Weaknesses: expensive; the RCT is time consuming.

Case study data gathering framework (see the sketch below)

• Context – international, national, local, organisation, unit, group
• The idea/concept/plan
• Implementation actions
• Intermediate impact: organisation/behaviour changes
• Patient/cost outcomes
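A minimal sketch (not from the slides) of this framework as a record type, so every case collects the same categories; the field names and example values are assumptions for illustration.

```python
# Illustrative sketch: the case study data gathering framework as a simple
# record. Field names and example values are hypothetical.
from dataclasses import dataclass, field

@dataclass
class CaseStudyRecord:
    context: dict = field(default_factory=dict)  # keyed by level, e.g. "national", "unit"
    idea: str = ""                                # the idea/concept/plan
    implementation_actions: list = field(default_factory=list)
    intermediate_impacts: list = field(default_factory=list)  # organisation/behaviour changes
    outcomes: list = field(default_factory=list)  # patient/cost outcomes

record = CaseStudyRecord(
    context={"national": "new safety policy", "unit": "high staff turnover"},
    idea="Medications reconciliation at discharge",
    implementation_actions=["training sessions", "new checklist"],
    intermediate_impacts=["pharmacists attend ward rounds"],
    outcomes=["fewer medication errors at 12 months"],
)
print(record)
```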


Case studies

Programme theory and concepts for describing change: ideas about the sequence of actions and situation factors leading to intermediate changes and outcome changes (their theory, our theory)

Cretin et al 2004 model


[Timeline diagram, 2005-2009 with predictions for 2010: planning → first implementation → further development actions to create detailed organisation and behaviour changes. Actions: planning and preparations. Result: e.g. a new organisation. Intermediate results: e.g. changes in procedures and personnel behaviour. Results: e.g. fewer errors, then better patient/cost outcomes. Context factors help and hinder implementation at different times, e.g. government policy helps planning.]

Current debate – the best methods to address this: complex interventions

• Multiple components: synergy?
• Sequential withdrawal
• Implementation adaptation/evolution
• Context interaction/dependency
• Multiple consequences and attribution
• External validity – usefulness (reference: Glasgow et al 2007, author checklist)
• Use by decision makers – supply and demand

Limitations of these observational designs

• Limited certainty about the results
• Observers' perceptions (interviews): increase certainty by combining interviews, documents and other data (triangulation)
• Use causality attribution principles:
  • Cause before effect
  • Causal chain or influence pathway model – intermediate changes in the pathway
  • Co-variation; necessary and sufficient conditions
  • Identify and assess alternative explanations
  • Use comparison organisations where possible


Limitations of these observational designs – generalisation

1) Are the findings unique to the intervention implemented in that organisation, at that time, in that context?
2) Or: do this and you get these results anywhere?
3) Mid-way = generalise by giving exceptions (qualified generalisation)
• Strategies to suggest the findings are expected within this range: similar findings up to X variation of the intervention, in Y organisations, in Z settings
• Findings not expected if the intervention varies more than X, and in Y situations…
• Use theory, or empirical data from forced testing, to disconfirm


Use strategies to address these research challenges

• Data: reduce bias and maximise validity
• Attribution: increase certainty
• Generalisation: identify the range of situations the findings are likely to apply to
• Who cares anyway? Plan for the users' questions, and communicate