Demystifying Evaluation
Transcript of Demystifying Evaluation
Our aim in this ‘rapid’ format
This webinar will give you a rapid outline of some of the core principles
of practical evaluation activities.
• Consider the importance of evaluation and implications of not
evaluating
• Understand the key concepts of evaluation
• Start to look at tools to help you
• Examine practical ways of measuring success
It’ll be quick but a useful start – continue the discussion on
Myhealthskills or at our Masterclass in November 2014.
‘A few comments on evaluation from registered participants’
• ‘Tips on how we can evaluate and measure impact of our project and ULR activity’
• ‘Practical ways of measuring success’
• ‘Value for money/ return on investment’
• ‘Practical ideas of how to evaluate current training within pharmacy
and also evaluation of projects upon completion’
• ‘Insight on how to evaluate the impact and effectiveness of delivery
models’
• ‘A better understanding of the tools which could be used to support
service redesign evaluation’
If you don’t evaluate…
The importance of evaluation is perhaps even clearer when we
consider what happens if we don’t do it:
You don’t REALLY know:
• if your project worked – you may be convinced, but that’s not the point!
• where it worked and where it didn’t – what’s the evidence?
• whether key stakeholders believe it worked
• where improvement occurred – was it due to the project or to other factors?
• whether the project should be replicated elsewhere, in whole or in part
• how much time and money to invest in the next or similar project
You don’t collect the evidence to support future bids & budgets
Why does evaluation confuse people or make them fearful?
• Even if an evaluation hasn’t gone well – for instance, you haven’t
gathered the evidence of impact you wanted – it’s likely you’ll have
learned something along the way, if only about possible indicators.
• If it’s not something you already do, start today and improve from there.
Key practice points
• Engage in evaluation at the start of the project
• Be proportionate
• Clarify the project rationale – not just the skills case:
evaluation assesses the extent to which it’s met
• Do not overlook stakeholders – who benefits
• Ensure measures are clear, concise and material!
• Achieve a balance: output, outcome, impact measures
• Design tools that produce precise, valid & reliable data
Start at the beginning!
• Always try to start evaluation at the outset of the project. Advantages include:
- You are clear on the case for change and can evaluate against it;
you are also more likely to be evaluating the original aims rather
than newly found aims (policy drift).
- You know what key measures are (more likely to manage &
achieve)
- Some data can be captured as project proceeds (efficiency)
- You can set expectations for the work
Be proportionate
• People get very excited about their projects and often believe
they will be able to change everything and measure that change
• Within the project bid, budget 2-5% of the overall budget for
evaluation – for example, a £100,000 project might set aside £2,000-£5,000
Return and review the Project Rationale – satisfy yourself as the evaluator
[Cycle diagram – start here:]
What’s the problem? → What causes the problem? → Develop options – which is best? → How do I best do this? → Objectives & measures → Monitor inputs & activities → Assess outputs & outcomes → Assess impact → (and back to the start)
The last afternoon test – think about the
stakeholders whose lives and work you
might wish to change
• On the last afternoon, you wrapped up your project! As you walk
away from it, what will success look like?
• Tangibly, on the ground, and to whose benefit – think about the
stakeholders you have and who stands to benefit most
• Then decide how to measure that, working backwards from the
intended effect
Ensure measures are clear, concise and material!
• Keep it material to what the project is trying to improve
• When identifying the few key measures of success:
Don’t get hung up on the activity
Measures of success
Inputs – Resources (usually time and money) used to produce the intended benefits
Outputs – What the inputs buy; in role redesign, this can be a new role description, a learning & knowledge profile for the role, and pilot staff trained & in role
Outcomes – In role redesign, usually the closing of a skills gap and the application of new skills
Impact – The service improvement we are trying to bring about: problem solved?
Measures of success – some skills examples
Inputs – Budgets; funding; contribution in kind; time
Outputs – Usually about delivering the volumes of agreed/contracted activity; they usually do not describe the results from that activity. Examples: numbers of employers assisted; number of TNAs (or ILPs) undertaken; number of learners completing
Outcomes – Results achieved by the project, usually short to medium term. Examples: skill gaps closed in participating employers; employee performance enhanced; employer attitudes (further investment in skills?)
Impact – The most far-reaching effect, tackling the fundamental problem(s) identified in the business case. Examples: the service improvement targeted – saving professional time; reducing agency costs; enhanced patient satisfaction; staff job satisfaction
The balanced scorecard
• The balanced scorecard is a
prompt to consider different key
perspectives
• Service delivery and clinical
outcomes tend to be ‘fact’ based
• Patient and staff satisfaction tend to
be ‘feelings’ based
• Is there an over-riding impact measure… or a range of outcome
measures that are of broadly equal ‘value’ across these four categories?
[Scorecard quadrants: Service delivery | Patient experience | Clinical outcomes | Benefits to staff]
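As an illustration only, a project’s scorecard could be held as a simple structure with one entry per quadrant. The measures below are hypothetical examples loosely drawn from the skills examples earlier, not prescribed metrics; a minimal Python sketch:

# Hypothetical balanced scorecard for a role-redesign project.
# Every measure named here is an illustrative assumption.
scorecard = {
    "Service delivery": ["professional hours released per month"],
    "Patient experience": ["patient satisfaction survey score"],
    "Clinical outcomes": ["error/incident rate in the redesigned pathway"],
    "Benefits to staff": ["staff job satisfaction score"],
}

# A balanced read-out gives each quadrant broadly equal weight,
# rather than letting one 'fact-based' measure dominate.
for quadrant, measures in scorecard.items():
    print(f"{quadrant}: {', '.join(measures)}")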
Selecting your impact assessment method
3 approaches are most common:
• single impact measure (quantified)
• multiple impact measures (quantified)
• balanced scorecard measures (quantitative & qualitative)
Choosing the best method is a judgment, based on the project’s
rationale: its stated aims & objectives.
Other themes to be aware of – attribution
and benefit lag
• Partnership/collaboration common in the sector
• Multiple factors can affect service improvement
Attribution… how do I understand the effects of MY project?
The project logic chain helps: if the activities are of good quality,
the outputs are delivered and the outcomes are as expected, then
the impact should follow
Given benefit… considering the ‘reference case’ should factor
out most ‘non-project’ effects
Benefit lag… impact can lag project activity significantly
Design evaluation with timescales in mind
What is the earliest point at which I can sensibly extrapolate the end benefit?
Other themes to be aware of - Given benefit &
unintended consequences
What level of benefits would happen anyway, without my project?
(Given benefit)
Will my project reduce existing activity within the target group or area?
(Unintended, usually negative, consequences)
Consider example 1: What if…
20% of those professional hours would have been saved anyway by a
prior initiative?
And what if…
in order to cover all professional & support roles we then have to spend 10%
of the saving on agency staff at peak times?
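To make the arithmetic concrete, here is a minimal Python sketch. The 20% given-benefit and 10% agency-offset figures come from the example above; the headline saving of 1,000 professional hours valued at £30 per hour is a hypothetical assumption, and the sketch reads ‘10% of the saving’ as 10% of the gross saving – check which reading your own business case intends.

HOURS_SAVED = 1_000     # headline professional hours saved (hypothetical)
RATE_PER_HOUR = 30.0    # value of a professional hour in GBP (hypothetical)
GIVEN_BENEFIT = 0.20    # 20% would have been saved anyway (from the example)
AGENCY_OFFSET = 0.10    # 10% of the saving spent on agency cover (from the example)

gross_saving = HOURS_SAVED * RATE_PER_HOUR         # £30,000 headline benefit
attributable = gross_saving * (1 - GIVEN_BENEFIT)  # £24,000 after given benefit
agency_cost = gross_saving * AGENCY_OFFSET         # £3,000 unintended consequence
net_benefit = attributable - agency_cost           # £21,000 – 70% of the headline

print(f"Gross saving:             £{gross_saving:,.0f}")
print(f"Net attributable benefit: £{net_benefit:,.0f}")

On these assumptions the project can defensibly claim only 70% of its headline figure – exactly the kind of adjustment an evaluator should surface before the benefit is reported.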
Evaluation tools – some hints & tips
Interviews
• Compared to mainstream research, evaluation interviews:
- tend to demonstrate a wider divergence of views
- which must be reflected in conclusions
- when assessing a range of opinions, ask (nicely!) ‘where’s the evidence?’
Process mapping
• Can be a very useful tool in process evaluation
• HOW was the project designed and delivered?
• Often explains WHY the project achieved the results it did
• Expertise is important, though – e.g. process critique/process design skills
Evaluation tools – more hints & tips
Data analysis
• Key to achieve a quantitative/qualitative balance of findings
• Without context, data can be difficult to analyse:
- what was the baseline or baselines?
- often need to get in early to identify baseline data
- otherwise project’s added value is difficult to evidence
- projects can over-emphasise outputs (the contracted work)
- understand the point at which evaluation emphasis shifts from
activity & output assessment to outcome & impact assessment
Benchmarking
- is often under-used; a comparative picture is often informative
- particularly under-used in setting targets (e.g. in proposals!) & in analysis
Pitfalls to avoid
• Objectives become creating or refining a role: roles & skills are a means,
NOT an end!
• Evaluation not understood - happy sheets; input focused
• ‘Evaluation comes at the end’ mindset
- don’t measure as we go
- opportunities/problems missed
- no baseline set up, so the added value can’t be measured
• Business case is flawed: is the Problem/Causes/Best option logic clear?
Is there a plausible case for need & demand?
• Project (only) focus – missing the strategic/leverage effects
Summary of good practice points
• Engage in evaluation at the start of the project
• Be proportionate
• Clarify the project rationale – not just the skills case:
evaluation assesses the extent to which it’s met
• Do not overlook stakeholders – who benefits
• Ensure measures are clear, concise and material!
• Achieve a balance: output, outcome, impact measures
• Design tools that produce precise, valid & reliable data
• http://www.hsj.co.uk/home/commissioning/where-is-the-evidence-for-promoting-integrated-care/5067408.article