PwC Review


Transcript of PwC Review

Page 1: PwC Review

Independent review of the Health Service Performance Report and the Performance Management Report Final Report Version 3

www.pwc.com.au

4 May 2015

Page 2: PwC Review

“The essence of performance management is to ensure everyone is doing what the organisation needs them to do.”

Source: Six insights into Performance Management best practice

Page 3: PwC Review

PwC

Structure of the report

Page 3

April 2015 Independent review of the HSPR and PMR

The diagram below explains the content of each section of this report.

Executive summary (page 6)
This section provides a summary of the review, identifying the following:
• Overall approach taken
• What works well
• Assessment criteria rating
• Summary of the key issues
• Summary of the recommendations for improvement
• Roadmap for improvements

Section 1 Approach to review (page 17)
The approach taken to conduct the independent review.

Section 2 Performance management overview (page 23)
The purpose and benefits of performance management, including the overarching principles of the PMF.

Section 3 HSPR & PMR overview (page 26)
An overview of the existing HSPR and PMR structure and content, including what works well.

Detailed review findings
Whilst the review acknowledges what is working well, it focuses on identifying the key issues that could be improved.

Section 4 Performance indicators (page 33)
The suitability of the suite of performance indicators, the calculation methodology and the approach to target setting.

Section 5 Performance scoring (page 64)
The statistical methodology used to score and rate Health Services, the degree to which the calculation is clear and transparent, and the appropriateness of using a performance score to assess Health Service performance.

Section 6 Reporting structure and content (page 88)
The structure and content of the HSPR and PMR, including their logical flow and useability, and the benefits and relevance of producing both in the current reporting structure.

Section 7 Performance monitoring (page 123)
The robustness and suitability of the current Performance Monitoring and Evaluation (PME) processes.

Appendices (page 146)
The appendices provide further detail to support the review.

Page 4: PwC Review

PwC

Contents

April 2015 Independent review of HSPR and PMR

Page 4

Page 5: PwC Review

PwC

Section and content | Page
Executive summary | 6
Section 1 Approach to review | 17
Section 2 Performance management overview | 23
Section 3 HSPR and PMR overview | 26
Independent review findings:
Section 4 Performance indicators | 33
Section 5 Performance scoring | 64
Section 6 Reporting structure and content | 88
Section 7 Performance monitoring and evaluation | 123
Appendices:
Appendix A: WA Performance management overview | 147
Appendix B: Desktop review – PI additional analysis | 150
Appendix C: Stakeholder and interview attendee list | 177
Appendix D: Stakeholder workshop scoring | 180
Appendix E: Stakeholder workshop feedback | 182
Appendix F: Assessment criteria | 194
Appendix G: Supporting leading practice research | 208
Appendix H: Suggested HSPR / PMR combined report structure overview | 215
Appendix I: Glossary | 224
Appendix J: Document list and source data | 226
Appendix K: Commentary on recent DoH Safety and Quality Indicator review | 230

Contents

Page 5

April 2015 Independent review of the HSPR and PMR

Page 6: PwC Review

PwC

Executive summary

April 2015 Independent review of HSPR and PMR

Page 6

Page 7: PwC Review

PwC

Context and the review scope WA Health engaged PwC to undertake an independent review of the Health Service Performance Report and the Performance Management Report.

Page 7

April 2015

Context

• WA Health introduced a Performance Management Framework (PMF) in 2010/11 to support the implementation of ABF/M. This approach was structured around a set of KPIs for each of the Health Services, which were reported in a Performance Management Report (PMR).

• The PMF has developed over time, with recent changes including:

− a summary ‘Performance score’ for Health Services and facilities; and

− the introduction of a shorter Health Service Performance Report (HSPR).

• There is abundant evidence (national and international) which demonstrates how reporting and monitoring is essential to effectively managing and improving performance within a healthcare system.

• Reporting and monitoring is also integral in supporting relevant, accurate and positive interaction between the DoH, as System Manager, and Health Services, as the main service providers. This is essential as the Reform Program initiated by WA Health in late 2014 will lead to fundamental changes in this relationship.

• WA Health recognised this challenge, and as such one of the key projects within the Reform Program will be to improve system performance management. Critical to this is ensuring that the approach to Health Service performance reporting and monitoring is fit for purpose.

Scope of review

PwC were engaged to undertake an independent review of the HSPR and PMR, and the overall Health Services Performance Monitoring and Evaluation (PME) processes.

The overall objective of the review was to assess the extent to which the HSPR / PMR Reporting and the PME processes are fit-for-purpose, both now, and in the future. As such, the review focused on identifying key issues that could be enhanced to better drive performance improvement in WA Health. The review’s scope covered the following four areas:

1. Performance indicators

The suitability of the suite of performance indicators, the calculation methodology and the approach to target setting. This assessment was carried out for both the PMR and the HSPR.

2. Performance scoring

The statistical methodology used to score and rate the facilities / Health Services, the degree to which the calculation is clear and transparent, and the appropriateness of using a performance score to assess Health Service performance.

3. HSPR and PMR reporting structure and content

The content and structure of the reports, including their logical flow and useability, and the benefits and relevance of producing both in the current reporting structure.

4. Performance monitoring & evaluation

The robustness and suitability of the current PME processes.

Independent review of the HSPR and PMR

Page 8: PwC Review

PwC

Assessment methods

Review areas (each with own set of assessment criteria)

Methodology of review The review covered four areas, and three assessment methods, to develop a set of key issues and recommended improvements.

Page 8

April 2015

1. Desktop review

HSPR & PMR

• Assessed the suite of performance indicators

• Assessed the performance indicator methodology

• Assessed the structure and content of the HSPR and PMR

3. Stakeholder workshops

Issues & Recommendations

2. Individual interviews

Performance Monitoring & Evaluation

• Assessed the performance score methodology

• Assessed the relationship between HSPR and PMR

• Assessed the performance monitoring process

DoH

Identifying key issues with the current HSPR and PMR, and the monitoring process

Combined DoH & Health Services

Consolidating the issues identified at the Health Services and DoH workshops, and starting to discuss potential improvements

Health Services

Identifying key issues with the current HSPR and PMR, and the monitoring process

Covered:

• The format of the HSPR and PMR, and the performance indicators included in each

• The Health Service performance monitoring and evaluation methodology (including the performance scoring methodology)

Reporting structure and content Performance Monitoring

and Evaluation (PME) Performance indicators (PIs) Performance scoring

Independent review of the HSPR and PMR

Page 9: PwC Review

PwC

HSPR and PMR - what works well Whilst the report focuses on identifying the key issues, the review has highlighted a number of aspects that work well in relation to the PMR, HSPR and PME (process).

Page 9

April 2015

The review identified a number of good practices which are summarised below, and were generally supported by stakeholder feedback:

1. The number of PIs measured within the PMR is reasonably comprehensive (although there are some significant gaps).

2. There are detailed descriptions of the rationale, data sources, reporting frequency and calculation methodology of the PIs – both for the PMR and the HSPR.

3. The suite of PIs within the PMR are largely aligned to national standards and targets.

4. The PMR allows users the ability to drill down further into each PI / rating, acting to inform the stakeholders further as to the performance of each facility.

5. Facility level performance pages in the HSPR and PMR allow peer-to-peer comparisons which stakeholders find useful.

6. The performance scoring methodology is simple, with the calculation clearly explained in the ‘Performance scoring methodology’ document, (albeit there is some lack of understanding in Health Services about how this has been calculated).

7. Regular meetings are held at a senior level (DG and Health Services Chief Executive) where performance is actively discussed. Actions arise from these meetings which are followed up.

8. The online portal for the PMR provides an interactive way to view the report and drill-down into specific areas and performance measures.

9. The detail included within the PMR is comprehensive, with a full profile given to each facility and not just Health Services.

10. The trend analysis depicted within the PMR is detailed and supported by supplementary information which helps explain the trends.

11. The HSPR dashboard in particular is visually impactful and draws attention to poor / high performance (compared to target and/or other Health Services).

12. The restricted number of PIs in the HSPR does allow a focused discussion on performance to be conducted.

Independent review of the HSPR and PMR

Page 10: PwC Review

PwC

1. Confirm strategic objectives of the Health Services, and the role of performance reporting and monitoring in relation to these objectives. See pages 56 – 57

2. Refine the suite of PIs (in the HSPR in particular) to sufficiently cover all key areas of performance, including (See pages 58 – 59):

a) Financial

b) Workforce

c) Patient experience

d) Safety and quality

e) Lead indicators.

3. Develop a clear process by which HSPR PI suites are regularly reviewed. See page 60

4. Clarify and communicate how each target and threshold (specifically in the HSPR) has been set. See page 61

5. Redesign data sourcing processes to ensure that all key HSPR indicators are reported with sufficient regularity and minimal lag (minimum of quarterly if possible). See page 62

6. Provide source data together with HSPR PI results. See page 62

1. There is limited alignment in both the HSPR and PMR reports between the PIs and WA Health strategic objectives and other key reporting requirements (e.g. the Annual Report). See pages 40 – 41

2. There is a lack of clarity on the specific rationale for setting some of the targets and thresholds within the HSPR. See pages 42 – 44

3. The suite of PIs in the HSPR (and to a lesser extent the PMR) lack sufficient measures to comprehensively assess (some) key areas of performance. See pages 45 – 47

4. The PMR is not intended to measure Health Services’ performance exclusively (including for example metrics for HCN and HIN), and therefore contains metrics which are not relevant to, or completely controllable by, Health Services. See page 48

5. A significant number of PIs within the HSPR and PMR are not timely enough to effectively assess performance. See pages 49 – 50

6. There is significant difficulty in drilling down into the source data to investigate further into the performance results (particularly for the HSPR). See page 51


PIs - summary of issues and recommendations The review identified a number of issues in relation to the PIs, as summarised below, along with suggested recommendations for improvement.

Page 10

April 2015

For further information

refer to section 4.3

For further information

refer to section 4.5

Performance indicators

Independent review of the HSPR and PMR

Page 11: PwC Review

PwC

1. Remove the performance score as an overall measure of Health Service performance. See page 81

2. Agree an alternative approach to assessing Health Service performance, being either (See pages 82 – 84):

− Option A: an overall domain performance score methodology with interventions triggered by domain performance (See page 83), OR

− Option B: no performance score, with interventions triggered by specific indicator performance (See page 84). This review recommends Option B.

3. Reduce rating options to 3 (highly performing, performing, under performing) using traffic light ‘heat’ maps (red, yellow, green), standardised throughout the HSPR and PMR. Clear definitions to be provided on what each rating means. See page 85

4. Remove the Health Service ranking pages from the HSPR. See page 86

5. Communicate methodology as to how performance rating and scoring is calculated. See page 87

Scoring - summary of issues and recommendations The review identified a number of issues in relation to the performance scoring, along with suggested recommendations for improvement.


Page 11

April 2015

Performance scoring key issues Recommendations

1. Utilising an overall performance score for a Health Service does not sufficiently distinguish between areas of good and poor performance. See page 71

2. There is limited understanding of how the overall performance score is calculated. See page 72

3. There is a high number of performance rating levels (4) for the HSPR and PMR. See page 73

4. There is a significant focus on comparing Health Services in the HSPR in particular which, given their differences, is not always meaningful. See pages 74 – 75

5. There is currently no weighting undertaken of PIs in determining the score, meaning that the relative importance of each domain is determined only by the numbers of indicators within that domain. See page 76
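To illustrate the effect with hypothetical numbers: if one domain contributes 8 indicators and another only 2, all with a maximum rating value of 4, the unweighted calculation is made out of 40 available points, of which the larger domain supplies 32 (80%) and the smaller domain only 8 (20%). The larger domain therefore drives the overall score simply because it contains more indicators, irrespective of its intended relative importance.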

Performance scoring key issues

For further information

refer to section 5.3

For further information

refer to section 5.5

Performance scoring

Independent review of the HSPR and PMR

Page 12: PwC Review

PwC

Reporting - summary of issues and recommendations The review identified a number of issues in relation to the structure and content of the reports, along with suggested recommendations for improvement.

Page 12

April 2015


1. Merge the HSPR and PMR into one report (Combined report), with the HSPR acting as an overview (Dashboard section) and the PMR as the detailed backup (Detailed section). See page 113

2. Determine the structure and reporting frequency, including PIs to be included. See pages 114 – 116

− Option a: Full monthly ‘Combined Report’

− Option b: Dashboard section published monthly, Detailed section published quarterly

− Option c: As per Option b, however certain PIs are reported more frequently if underperforming. This review recommends Option c.

3. Improve signposting and visual layout of the HSPR scorecard (targets / thresholds / variance, clarity on time periods, trends, etc.). See page 117

4. Combine the commentary with the main HSPR and PMR reports. See page 118

5. The Health Services write the HSPR commentary, to explain the drivers for performance and potential performance improvement strategies. See page 118

6. Include data drill down facilities. See pages 119 – 120

7. Include section on risk and mitigation strategies in both reports. See page 121

Recommendations

1. The HSPR does not give a clear indication of an organisation’s performance, because of (See pages 96 – 100):

• Confusing composite scoring of PIs

• Lack of trend analysis

• Lack of target, actual, variance

• Some confusing labelling and signposting

• Factual (rather than explanatory) commentary

• Commentary is produced separately.

2. The PMR does not give a clear indication of an organisation’s performance, because of (See pages 101 – 106):

• Scorecard structure complexity (high volume of information on each page)

• Some confusing labelling and signposting.

3. There is a lack of alignment between the two reports. See page 107

4. There is no risk section in either report. See page 108

Reporting structure and content key issues

For further information

refer to section 6.5

For further information

refer to section 6.3

Reporting structure and content

Independent review of the HSPR and PMR

Page 13: PwC Review

PwC

PME - summary of issues and recommendations The review identified a number of issues in relation to the PME, as summarised below, along with suggested recommendations for improvement.

Page 13

April 2015

1. Inconsistent and fragmented structure of PME meetings

2. No defined accountability

3. Distribution of the HSPR report is unclear and undocumented

4. No mechanism for highlighting consistent underperforming or sustained performance improvements

Performance monitoring and evaluation key issues

1. Implement a structured and consistent approach to discuss / address underperforming areas within PME meetings. See page 139

2. Clarify and document roles and responsibilities in the PME process. See pages 140 – 141

3. Clarify (and expand) stakeholder lists and distribution channels for HSPR and PMR. See page 142

4. Develop an intervention system tied to domain composite score or specific indicator performance (depending on whether scoring option A or B chosen). See pages 143 – 144
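As an illustration of how recommendation 4 could operate under the Option B approach (interventions triggered by specific indicator performance), the sketch below flags a PI for escalation after a sustained run of underperforming ratings. This is a minimal sketch only: the rating labels, run lengths and escalation levels are hypothetical assumptions, not part of the review’s recommendation.

```python
# Illustrative sketch of an indicator-level intervention trigger (Option B style).
# Rating labels, run-length thresholds and escalation levels are hypothetical.

UNDERPERFORMING = {"Under performing", "Not performing"}

def consecutive_underperformance(ratings):
    """Count how many of the most recent reporting periods were rated as underperforming."""
    count = 0
    for rating in reversed(ratings):  # newest rating is last in the list
        if rating in UNDERPERFORMING:
            count += 1
        else:
            break
    return count

def escalation_level(ratings, warn_after=2, intervene_after=3):
    """Map a sustained run of underperformance on one PI to a hypothetical escalation level."""
    run = consecutive_underperformance(ratings)
    if run >= intervene_after:
        return "formal intervention"
    if run >= warn_after:
        return "watch / remediation plan"
    return "routine monitoring"

# Hypothetical monthly ratings for one Health Service, keyed by PI
history = {
    "PI A": ["Performing", "Under performing", "Under performing", "Not performing"],
    "PI B": ["Performing", "Performing", "Highly performing"],
}
for pi, ratings in history.items():
    print(pi, "->", escalation_level(ratings))  # PI A -> formal intervention, PI B -> routine monitoring
```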

Recommendations

1. Lack of clarity and documented structure of PME meetings. See page 130

2. Lack of clarity of roles and responsibilities between Health Services and DoH staff in the PME process. See page 131

3. The distribution list and process of the HSPR report is unclear and undocumented. See page 132

4. No documented mechanism for highlighting and addressing key areas of underperformance. See pages 133 – 134

PME key issues

For further information

refer to section 7.3

For further information

refer to section 7.5

Performance Monitoring and Evaluation (PME)

Independent review of the HSPR and PMR

Page 14: PwC Review

PwC

Roadmap for recommended improvements

Page 14

April 2015

Suggested implementation timeframes for each recommendation are shown below – meeting these timeframes will depend on the availability of resources.
Short term (ST): 0 – 4 months; Medium term (MT): 4 – 12 months; Long term (LT): 1 – 2 years

Section: Performance indicators

# | Recommendation | Urgency | Complexity | Timeframe
1 | Confirm strategic objectives of the Health Services, and the role of performance reporting and monitoring in relation to these objectives | M | H | MT
2 | Refine suite of PIs (in the HSPR in particular) to sufficiently cover all key areas of performance | H | H | ST
3 | Develop a clear process by which HSPR PI suites are regularly reviewed | M | M | MT
4 | Clarify and communicate how each target and threshold (specifically in the HSPR) has been set | H | M | ST
5 | Redesign data sourcing processes to ensure that all key HSPR indicators are reported with sufficient regularity and minimal lag (minimum of quarterly for both) | M/H (depends on PI) | H | MT/LT (address higher priority PIs first)
6 | Provide source data together with HSPR PI results | H | M | ST

Section: Performance scoring

# | Recommendation | Urgency | Complexity | Timeframe
1 | Remove the performance score as an overall measure of Health Service performance | H | L | ST
2 | Agree an alternative approach to assessing Health Service performance | H | M | ST
3 | Reduce rating options to 3 (highly performing, performing, under performing) using traffic light ‘heat’ maps (red, yellow, green), standardised throughout the HSPR and PMR, with clear definitions of what each rating means | M | L | ST
4 | Remove the Health Service ranking pages from the HSPR | M | L | ST
5 | Communicate methodology as to how performance rating and scoring is calculated | H | M | MT (after R2)

Independent review of the HSPR and PMR

Page 15: PwC Review

PwC

Roadmap for recommended improvements (con’t)

Page 15

April 2015

Suggested implementation timeframes for each recommendation are shown below – meeting these timeframes will depend on the availability of resources.
Short term (ST): 0 – 4 months; Medium term (MT): 4 – 12 months; Long term (LT): 1 – 2 years

Section: Report structure and content

# | Recommendation | Urgency | Complexity | Timeframe
1 | Merge the HSPR and PMR into one report (Combined report), with the HSPR acting as an overview (Dashboard section) and the PMR as the detailed backup (Detailed section) | H | H | ST (dash) / MT (detail)
2 | Determine the frequency and detail of reporting of different levels of indicators | H | M | ST
3 | Improve signposting and visual layout of the HSPR scorecard | H | H | ST
4 | Combine the commentary with the main HSPR and PMR reports | H | M | ST
5 | Health Services write the HSPR commentary, to explain the drivers for performance and potential performance improvement strategies | H | L | MT
6 | Include data drill down facilities (particularly from HSPR to PMR) | M | M | MT
7 | Include section on risk and mitigation strategies in both reports | M | H | MT

Section: Performance monitoring and evaluation

# | Recommendation | Urgency | Complexity | Timeframe
1 | Implement a structured and consistent approach to discuss / address underperforming areas within PME meetings | H | M | ST
2 | Clarify and document roles and responsibilities in the PME process | H | M | ST
3 | Clarify (and expand) stakeholder lists and distribution channels for HSPR and PMR | M | L | ST
4 | Develop an intervention system tied to domain composite score or specific indicator performance (depending on whether scoring option A or B chosen) | H | M | MT

Independent review of the HSPR and PMR

Page 16: PwC Review

“The challenge for healthcare providers in Australia today is balancing clinical performance with excellent customer service and regulatory compliance in an ABF/M environment.”

Source: Adapted from the WHO Performance Measurement for Health

System Improvement Report: Experiences, Challenges and Prospects

Page 17: PwC Review

PwC

Approach to review

April 2015 Independent review of HSPR and PMR

Page 17

Page 18: PwC Review

PwC

Focus area 2: Performance scoring

Focus area 3: Reporting structure and content

Focus area 4: PME

Focus area 1: PIs

Approach to review A four-stage approach was undertaken for each of the four focus areas to identify key issues and develop robust recommendations for improvement.

Deliverables

Detailed report which outlines, for all focus areas:

― Approach taken

― Key issues / opportunities for improvement

― What works well

― Stakeholder feedback

― Recommended improvements

― Implementation plan / roadmap

4. Recommend

Collaboratively develop robust improvement recommendations

a) Develop recommendations for change with key stakeholders

b) Develop high level implementation plan

c) Draft report detailing review findings and recommended changes

d) Review and finalise report

3. Root cause analysis

Investigate the drivers behind performance issues

a) Consolidate and prioritise issues identified from the desktop review, interviews and workshops

b) Develop root cause hypotheses on the issues identified

c) Validate hypotheses with key stakeholders

(see page 14 for further information)

2. Assess

Evaluate current performance and identify issues

Conduct three complementary assessments to identify potential issues:

a) Independent Desktop Review

b) Stakeholder feedback workshops

c) One-on-one senior stakeholder interviews

1. Mobilise

To ensure the project is effectively established

a) Sign off detailed scope

b) Validate assessment criteria

c) Agree key stakeholders

d) Issue initial project communications

e) Issue information request, schedule key meetings and workshops

3 weeks 2 weeks

Page 18

April 2015 Independent review of the HSPR and PMR

Page 19: PwC Review

PwC

Assessment methods

Review areas (each with own set of assessment criteria)

Methodology of review The review covered four areas, and three assessment methods, to develop a set of key issues and recommended improvements.

Page 19

April 2015

1. Desktop review

HSPR & PMR

• Assessed the suite of performance indicators

• Assessed the performance indicator methodology

• Assessed the structure and content of the HSPR and PMR

3. Stakeholder workshops

Issues & Recommendations

2. Individual interviews

Performance Monitoring & Evaluation

• Assessed the performance score methodology

• Assessed the relationship between HSPR and PMR

• Assessed the performance monitoring process

DoH

Identifying key issues with the current Health Service Performance Reports and the monitoring process

Combined DoH & Health Services

Consolidating the issues identified at the Health Services and DoH workshops, and starting to discuss potential improvements

Health Services

Identifying key issues with the current Health Service Performance Reports and the monitoring process

Covered:

• The format of the HSPR and PMR, and the performance indicators included in each

• The Health Service performance monitoring and evaluation methodology (including the performance scoring methodology)

Reporting structure and content

Performance Monitoring and Evaluation (PME)

Performance indicators (PIs) Performance scoring and rating

Independent review of the HSPR and PMR

Page 20: PwC Review

PwC

Overall assessment aims: • 29 assessment criteria were developed and agreed with the project sponsors, to cover the 4 key focus areas.

• The criteria were aligned to the “overall assessment aims” (see right) which were derived from the scope documentation.

• The criteria were based on leading practice principles, in addition to performance management guidelines, and national and international comparison.

• The assessment criteria were used in the following ways:

1. To inform the desktop independent review

2. To develop the key question sets for the individual interviews and stakeholder workshops, which formed the basis of discussions

3. To identify key issues and recommendations for improvement

The full set of assessment criteria and the findings against each are presented in Appendix F.

Assessment criteria The review set 29 assessment criteria based on leading practice research. These were used to identify key issues within the four focus areas.

Page 20

April 2015

• Structure – is the structure of the reports logical, and easy to understand and analyse

• Content – do the reports provide a complete and comprehensive representation of the performance

Reporting structure and content

• PME process – is the method of monitoring and evaluating performance appropriate and successful

PME

• Aligned – do the PIs align to the strategic objectives of the organisation

• Relevant – is the information relevant to the target audience and sufficiently detailed to inform decision making

• Timely – is the information readily available and accessible and does not delay decision making

• Accurate – is the information accurate and consistent to be a trustworthy indicator of performance

• Complete and comprehensive – does the information provide a complete and comprehensive representation of performance

Performance indicators

• Performance score – is the method of scoring performance appropriate and understandable

Performance scoring and rating

For a full list of Assessment criteria see Appendix F

Independent review of the HSPR and PMR

Page 21: PwC Review

PwC

Some features of good performance reporting / monitoring in other jurisdictions:

Documentation reviewed

The desktop review looked at a wide range of documentation relating to the four focus areas, including:

• The HSPR and PMR reports (including the accompanying commentary and summary reports)

• Relevant guidance and definitions

• Relevant processes and policies

Leading practice / other jurisdiction comparison

• The desktop review compared these reports, processes and policies to the following:

1. Literature on leading practice performance reporting / monitoring

2. Examples from other Australian states and UK jurisdictions

• The assessment criteria were used to highlight key issues and what worked well in the focus areas

Desktop review The desktop review assessed relevant reports, processes and policies against leading practice and examples from other jurisdictions.

Page 21

April 2015

Report structure / content Performance monitoring and evaluation (PME)

Performance indicators Performance scoring

New South Wales

• Simple ‘traffic light’ rating system used to indicate performance

• Clear commentary included with graphs detailing exceptional circumstances (‘Interpretations’)

• Most condensed report at 35 pages, potentially increasing useability

• High degree of workforce engagement through ‘YourSay’

• Clear and well documented performance review process

Victoria

• Mature Performance assessment scoring (‘PAS’) methodology

• Visually well laid out reports that are easy to read with key messages highlighted

• Good practice in PME having implemented a process for addressing /escalating data issues that is embedded in the framework

• ‘Force majeure’ clause to deal with unforeseen data circumstances

Queensland

• Simple and well laid out PI table & detailed metrics to be measured

• Fewest PIs measured with limited administration burden

• Well defined roles and responsibility for escalating issues

• High degree of strategic alignment, with Government direction objectives flowing through to Agency level strategic objectives

UK

• Extensive range of indicators comprehensively covering all key areas of performance

• Clear performance reporting and monitoring requirements

• Timely collection and reporting on performance

• Alignment of indicators to outcomes and national strategic health priorities

Independent review of the HSPR and PMR

Page 22: PwC Review

PwC

Individual interviews and stakeholder workshops Stakeholder interviews and workshops were held to support the desktop review in identifying and prioritising key issues, and developing potential solutions.

Page 22

April 2015

1. Individual interviews

One-on-one interviews were conducted with the chief executive officers of each Health Service, and senior DoH stakeholders*.

The purpose of these interviews was to obtain views on the key issues / what works well in the four areas of focus.

2. Stakeholder workshops

Three workshops (DoH, Health Services and a combined workshop) were held with key staff* involved in the Health Service performance reporting and monitoring processes.

The workshops included the following three activities:

• Assessment criteria scoring – participants voted on the extent to which they agreed with 17 statements (based on the assessment criteria). This highlighted what the participants felt were the biggest areas of concern / good practice.

• Group discussion: issues and root causes – participants were asked to identify the key issues and root causes of the four focus areas.

• Group discussion: potential solutions – participants were asked to develop ideas for potential solutions to these issues which could form the basis for recommendations for improvement.

*For a full list of stakeholders who attended the workshops and interviews please refer to Appendix C

Approach to Stakeholder workshops

Independent review of the HSPR and PMR

Page 23: PwC Review

PwC

Performance management overview

April 2015 Independent review of HSPR and PMR

Page 23

Page 24: PwC Review

PwC

A key challenge for healthcare providers in Australia today is balancing clinical performance with excellent customer service and regulatory compliance in an ABF/M environment. Advanced healthcare systems are moving toward greater efficiency, transparency and accountability, and this trend will continue, particularly in fiscally-constrained environments.

A means to support high performance in this environment is to ensure the use of a strategic performance management framework that is appropriate, fit for purpose, and organisation specific. This should be used as a tool to drive improvement, but also as an aid to flag and monitor issues and risks of facilities that are under-performing.

Benefits of a Performance Management Framework

Performance management and reporting provides a foundation for planning and budgeting by ensuring information on past performance is available to guide priorities and highlight changes required for the future.

The Commonwealth Government’s report ‘Better Practice in Annual Performance Reporting’ (Better Practice Report) demonstrates the importance of a PMF as a driver of continuous improvement and organisational change. Through the implementation of an effective PMF, all stakeholders should know what they are expected to achieve and have the motivation and incentive to deliver.

WA Health’s Annual Performance Management Framework 2014-15 is aligned to the WA Health Strategic Intent 2010-15 and the National Health Reform Agenda. The PMF is designed to support WA Health’s vision to deliver healthier, longer and better quality lives for all Western Australians. The PMF provides the health care system with a common set of performance objectives and targets.

WA Health’s PMF Principles are in line with the purpose of PMFs as recommended in the Better Practice Report and by the Australian Healthcare and Hospitals Association. These principles are shown on the following page.

Purpose of performance management Good performance management can help achieve strategic objectives through regular reporting and monitoring of performance indicators.

Page 24

April 2015

Mission Strategic objectives Performance

indicators Performance targets Performance results

Planning

Execution

Independent review of the HSPR and PMR

Page 25: PwC Review

PwC

Effective performance management is a continual process of monitoring, observing and communicating with staff and key stakeholders to provide constructive and actionable feedback about their performance. When there is a negative gap, effective performance management puts in place agreed strategies to close the gap and improve performance to necessary standards.

PMF principles There are 7 guiding principles of performance management that are outlined within the WA PMF.

1. Transparency – Clear and agreed performance targets and thresholds and well-defined intervention processes to address poor performance.

2. Accountability – Clearly defined roles and responsibilities to deliver health services at agreed standards and volumes.

3. Recognition – Recognition and reward for performance that is sustained and outstanding.

4. Consistency – Consistency with National and State health service delivery objectives, priorities and outcomes.

5. Integrated – A balanced approach that has clear linkages to clinical planning, budget activity and safety and quality priorities.

6. Service improvement focus – A strong focus on mechanisms to enhance service delivery and health care outcomes.

7. Escalation processes – Well-defined escalation processes and recovery pathways for performance concerns.

Page 25

April 2015 Independent review of the HSPR and PMR

Page 26: PwC Review

PwC

HSPR and PMR overview

April 2015 Independent review of HSPR and PMR

Page 26

Page 27: PwC Review

PwC

PMR overview Traditionally, the PMR has been the main report of the Health Services.

Page 27

April 2015

The PMR is produced quarterly, with two components published and distributed to various stakeholders.

1. PMR Full Report

The full report (circa 165 pages long) shows the full scope of data collected across the 57 PIs. This takes the form of a visual aid, and assesses each Health Service provider with a scorecard. The report then drills down into the results of each PI for each facility within that category. No commentary is included within this report, with the exception of noted issues impacting on reporting an indicator.

The report is accessible via an internal web portal, but requires user familiarity to access information and generate relevant reports.

2. PMR Summary Report

The summary report (circa 23 pages) is prepared for the Director General only. It is used to summarise the findings of the month, and to depict the ‘performance score’ of each facility and Health Service. It is the only report where this score is shown.

The report provides a visual comparison of all the Health Services’ performance scores, and details the targets and thresholds of each indicator measured. Individual facility scores are shown visually through the scorecard, with no further drill down available within this report.

PMR - scorecard and detailed report example

PMR Summary Report

Further information in relation to the DoH (WA) PMF and performance management reports can be found in Appendix A

Independent review of the HSPR and PMR

Page 28: PwC Review

PwC

HSPR overview The Health Service Performance Report (HSPR) was introduced in August 2014 to focus on a core set of PIs that facilitate inter- and intra- Service comparison.

Page 28

April 2015

The HSPR was first published in August 2014, designed to be the discussion tool of the PMR. The aim was to provide more targeted and timely information and analysis; however, access and distribution methods are not clear.

1. HSPR

The HSPR is published monthly, and comprises:

• A summary page which shows overall Health Service performance against targets and thresholds for 14 PIs, where the results of each Health Service are shown side by side

• Separate tables which rank the Health Services for each PI

• More detailed pages which show each facility’s performance against each PI

The HSPR previously incorporated the Performance score, but its inclusion has ceased pending a review of the Performance score methodology.

2. HSPR Commentary

The HSPR Commentary is a separate document that accompanies the HSPR. The commentary is largely factual, and does not attempt to explain the reasons behind the performance results.

The HSPR Commentary also includes additional workforce information (FTE totals and movements) that is not included in the main HSPR.

HSPR

HSPR Commentary

Further information in relation to the DoH (WA) PMF and performance management reports can be found in Appendix A

Independent review of the HSPR and PMR

Page 29: PwC Review

PwC

1. Performance rating

Performance measurements Various ratings and scorings are presented for Health Service and facility performance. These differ between the HSPR and PMR.

Page 29

April 2015

Performance rating

Heat map

Highly performing

Performing

Underperforming

Not performing

Out of scope

Not calculable

Numbers withheld

Not available

A performance rating is given to each facility / Health Service depending on how their measurement fits into a threshold. There are four possible ratings, as depicted to the right (with additional colour codes given for out of scope etc.). These ratings are then given a score which is used to calculate the overall ‘Performance score’ (discussed below). The possible ratings and applicable scores are:
• Highly Performing = 4
• Performing = 3
• Under Performing = 2
• Not Performing = 0 (this is to ensure that very low performance is not rewarded in the calculation)

2. Performance scoring

Further information in relation to the DoH (WA) PMF and performance management reports can be found in Appendix A

Indicator | Max rating | Actual rating

Facility 1
PI 1 | 4 | 4
PI 2 | 3 | 0
PI 3 | 4 | 3
Total | 11 | 7
Facility 1 performance score (7 / 11 * 100) = 63.6

Facility 2
PI 1 | 4 | 2
PI 2 | 3 | 2
Total | 7 | 4
Facility 2 performance score (4 / 7 * 100) = 57.1

Health Service performance score ((7 + 4) / (11 + 7) * 100) = 61.1

Independent review of the HSPR and PMR

• The Performance score is calculated for each facility and Health Service by converting the performance ratings for each PI into a numeric value (this is the numerator). The maximum possible numeric value (which reflects the best possible performance on all indicators) is also calculated (this is the denominator). The Performance score represents the numerator divided by the denominator, multiplied by 100. The simple example shown above gives Facility 1 a score of 63.6.

• The Health Service performance score represents the average performance result across all of the facilities within that organisation. The simple example shown above gives a Health Service score of 61.1.

• Where a rating of ‘OS’ (out of scope), ‘NW’ (numbers withheld), ‘NC’ (not calculable) or ‘NA’ (not available) occurs for an indicator, the indicator is excluded from the calculation of the Performance score. Each facility and Health Service will have a varying number of indicators that make up the Performance score, because the scope of each indicator varies.
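To make the calculation concrete, the sketch below reproduces the worked example on this page. It is an illustrative reading of the published methodology only, not WA Health’s implementation: the rating values (4 / 3 / 2 / 0) and the exclusion codes come from the text above, while the function names and data structures are assumptions.

```python
# Illustrative sketch of the Performance score calculation described above.
RATING_SCORE = {"Highly performing": 4, "Performing": 3, "Under performing": 2, "Not performing": 0}
EXCLUDED = {"OS", "NW", "NC", "NA"}  # out of scope, numbers withheld, not calculable, not available

def facility_totals(indicators):
    """Return (achieved, maximum) rating totals for one facility, skipping excluded indicators."""
    achieved = maximum = 0
    for rating, max_rating in indicators:
        if rating in EXCLUDED:
            continue
        achieved += RATING_SCORE[rating]
        maximum += max_rating
    return achieved, maximum

def performance_score(achieved, maximum):
    """Performance score = achieved rating total / maximum possible total * 100."""
    return round(achieved / maximum * 100, 1)

# Worked example from this page: (rating, maximum rating) pairs per PI
facility_1 = [("Highly performing", 4), ("Not performing", 3), ("Performing", 4)]
facility_2 = [("Under performing", 4), ("Under performing", 3)]

a1, m1 = facility_totals(facility_1)  # (7, 11)
a2, m2 = facility_totals(facility_2)  # (4, 7)

print(performance_score(a1, m1))            # 63.6
print(performance_score(a2, m2))            # 57.1
print(performance_score(a1 + a2, m1 + m2))  # Health Service score: (7 + 4) / (11 + 7) * 100 = 61.1
```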

Page 30: PwC Review

PwC

Various ratings and scorings are presented for Health Service and facility performance. These differ between the HSPR and PMR.

Page 30

April 2015

In addition to the two main methods for rating performance discussed on the previous page, there are additional metrics assessed that indicate the level of performance for each facility / Health Service, as discussed below:

3. Internal trends

The scorecard shown within the PMR and PMR summary report also provides a high level internal trend rating, to demonstrate movement in facilities’ performance over the past month.

4. Health Service provider ranking and performance rating

Whilst the ‘performance rating’ key and scoring method are used within the HSPR to illustrate performance against the 14 PIs included within this report, an additional scoring mechanism is included.

This ranks each Health Service provider against its peers for the chosen PI, before defining it as either:
• At or Above Target
• Below Target
This rating is not depicted in any other reporting tool within the PMF.
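As a minimal sketch of this ranking and labelling logic (the report describes the output but not an algorithm, so the ordering rule, PI values and target below are hypothetical assumptions):

```python
# Minimal sketch of peer ranking with At or Above Target / Below Target labelling.
# The example results, target and 'higher is better' direction are hypothetical.

def rank_against_target(results, target, higher_is_better=True):
    """Rank Health Services for one PI and label each against the target."""
    ordered = sorted(results.items(), key=lambda kv: kv[1], reverse=higher_is_better)
    rows = []
    for rank, (service, value) in enumerate(ordered, start=1):
        meets = value >= target if higher_is_better else value <= target
        rows.append((rank, service, value, "At or Above Target" if meets else "Below Target"))
    return rows

example = {"Health Service A": 81.2, "Health Service B": 74.5, "Health Service C": 78.9}
for row in rank_against_target(example, target=80.0):
    print(row)  # (1, 'Health Service A', 81.2, 'At or Above Target'), then C and B as 'Below Target'
```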

Performance measurements (con’t)

Further information in relation to the DoH (WA) PMF and performance management reports can be found in Appendix A

Independent review of the HSPR and PMR

Page 31: PwC Review

PwC

Detailed review findings

April 2015 Independent review of HSPR and PMR

Page 31

Page 32: PwC Review

PwC

Assessment criteria summary The main findings of the review arose from the analysis against the assessment criteria below. Detailed analysis of each assessment criterion is in Appendix F.

Page 32

April 2015

Focus area: Performance indicators

Assessment criteria | PMR / HSPR meets criteria
Performance indicators (PI) measure the achievement of the organisation’s strategic objectives | Partially meets
Targets and thresholds are robust and align with national standards. Considerations include past performance, the performance of other similar authorities, and the resources available. | Partially meets
PIs measured are actionable with the results easily reproducible by the Health Services | Partially meets
The PIs and how they are calculated are sufficiently clear and easy to understand | Meets
The data sources from which PIs are calculated are clear and accessible to Health Services, and capable of drill down for further analysis | Does not meet
Results against the PIs can be collected and reported in a timely manner | Does not meet
PIs measure recent performance with an appropriate level of frequency | Does not meet
The PIs statistical calculation methodologies are robust and not open to misinterpretation / miscalculation | Partially meets
PIs and how they are measured are comparable across organisations | Meets
The suite of PIs cover all necessary measures to assess that area of performance | Does not meet
The number of PIs is not overly excessive | Partially meets

Focus area: Performance scoring

Assessment criteria | PMR / HSPR meets criteria
The Performance score uses statistical validity and an appropriate weighting of performance for all contributing performance indicators | Partially meets
The Performance score is calculated in a transparent and understandable manner | Meets
The Performance score is an appropriate measure of overall Health Service performance | Does not meet

Focus area: Reporting structure and content

Assessment criteria | PMR / HSPR meets criteria
Reports are structured in a logical way that facilitates clear understanding of the data presented | Partially meets
Reports appropriately highlight particular areas of strong / weak performance | Partially meets
The reports contain trends and forecasts of performance | Partially meets
Sufficient and meaningful commentary accompanies the reports to present a full and balanced picture of performance | Does not meet
Reports identify the key risks to performance | Does not meet
The HSPR and PMR satisfy sufficiently different user needs | Partially meets
The reports align to the organisational and governance structures of Health Services | Partially meets

Focus area: PME

Assessment criteria | PMR / HSPR meets criteria
PME processes are streamlined and efficient | Partially meets
There is sufficient time in the PME process for all parties to appropriately review performance | Partially meets
Appropriate individuals are involved in the PME process | Meets
Sufficient checks are performed to validate the accuracy of reported performance | Partially meets
Roles and responsibilities in the PME process are clearly defined | Does not meet
PME meetings cover all key elements required to discuss and assess performance | Does not meet
There is clear accountability for who is going to take actions arising from the PME process | Partially meets

Key: Meets criteria / Partially meets criteria / Does not meet criteria

The detail behind the ratings is shown in Appendix F

Independent review of the HSPR and PMR

Page 33: PwC Review

PwC

Focus area 1: Performance indicators

April 2015 Independent review of HSPR and PMR

Page 33

Page 34: PwC Review

PwC

Section Content Pages

Section 4

Performance indicators 33

4.1 What works well 35

4.2 Summary of issues and recommendations 37

4.3 Desktop review analysis 39

Key issue 1: There is limited alignment in both reports between the PIs and WA Health strategic objectives and other key reporting requirements (e.g. the Annual Report).

40 – 41

Key issue 2: There is some lack of clarity on the specific rationale for setting some of the targets and thresholds.

42 – 44

Key issue 3: The suite of PIs in the HSPR (and to a lesser extent the PMR) lack sufficient measures to comprehensively assess some key areas of performance.

45 – 47

Key issue 4: Although comprehensive, the number of PMR PIs is higher than most jurisdictions, and some may not be essential to include in this report.

48

Key issue 5: A significant number of PIs within the HSPR and PMR are reported too infrequently and/or with a time lag that is too long to effectively assess performance.

50

Key issue 6: There is significant difficulty in drilling down into the source data to investigate further into the performance results (particularly for the HSPR).

51

4.4 Stakeholder feedback 52

4.5 Recommendations 55

Structure of this section

Page 34

April 2015

Performance indicators

Independent review of the HSPR and PMR

Page 35: PwC Review

4.1

What works well

Performance indicators

Page 36: PwC Review

PwC

Performance indicators - what works well Whilst the report focuses on identifying the key issues, the review has highlighted a number of aspects that work well in relation to performance indicators.


Page 36

April 2015

1. The number of PIs measured within the PMR is reasonably comprehensive (although there are some significant gaps).

2. There are detailed descriptions of the rationale, data sources, reporting frequency and calculation methodology of the PIs – both for the PMR and the HSPR.

3. The suite of PIs within the PMR largely align to national standards, allowing facilities and Health Services to understand the standards they are expected to deliver against.

4. The trend analysis depicted within the PMR is detailed and supported by supplementary information which helps explain the trends.

5. The PMR allows users the ability to drill down further into each PI / rating.

6. Feedback indicated that stakeholders liked that the PI definitions and reporting level / frequency were listed within the PMR.

Performance indicators

Independent review of the HSPR and PMR

Page 37: PwC Review

4.2

Performance indicators - summary of key issues and recommendations

Performance indicators

Page 38: PwC Review

PwC

1. Confirm strategic objectives of the Health Services, and the role of performance reporting and monitoring in relation to these objectives. See pages 56 – 57

2. Refine the suite of PIs (in the HSPR in particular) to sufficiently cover all key areas of performance, including (See pages 58 – 59):

a) Financial

b) Workforce

c) Patient experience

d) Safety and quality

e) Lead indicators.

3. Develop a clear process by which HSPR PI suites are regularly reviewed. See page 60

4. Clarify and communicate how each target and threshold (specifically in the HSPR) has been set. See page 61

5. Redesign data sourcing processes to ensure that all key HSPR indicators are reported with sufficient regularity and minimal lag (minimum of quarterly if possible). See page 62

6. Provide source data together with HSPR PI results. See page 62

1. There is limited alignment in both the HSPR and PMR reports between the PIs and WA Health strategic objectives and other key reporting requirements (e.g. the Annual Report). See pages 40 – 41

2. There is a lack of clarity on the specific rationale for setting some of the targets and thresholds within the HSPR. See pages 42 – 44

3. The suite of PIs in the HSPR (and to a lesser extent the PMR) lack sufficient measures to comprehensively assess (some) key areas of performance. See pages 45 – 47

4. The PMR is not intended to measure Health Services’ performance exclusively (including for example metrics for HCN and HIN), and therefore contains metrics which are not relevant to, or completely controllable by, Health Services. See page 48

5. A significant number of PIs within the HSPR and PMR are not timely enough to effectively assess performance. See pages 49 – 50

6. There is significant difficulty in drilling down into the source data to investigate further into the performance results (particularly for the HSPR). See page 51


PIs - summary of issues and recommendations The review identified a number of issues in relation to the PIs, as summarised below, along with suggested recommendations for improvement.

Page 38

April 2015

For further information

refer to section 4.3

For further information

refer to section 4.5

Performance indicators

Independent review of the HSPR and PMR

Page 39: PwC Review

4.3

Desktop review analysis

Performance indicators

Page 40: PwC Review

PwC

• The Annual Report is the DoH’s main reporting requirement which is published to government and the public.

• There is only a very limited alignment between the PIs in the Annual Report with those in the HSPR and PMR.

• The consequence of this is that some key measures of performance which are reported externally are not being regularly tracked and discussed between DoH and Health Services.

• This may have resulted in the large number of additional reports which are prepared in relation to Health Service performance.

PI alignment in HSPR and PMR

• Only 7 of the PMR indicators align to PIs in the Annual Report.

• Only 2 of the HSPR indicators align to PIs in the Annual Report.

• Some key PIs which are reported in the Annual Report are missing from the HSPR, including

− Some safety and quality metrics (e.g. survival rates, hospitalisation for preventable diseases)

− Patient experience / satisfaction

− Some key financial and efficiency metrics (e.g. net costs of services, staff costs / numbers)

It should be noted that there is currently a separate review being undertaken to assess the suite of indicators within the Annual Report, to determine whether they are the best measures of overall WA Health performance.

Key issue 1: Alignment to DoH Annual Report There is limited alignment between the Annual Report’s PIs and those in the HSPR and PMR.

Page 40

April 2015

HSPR and PMR measures that are similar to those in the Annual Report:

PMR Performance Measures | Annual Report 2013-14 Performance Measures
ED attendances with LOE <= 4 hours | Proportion of emergency department visits completed in four hours or less
Percentage of population who are overweight or obese: a) Adults b) Children | Proportion of adults living with obesity
In hospital mortality rates (for acute myocardial infarction, stroke, fractured neck of femur & pneumonia) | Survival rates for sentinel conditions of privately managed public patients
Measures of patient experience (including satisfaction) with hospital services | Patient satisfaction (various)
Percentage of children fully immunised at 12-15 months: a) Aboriginal b) Total | Percentage of fully immunised children
YTD distance of net cost of service to budget | Actual net cost of service to budget targets for WA health

HSPR Performance Measures | Annual Report 2013-14 Performance Measures
ED attendances with LOE <= 4 hours | Proportion of emergency department visits completed in four hours or less

Performance indicators

Independent review of the HSPR and PMR

Page 41: PwC Review

PwC

Strategic intent and the ‘four pillars’

• As outlined in the 2010-15 Strategic Intent, WA Health’s mission to improve, promote and protect the health of Western Australians is supported by the four Strategic Intent Pillars.

Most of the PIs in HSPR and PMR align to these strategic intent pillars

• 30 of the PMR PIs (53%) and 9 of the HSPR PIs (64%) align to one or more of the Four Pillars.

• However, there are 17 PMR PIs and 5 HSPR PIs which do not align with any of the missions outlined by the ‘Four Pillars’. These should be reviewed to determine whether they are a true priority for regular performance reporting and monitoring.

Some significant areas of the Four Pillars are not covered by HSPR PIs

• Pillars 2 and 4, and a significant proportion of pillar 1 are not covered by any HSPR PIs.

• The consequence is that a high proportion of strategic priorities is not being regularly reviewed at senior PME meetings between the DoH and Health Services.

Key issue 1: Alignment to ‘Four Pillars’ Although most HSPR and PMR PIs fall under the “Four Pillars”, there are some significant areas of the pillars which appear not to be covered by HSPR PIs.

Page 41

April 2015

Four Strategic Intent Pillars

1. Caring for individuals and the community
2. Caring for those who need it most
3. Making best use of funds and resources
4. Supporting our team

Pillar missions aligned to HSPR and PMR (only missions that align are shown) | Number of PMR PIs aligned to mission | Number of HSPR PIs aligned to mission

Pillar 1
• Achieving our stated targets for inpatient, outpatient, ambulatory and elective surgery performance | 27 | 9
• Working hard to meet our health promotion, illness prevention and early intervention targets, such as: (i) making sure our children are screened and immunised for a healthy future; (ii) screening adults for common cancers which can be treated with early detection; (iii) encouraging individuals to make healthy choices to maximise their health and wellbeing; (iv) remaining at the forefront of international medical research by investing in infrastructure and building partnerships with industry and individual researchers; (v) providing comprehensive child development services for children across the State | 3 | None

Pillar 2
• Working hard to close the gap in health and wellbeing between Aboriginal and non-Aboriginal Australians | 4 | None
• Promoting and improving the oral health of all Western Australians | 2 | None

Pillar 3
• Developing and rolling out a system of activity-based funding and management for our Health Services and hospitals, providing a clearer link between the dollars we spend and the services we provide to patients and the community | 4 | 1
• Being open and accountable to the Western Australian Parliament and the public about the way we use the resources entrusted to us | 11 | 5

Pillar 4
• Putting in place a state-wide strategic workforce plan to help us make sure we have the right people to staff our hospitals and Health Services | 2 | None
• Continuing to provide opportunities for staff to grow and advance through WA Health's pioneering graduate and leadership programs | 2 | None

Performance indicators

Independent review of the HSPR and PMR

Page 42: PwC Review

PwC


• The percentage of targets aligned to national standards in the HSPR and PMR (January 2015) is shown below. As the HSPR does not use targets consistent with the PMR, it has considerably less alignment to national standards (only 64%, compared with 77% for the PMR).

• Where there is no strategic alignment to national standards, targets have been developed by the Resourcing and Performance Division, in consultation with key WA Health directorates. However, the process used to set these targets is not clear.

Key issue 2: Target alignment to national standards The PMR PI targets are more closely aligned to national standards than those in the HSPR, where over a third of targets are set by the Resourcing and Performance Division.

Page 42

April 2015

HSPR targets: 64% aligned to national standards; 36% set by the Resourcing and Performance Division.

PMR targets: 77% aligned to national standards; 23% set by the Resourcing and Performance Division.

Additional supporting PI desktop review analysis can be found in Appendix B

Performance indicators

Independent review of the HSPR and PMR

Page 43: PwC Review

PwC

• There is a lack of synergy between the two reports that creates a fragmented picture of current performance.

• Performance targets utilised in the HSPR do not always clearly align with performance targets in the PMR (as shown in the table on the right). By having different targets in the HSPR and PMR, separate analyses and interpretations are required. This adds to the administrative burden of reporting and makes it difficult to track performance from the PMR to the HSPR. Furthermore, it becomes unclear as to which report is the more relevant measure.

• Additionally, the HSPR and PMR both list performance targets, but thresholds are only listed in the PMR report. As such, the HSPR requires users to source thresholds from the HSPR Indicator Overview manual, but this document is not easily accessible. This compromises the ‘openness and transparency’ principle of the PMF.

• There is some perception amongst Health Services’ stakeholders that the HSPR performance targets have been changed without notice and consultation – however this consultation did occur. This points to an issue around there being no clear, documented and communicated process for how these targets are reviewed and amended.

There is a lack of synergy between the HSPR and PMR PI targets that impairs consistency and creates multiple sources for performance ratings.

Page 43

April 2015

Description | PMR target | HSPR target

EA1 Proportion of emergency department patients seen within recommended time (Note: the HSPR lists this as the percentage not seen within recommended time) | Cat 1: 100%; Cat 2: 80%; Cat 3: 75% | 76.67%

EQ3 Staphylococcus aureus bacteraemia infections | 2 per 10,000 patient days | <= 1 per 10,000 occupied bed days

EQ7 Death in low-mortality DRGs | National Peer Rate | 100

Key issue 2: HSPR & PMR target consistency

Performance indicators

The HSPR commentary report shows PI targets and comments on the performance rating, but there is no indication of thresholds for each rating.

Independent review of the HSPR and PMR

Additional supporting PI desktop review analysis can be found in Appendix B

Page 44: PwC Review

PwC

Key issue 2: lack of clarity of targets within HSPR There is a lack of clarity around how some targets within the HSPR are set, with the definitions manual often providing only vague information.

Page 44

April 2015

Example from the HSPR • The targets within the HSPR at times differ from the PMR, as demonstrated previously. This creates confusion around which performance measures should be used, and a ‘single source of truth’ for Performance Ratings becomes lost due to the lack of target alignment.

• Furthermore, there is a lack of detail around how the targets for the HSPR have been set. The HSPR Indicator Overview manual lists the targets for each PI. However, the target source description is often vague, including descriptions such as ‘established for the 2014-15 HSPR’ with no further details provided (examples shown to the right).

• This is evident for 67% (12) of PIs listed in the manual.

• The remainder largely align to national standards.

• Some examples of the above are shown to the right for different PIs.

An unclear approach to target-setting can undermine a target's appropriateness as a realistic threshold for performance. It is important that Health Services clearly understand the methodology and are consulted to determine targets in a collaborative manner.

Performance indicators

Independent review of the HSPR and PMR

Additional supporting PI desktop review analysis can be found in Appendix B

Page 45: PwC Review

PwC

Although positive feedback has been received from many stakeholders in relation to the streamlined nature of the HSPR, the number of indicators it tracks is lower than in international and national peer comparisons:

• In comparison to these peers, there are a number of key indicators of performance lacking in the following areas:

− *Safety & Quality e.g. unplanned returns to theatre and serious incidents investigated

− Patient experience – survey results, complaints

− Workforce – staff satisfaction, training / appraisals conducted

− Finance / efficiency – private patient revenue, length of stay

− Non hospital metrics – health outcomes/ long-term condition related metrics

− Proactive ‘lead as opposed to lag’ indicators

• Failure to sufficiently manage these indicators has been shown to be a key driver of failing healthcare organisations, such as Mid Staffordshire (see the Mid Staffordshire findings below).

• In addition, some key financial metrics which are fundamental reporting requirements to government and the Public (including through the Annual Report), are missing from the HSPR.

• A number of these PIs are included in the PMR; however, given the limited current usage of the PMR as a performance tool, it is unclear to what extent these are being monitored between DoH and the Health Services.

Some key PIs which are prominently tracked in other jurisdictions are missing from the HSPR.

Page 45

April 2015

Key issue 3: HSPR PIs

Mid Staffordshire NHS Foundation Trust Public Inquiry – Key Relevant Findings

The inquiry into the failings of Mid Staffordshire found a number of causes directly related to the organisation's approach to performance management:

1. Focus on financial results as opposed to quality measures

2. Patient survey results not acted upon

3. Staff survey results not acted upon

4. Limited clinical risk management

5. Inadequate processes for dealing with complaints

6. Inadequate tracking and risk assessment of staff reductions

7. Poor staff leadership, recruitment and training

Performance indicators

Independent review of the HSPR and PMR

*Additional Commentary on recent DoH S&Q Indicator can be found in Appendix K

Page 46: PwC Review

PwC

The following indicators are tracked with prominence in other jurisdictions but are not included in the HSPR.

Page 46

April 2015

Key issue 3: HSPR PIs (con’t)

Area | Indicator type | Example indicator descriptions

Safety & Quality | Ambulance handover | Handover time between ambulance and ED
Safety & Quality | Incorrect procedures / complications | Incorrect procedures: operating theatre, resulting in death or major loss of function (number); number of complications arising from procedures
Safety & Quality | Mental health | Care Programme Approach (CPA): the percentage of Service Users under adult mental illness specialties on CPA who were followed up within 7 days of discharge from psychiatric in-patient care
Safety & Quality | Survival rates | Survival rates for key conditions (e.g. stroke, cardiac)
Safety & Quality | Serious incidents reporting | Serious incidents followed up within set timeframe
Safety & Quality | VTE risk assessment | Proportion of inpatient Service Users undergoing risk assessment for VTE
Patient Experience | Patient survey | Satisfaction rates, participation rates
Patient Experience | Patient complaints | Number of complaints, number of complaints addressed within set timeframe

The rationale for including these indicators, and additional supporting PI desktop review analysis, can be found in Appendix B

Area | Indicator type | Example indicator descriptions

Finance / Efficiency | Capital spend | Capital spend to plan, forecast capital spend to plan
Finance / Efficiency | Length of stay | Overall length of stay, multi-day length of stay, daycase rate, pre-admission length of stay, day-of-surgery-admission rate
Finance / Efficiency | Net cost of services | Net cost of services to plan
Finance / Efficiency | Staff levels / costs | FTE to plan, staff costs to plan, nursing hours per patient day
Finance / Efficiency | Savings schemes | YTD savings delivered to plan
Finance / Efficiency | Forecast | Forecast net cost of surplus to plan, forecast cost to plan, forecast revenue to plan, forecast savings to plan
Finance / Efficiency | Private patient revenue | Private patient revenue to plan
Workforce | Staff appraisals | Percentage of staff with annual appraisal / performance review
Workforce | Staff survey | Percentage of staff who would recommend organisation to staff and patients
Workforce | Staff complaints | Number of complaints, number of complaints addressed within set timeframe
Access | Diagnostic tests | Percentage of Service Users waiting less than 6 weeks from referral for a diagnostic test

Performance indicators

Independent review of the HSPR and PMR

Page 47: PwC Review

PwC

Lead and lag indicators

• Lead indicators measure the likelihood of achieving a future target based on current performance, allowing organisations to proactively respond to risks. For example, staff satisfaction could be used as a lead indicator of staff turnover, as it is a contributing factor to staff retention.

• Lag indicators measure the achievement of performance after the target has been met/missed, meaning organisations must respond to risks reactively. For example, staff turnover is a lag indicator of staff retention.

• Leading practice suggests that the PMF should place a higher focus on lead indicators to drive improvement by proactively addressing risks to reduce the likelihood of an unmet target.

• Examples of lead indicators used in other jurisdictions include the following (a simple illustrative sketch is shown after these bullets):

− VIC monitors the number of patients on the Elective Surgery Waiting List as a lead indicator of achieving NEST targets.

− NSW monitors June expenditure projection as a lead indicator of being on budget for YTD expenditure.

− The NHS measures post-diagnosis care effectiveness as a lead indicator of health-related quality of life.

• The lead indicators currently included are primarily around vaccination and immunisation rates, and health conditions that may lead to further complications (such as smoking and obesity).

• The full suite of PIs used in the PMR is primarily made up of lag indicators, and the HSPR contains an even lower proportion of lead indicators.
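To illustrate the distinction, the sketch below shows a simple expenditure-projection lead indicator of the kind described for NSW. The budget and year-to-date figures are hypothetical and purely illustrative, not WA Health or NSW data.

```python
# A minimal sketch (illustrative figures only) of a simple expenditure lead indicator:
# projecting full-year spend from the year-to-date run rate so that a likely budget
# overrun can be flagged and acted on before it occurs.

def projected_year_end_spend(ytd_spend, months_elapsed, months_in_year=12):
    """Project full-year expenditure from the year-to-date monthly run rate."""
    return ytd_spend / months_elapsed * months_in_year

annual_budget = 120.0                 # $m, hypothetical
ytd_spend, months_elapsed = 65.0, 6   # $m spent after six months, hypothetical
projection = projected_year_end_spend(ytd_spend, months_elapsed)
print(projection, projection > annual_budget)  # 130.0 True -> flag the overrun risk now
```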

Compared to other jurisdictions, the HSPR and PMR have a lower ratio of lead to lag indicators.

Page 47

April 2015

Key issue 3: PI gap analysis - lack of lead indicators

Performance improvements are driven by a mix of lead and lag indicators, which allow organisations to respond to risks proactively and reactively.

Performance indicators

Independent review of the HSPR and PMR

Additional supporting PI desktop review analysis can be found in Appendix B

Page 48: PwC Review

PwC

The PMR has a number of essential PIs; however, there are some which could be more appropriately reported elsewhere, to focus the report on the performance of Health Services.

The PMR is not intended to exclusively measure Health Services' performance, and thus contains metrics which are not completely controllable by Health Services.

Page 48

April 2015

Key issue 4: PMR PIs

Dimension | Indicator | Commentary

Quality | Hospital accreditation | Although important, more of a governance requirement than a performance metric
Quality | Rate of total hospital readmissions within 28 days to an acute designated mental health inpatient unit | More of an Access target, and could incentivise poor behaviour
Inputs per Output | Average cost per test panel for PathWest | Questionable how controllable by Health Services
Inputs per Output | School Dental Service ratio of examinations to enrolments | Not a key metric typically prominently measured in other jurisdictions
Access | Proportion of eligible population receiving dental services from subsidised dental programs by group | Not a key metric typically prominently measured in other jurisdictions
Workforce | Proportion of medical graduates to medical staff | Questionable how controllable by Health Services
Workforce | Proportion of nursing graduates to nursing staff | Questionable how controllable by Health Services
Finance | Manually corrected payroll errors (underpayments) | Questionable how controllable by Health Services
Finance | Availability of Information Communication Technology (ICT) services: percentage of Service calls | Questionable how controllable by Health Services
Finance | Patient fee debtors | Questionable how controllable by Health Services
Finance | NurseWest shifts filled | A better metric (used by other states) could be premium staff spend, as this focuses on the key issue of the cost of using expensive agency staff
Finance | Accounts payable - payment within terms | Questionable how controllable by Health Services

Performance indicators

Independent review of the HSPR and PMR

Additional supporting PI desktop review analysis can be found in Appendix B

Page 49: PwC Review

PwC

Appropriateness of timeliness of PMR PIs

• While around half of the PIs are updated monthly (29 out of 57), the remaining PIs are updated less frequently, meaning large proportions of the report remain unchanged for significant periods of time, as shown to the right.

• This impacts the ability to identify and address concerning performance trends. This creates a significant performance risk to WA Health.

• In particular, only 1 of the 13 PMR PIs relating to quality effectiveness for WA Health Services is updated monthly.

• In comparison to NSW, WA Health is significantly less timely in the reporting of its key PIs.

− NSW for example updates 52% of PIs monthly, and only 10% annually (as opposed to 26% being updated annually in WA), as shown to the right.


Key issue 5: Timeliness of PMR PIs Nearly half of the PIs within the PMR are updated less than monthly, with 26% only measured annually, which is significantly less frequent than in most other jurisdictions.

Page 49

April 2015

PMR update frequency of PIs: Monthly 51%; Annually 26%; Quarterly 19%; Tri-annually 2%; Six-monthly 2%

NSW update frequency of PIs: Monthly 52%; Quarterly 31%; Tri-annually 7%; Annually 10%

Performance indicators

Independent review of the HSPR and PMR

Additional supporting PI desktop review analysis can be found in Appendix B

Page 50: PwC Review

PwC

Appropriateness of the suite of HSPR PIs

• While around half of the PIs are updated monthly, 6 out of 14 are only updated quarterly or annually, meaning large proportions of the report remain unchanged for significant periods of time.

• This impacts the ability to identify and address concerning performance trends. This creates a significant performance risk to WA Health.

• In particular, none of the four PIs relating to quality effectiveness for WA Health Services are updated monthly.

• The HSPR was introduced to streamline the number of PIs to focus on just the critical measures which senior DoH and Health Service staff should be discussing on a monthly basis. It is therefore essential that these PIs should be updated as frequently as possible.

Key issue 5: Timeliness of HSPR PIs Nearly half of the PIs within the HSPR are updated less than monthly, meaning a significant proportion of the report is unchanged each month.

Page 50

April 2015

HSPR Domains | # of PIs | Monthly data available
Access | 5 | 80%
Quality | 4 | 0%
Efficiency | 5 | 80%

43% of indicators included in the HSPR are reported less than monthly.

Performance ratings can only change month to month for fewer than 60% of indicators, meaning much of the information is outdated.

Most of the Access and Efficiency indicators are updated monthly. However, Quality scores would not change as there is no monthly data.

Performance indicators

Independent review of the HSPR and PMR

Additional supporting PI desktop review analysis can be found in Appendix B

Page 51: PwC Review

PwC

Key issue 6: Availability of data There is significant difficulty in drilling down into the PI source data to investigate the performance results further (particularly for the HSPR).

Page 51

April 2015

• Health Service performance data is provided to DoH using an online portal (for the PMR) or through Excel workbooks sent over email or a shared drive (for the HSPR). Data is provided by data custodians both externally and within the Health Services. These files contain the information required by DoH to assess performance against the PIs, unless data is omitted (for example, where it is out of scope or not available).

• As shown below, the data is then extracted to calculate Performance Ratings and perform quality assurance checks.

• Discussions with the Health Services highlighted data availability as a real issue. When the initial findings within the reports are circulated, Health Services are required to undertake their own analysis to understand and justify the rating, yet they do not have access to externally provided data despite it being a measure of their own performance.

• At present, Health Services are unable to access the data used internally at DoH, and must seek a number of executive approvals for access. This takes a considerable amount of time and, as such, Health Services are limited in which performance ratings they can investigate.

Working day of the month deadlines: 7th, 10th, 11th, 12th, 13th, 14th, 16th, 17th, plus dates that vary.

HSPR process: data custodians submit data; extract data for report; quality assurance; draft data report; data report reviewed by CEs and DDG; data report sign-off; commentary report prepared; commentary report tabled at Budget Steering Committee; report discussed at SHEF.

PMR process: data custodians submit data; extract data for report; commentary and quality assurance; draft for review by DG; sign-off and publish.

Performance indicators

Independent review of the HSPR and PMR

Additional supporting PI desktop review analysis can be found in Appendix B

Page 52: PwC Review

4.4

Stakeholder feedback

Performance indicators

Page 53: PwC Review

PwC

Perception profiling – performance indicators The scoring profiles assessed within the workshops are shown below. This demonstrates a more positive view among DoH stakeholders than Health Services.

Page 53

April 2015

[Chart: average workshop scores by statement (scale 1.00 to 5.00), DoH average score vs Health Services average score]

Statements assessed:
• Within the PMR, the suite of PIs clearly align to the strategic objectives of WA Health
• Within the PMR, the PIs are comprehensive, and do not leave any significant area of performance unmeasured
• Within the HSPR, the suite of PIs clearly align to the strategic objectives of WA Health
• Within the HSPR, the PIs are comprehensive, and do not leave any significant area of performance unmeasured
• The targets and thresholds which have been set align to realistic and sufficiently challenging expectations of Health Service performance
• Data sources from which PIs are calculated are clear and accessible to Health Services, and capable of drill down for further analysis
• Data used is accurate and comparable across organisations
• Data used is as timely as it needs to be

• Perception profiling can be used to demonstrate an overall view or ‘perception’ of a stakeholder group.

• The profile demonstrates a more positive opinion within the DoH workshop regarding the performance indicators, with an overall average score of 3.19 versus 2.29 for the Health Services.

• This is particularly true with regard to the accuracy and availability of data, where the views of the Health Services group were widely negative. This was reflected in the comments received, with data access a key issue for all Health Services workshop attendees.

• Individually, DoH voted higher on every single question compared to the Health Services.

• However, in relation to realistic targets, both workshops voted similarly and broadly neutrally. This is perhaps indicative of a wider issue of limited understanding amongst stakeholders of how targets are set and the process by which to challenge them.

• Of all the questions asked of stakeholders, the total average score was lowest in relation to the suite of PIs within the HSPR. Stakeholder comments supported this, with most seeing the HSPR PIs as too few to accurately assess performance.

Answer key:

It is noted that these views are indicative and only represent the views of the attendees. For full scoring profiles please refer to Appendix D

Performance indicators

Independent review of the HSPR and PMR

Page 54: PwC Review

PwC

PIs: Key issues identified by stakeholders The key issues raised at the workshops and the individual stakeholder interviews are shown below.

Area: Performance indicators

Key issue: There is limited alignment to the organisation's objectives and other key reports
• Unclear what the purpose / objectives of the different reports are
• There is limited alignment to the WA Health four strategic pillars and the Annual Report

Key issue: There are some key performance measurement gaps in the suite of HSPR indicators
• Safety & Quality – key indicators missing such as incident investigation, patient experience, complication rates, ambulatory care
• Productivity – missing key indicators such as NHPPD, LOS, etc.
• Finance – need to align to governmental reporting (e.g. expenditure/revenue to budget, net cost of services, staffing costs, capital, etc.)
• Workforce – no indicators at all
• Non-hospital metrics – very few included (e.g. mental health, ambulatory care, community, prevention, diagnostics, etc.)

Key issue: Too much focus on lag rather than lead indicators, and no inclusion of risk
• Most of the indicators measure past performance rather than give an indication of future performance
• Risk (in particular clinical risk) is not reported in these reports, and is missing from other DoH/Health Service reports

Key issue: Lack of alignment between HSPR and PMR indicators
• Some HSPR and PMR indicators are calculated differently, leading to confusion and a lack of confidence in the data

Key issue: PI data has a significant time lag, and/or is reported quarterly/annually
• A number of the PIs measured within the PMR/HSPR are updated less than monthly, meaning a large proportion of the reports' content does not change
• Significant time lag on much of the PI data
• Much of the PMR data is collected monthly but only published quarterly

Key issue: Unclear how some of the targets / thresholds have been set, and some are seen as unrealistic
• Although each target has a source listed in the definitions manual, this is sometimes vague, and not widely understood
• SABSI (half the national target) and zero-rate targets have been highlighted as unrealistic

Key issue: Insufficient ability to drill down into the HSPR data
• Health Services find it difficult to drill into the data driving the results and identify the causes
• Health Services do not receive the full suite of backing data

Key issue: The methodology to calculate some of the PIs is confusing
• The calculation methodology of some indicators (such as the unit cost to price) is confusing – the finance indicators are particularly difficult to understand

Page 54

April 2015

Performance indicators

Independent review of the HSPR and PMR

Page 55: PwC Review

4.5

Recommendations

Performance indicators

Page 56: PwC Review

PwC

PIs - Recommendation 1 Confirm the goals and strategic objectives of the Health Services, and the role of performance reporting and monitoring in relation to these objectives.

Page 56

April 2015

Issue(s) addressed:

• There is limited alignment in both reports (principally the HSPR) between the PIs and WA Health strategic objectives and other key reporting requirements (e.g. the Annual Reports and the Operational plan metrics).

Key recommended steps:

1. With selected stakeholders from within DoH and Health Services, clarify the overarching goals of WA Health (this could be the Four Pillars).

2. Confirm or develop the specific strategies to achieve these goals, and agree the specific role of Health Services in doing so.

3. Agree the PIs necessary to track the achievement of these strategies (see Recommendation 2).

Recommendation benefits:

1. Ensures all priorities of WA Health, and the role of the Health Services in relation to these priorities, are clear and understood by all stakeholders.

2. Management focus is prioritised in the most important areas.

3. Helps determine the most important PIs to report on and monitor performance.

Goals What are you trying to achieve?

Performance indicators How do you measure success against goals and strategies?

Strategies How are you going to achieve them?

- PI 1 - PI 2 - PI 3

- PI 4 - PI 5 - PI 6

- PI 7 - PI 8 - PI 9

Strategy 1 Strategy

2 Strategy

3

Pillar 1 Pillar 2

For additional leading practice research please see Appendix G

Performance indicators

Independent review of the HSPR and PMR

Additional observations The ‘golden thread’ between strategic direction and PIs is clearly demonstrated in the diagram presented in the Grains Research and Development Corporation's Annual Report, which clearly links PIs to strategic objectives, policies and legal drivers.

Page 57: PwC Review

PwC

Additional observations – what the UK Outcomes Framework does well

• The UK Outcomes Framework is a leading practice example of how to align PIs to overarching objectives of a Health System.

• This helps ensure that the right indicators are being measured, and that all stakeholders understand why each measure needs to be reported, monitored and managed.

PIs - Recommendation 1 (con’t) Confirm the goals and strategic objectives of the Health Services, and the role of performance reporting and monitoring in relation to these objectives.

Page 57

April 2015

• The Outcomes Framework defines five overall strategic objectives which form the priorities for the health system.

• Within those objectives they then define the overarching indicators and improvement areas which require focus.

• These objectives and the indicators are reviewed and refined on an annual basis.

Performance indicators

Independent review of the HSPR and PMR

For additional leading practice research please see Appendix G

Page 58: PwC Review

PwC

PIs: Recommendation 2 Refine suite of PIs to sufficiently cover all key areas of performance, and align to strategic objectives and other key reporting requirements.

Page 58

April 2015

Issue(s) addressed:

• There is limited alignment in both reports between the PIs and WA Health strategic objectives and other key reporting requirements (e.g. the Annual Report).

• The suite of PIs in the HSPR (and to a lesser extent the PMR) lack sufficient measures to comprehensively assess some key areas of performance.

• The PMR does not exclusively measure Health Services' performance (including, for example, metrics for HCN and HIN), and therefore contains metrics which are not completely controllable by Health Services.

Key recommended steps:

1. Collate list of PIs from: existing PMR /HSPR; national and international examples; and other existing reports.

2. Separate the indicators into the domains.

3. Undertake an exercise with key DoH and Health Services staff to prioritise PIs for each domain (a simple prioritisation sketch is shown after this list), considering in particular:

a) Scale – extent to which the PI measures a large and critical area of business

b) Risk – extent to which the PI represents a significant risk if performance is not monitored regularly in this area

c) Alignment to strategic objectives

d) Alignment to other key reporting requirements

e) Lead / lag balance

f) Outcome / process balance

4. Assign PIs to different layers of the reporting hierarchy based on priority order (ensuring sufficient balance of all domains).

5. Determine whether data for each PI is of sufficient quality to be included in the report(s), or whether it should be withheld until data sourcing issues have been addressed

6. Determine data sourcing plan (if required), including process to obtain data with sufficient frequency and to a sufficient quality standard (see recommendation 5, page 62) .

7. Sign-off PI list with key stakeholders (final word with FPPG).
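The sketch below is one minimal way the prioritisation in step 3 could be run. The candidate PIs, the 1–5 scores and the equal criterion weights are hypothetical placeholders, not values proposed by this review.

```python
# A minimal prioritisation sketch for step 3: each candidate PI is scored 1-5 against
# criteria (a)-(f) and ranked so the highest-priority PIs can be assigned to the top
# layer of the reporting hierarchy. All scores and weights below are hypothetical.

CRITERIA = ["scale", "risk", "strategic_alignment", "reporting_alignment",
            "lead_lag_balance", "outcome_process_balance"]

def priority_score(scores, weights=None):
    """scores: dict criterion -> 1-5 rating; weights: optional dict criterion -> weight."""
    weights = weights or {c: 1 for c in CRITERIA}
    return sum(scores[c] * weights[c] for c in CRITERIA)

candidates = {
    "Unplanned returns to theatre": {"scale": 3, "risk": 5, "strategic_alignment": 4,
                                     "reporting_alignment": 3, "lead_lag_balance": 2,
                                     "outcome_process_balance": 4},
    "Staff appraisal completion":   {"scale": 2, "risk": 3, "strategic_alignment": 4,
                                     "reporting_alignment": 2, "lead_lag_balance": 4,
                                     "outcome_process_balance": 3},
}

ranked = sorted(candidates, key=lambda pi: priority_score(candidates[pi]), reverse=True)
print(ranked)  # highest-priority candidate first
```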

Additional observations:

• Some key PIs which are not currently captured by the HSPR and PMR, but which have prominence in other jurisdictions, should be considered, including additional PIs in the following domains (additional detail is shown in Appendix B):

− Safety & Quality e.g. unplanned returns to theatre and serious incidents investigated

− Patient experience – survey results, complaints

− Workforce – staff satisfaction, training /appraisals conducted

− Finance / efficiency – private patient revenue, length of stay

− Non hospital metrics – health outcomes/ long-term condition related metrics

• Some PIs which are currently reported in the PMR but which may be of lesser criticality, are shown on page 48.

• Existing and proposed new PIs should be SMART assessed to ensure appropriateness* and limit unnecessary duplication of additional Health Service internal performance reports.

Performance indicators

Independent review of the HSPR and PMR

*For additional leading practice research please see Appendix G

Page 59: PwC Review

PwC

Leading practice example: the UK Trust Development Authority Accountability Framework tracks a balanced set of indicators on a monthly basis.

PIs: Recommendation 2 (con’t) Refine suite of PIs to sufficiently cover all key areas of performance, and align to strategic objectives and other key reporting requirements.

Page 59

April 2015

• The TDA metrics cover a wide range of domains including Safety, Quality, Finance, Access, Patient Experience and Workforce.

• All metrics are reported monthly.

Performance indicators

Independent review of the HSPR and PMR

For additional leading practice research please see Appendix G

Page 60: PwC Review

PwC

PIs: Recommendation 3 Develop a clear process by which HSPR PI suites are reviewed

Page 60

April 2015

Issue(s) addressed:

• The process for reviewing PIs in the HSPR and PMR is not clear. While there is a technical process to review new PIs or proposed changes (for example, stress test templates), there is no clear indication of when PI reviews take place, and what the process is.

• Currently there have been frequent changes to PIs month on month, which creates confusion and an inability to accurately assess performance trends over a period of time.

Key recommended steps:

1. DoH and Health Services stakeholders should develop and agree a clear process for regularly reviewing the suite of PIs. This process should have the following features:

a) List of stakeholders to be involved in the review. Health Services and DoH should nominate a reasonable number of staff to be involved

b) Timeframes for review – ideally bi-annually for the first year, then annually

c) Process by which revised indicators will be communicated to stakeholders

d) Steps to review existing and proposed new PIs. A similar process to that outlined in recommendation 2 should be followed (see page 58)

Additional observations:

• Having a regular review of PIs can help prevent continuous discussions around whether the suite of PIs is appropriate, as stakeholders understand the process to review and amend these on a periodic basis.

• The Victorian Government department ‘Business Victoria’ recommends regular PI reviews to ensure that monitoring continues to be aligned with strategic objectives in a changing business environment, which is particularly relevant for WA Health in the current period of transition to ABF/M and upcoming leadership changes.

Performance indicators

Independent review of the HSPR and PMR

For additional leading practice research please see Appendix G

Page 61: PwC Review

PwC

PIs: Recommendation 4 Clarify and communicate how each target and threshold (specifically in the HSPR) has been set.

Page 61

April 2015

Recommendation part 4 (a): Clarify how each target is set

Issue(s) addressed:

• The documentation defining the targets is unclear and vague, particularly in relation to the HSPR.

Key recommended steps:

1. Obtain stakeholder feedback on appropriateness of current targets.

2. The first step is to set clear and appropriate targets which have considered the following:

a) National targets

b) Best practice benchmarks, etc..

c) Trends

d) Variations in performance, e.g. seasonal factors.

e) Cause-and-effect relationships, e.g. don’t set top level outcome targets before you have set appropriate targets for the enablers and inputs

f) Time lags (consider the Balanced scorecard and the time lags between the objectives).

3. Once appropriate targets have been set, robust documentation clearly stating the target definition, source and justification (particularly if differing from national standards) should be completed.
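As one illustration of the documentation called for in step 3, the sketch below records a target's definition, source and justification in a simple structure. The SAB figures are taken from the target comparison earlier in this section; the justification wording is hypothetical.

```python
# A minimal sketch of structured target documentation (definition, source, justification).
# The SAB target and national benchmark figures are those quoted earlier in this report;
# the justification wording is a hypothetical illustration only.
from dataclasses import dataclass

@dataclass
class TargetDefinition:
    indicator: str
    target: str
    source: str          # e.g. national standard, benchmark, trend analysis
    justification: str   # why the level is realistic and sufficiently challenging

example = TargetDefinition(
    indicator="Staphylococcus aureus bacteraemia (SAB) infections",
    target="<= 1 per 10,000 occupied bed days",
    source="Set below the national benchmark of 2 per 10,000 patient days",
    justification="Stretch target agreed in consultation with Health Services; reviewed annually",
)
print(f"{example.indicator}: {example.target} ({example.source})")
```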

Additional observations:

• The Better Practice Report suggests the use of the SMART model to ensure that targets are appropriate (see Appendix G).

Recommendation 4 (b): Communicate how each target is set

Issue(s) addressed:

• There is currently an inadequate process in place for consulting stakeholders on, and communicating, targets and thresholds.

Key recommended steps:

1. Identify the key stakeholders requiring this information and create a distribution list.

2. Circulate target / threshold sources for each new / revised target, as well as any associated stress testing.

3. Include source information in the PMF, and HSPR / PMR.

Recommendation benefits:

• Increasing knowledge within key stakeholder groups ultimately leads to the following benefits:

− Increased buy in of the process

− Increased understanding of the purpose

− Support of the PIs being measured

− Valuable collaboration

− Increased motivation

Performance indicators

Independent review of the HSPR and PMR

For additional leading practice research please see Appendix G

Page 62: PwC Review

PwC

Recommendation 6: Provide source data together with HSPR PI results.

Issue(s) addressed:

• There is significant difficulty in drilling down into the source data to investigate further into the HSPR performance results. This is due to the DoH sourcing a significant proportion of its PI data from external providers (such as hand hygiene data and death in low-mortality DRGs). As this does not always directly link to Health Services data, Health Services cannot always access it quickly. Feedback from Health Services has been that the administrative process for requesting this data can be excessively burdensome.

Key recommended steps:

1. Provide external source data to Health Services together with the HSPR report, if possible.

2. Where this is not possible, establish process for Health Services to access external sources directly to obtain information.

Recommendation benefits:

• These processes will allow Health Services to use the data for:

− Internal performance reviews and progress planning;

− Data validation for quality assurance checks as required in the PME process; and

− Sufficient analysis for the provision of accurate commentary to accompany the HSPR results (as per section 5 recommendation 5).

PIs: Recommendations 5 and 6 Redesign data sourcing processes to ensure that all key HSPR PIs are reported with sufficient regularity, and provide source data together with the HSPR.

Page 62

April 2015

Recommendation 5: Redesign data sourcing processes to ensure that all key HSPR indicators are reported with sufficient regularity and minimal lag (minimum of quarterly if possible).

Issue(s) addressed:

• Currently nearly half of the HSPR is updated less than monthly. As such, a significant proportion of each monthly report does not contain new information on performance.

Key recommended steps:

1. Identify the frequency of reporting requirements for each PI. This should be monthly unless there are exceptional reasons otherwise.

2. Align reporting frequency, data collection, and data availability to the purpose of reporting.

3. Clearly demonstrate the link between data sourcing and reporting purpose in the PMF and relevant reports.

Recommended benefits:

• Having information available to inform decisions and strategies is a key purpose of performance reporting.

• Data sources in the HSPR and PMR will be aligned to maintain a consistent measure of performance as a ‘single source of truth’.

• Decisions made based on the performance of a facility will be based on more up to date and reliable data.

Performance indicators

Independent review of the HSPR and PMR

For additional leading practice research please see Appendix G

Page 63: PwC Review

“To achieve agency-level accountability, indicators must not only be accurate and reliable, but must also be relevant.”

Source: NSW Performance Audit Report: Key Performance Indicators

Page 64: PwC Review

PwC

Focus area: Performance scoring

April 2015 Independent review of HSPR and PMR

Page 64

Page 65: PwC Review

PwC

Section Content Pages

Section 5

Performance scoring 64

5.1 What works well 66

5.2 Summary of issues and recommendations 68

5.3 Desktop review analysis 70

Key issue 1: Utilising an overall performance score for a Health Service does not sufficiently distinguish between areas of good and poor performance.

71

Key issue 2: There is limited understanding of how the overall performance score is calculated.

72

Key issue 3: There is a high number of performance rating levels (4) for the HSPR and PMR.

73

Key issue 4: There is a significant focus on comparing Health Services in the HSPR in particular which, given their differences, is not always meaningful.

74 – 75

Key issue 5: There is currently no weighting undertaken of PIs in determining the score, meaning that the relative importance of each domain is determined only by the numbers of indicators within that domain.

76

5.4 Stakeholder feedback 77

5.5 Recommendations 80

Structure of this section

Page 65

April 2015

Performance scoring

Independent review of the HSPR and PMR

Page 66: PwC Review

5.1

What works well

Performance scoring

Page 67: PwC Review

PwC

Performance scoring - what works well Whilst the report focuses on identifying the key issues and recommendations for improvement, a number of aspects work well in relation to performance scoring.

Page 67

April 2015

There were a number of positive aspects identified through the desktop review and stakeholder engagement, in relation to performance scoring, that are deemed to work well:

1. The desktop review found the scoring methodology to be simple, with the calculation clearly explained in the ‘Performance scoring methodology’ document (albeit there is some lack of understanding in Health Services of how this has been calculated).

2. The use of a colour key acts to highlight the key issues visually, and facilitates performance conversations around focus areas.

3. Similarly, the use of trend arrows in the PMR to show if the rating is improving or deteriorating is useful.

4. Awarding the ‘Not Performing’ facilities a score of 0 ensures non-performance is not rewarded.

The dashboard of the PMR – although very detailed – also provides a good overview of the scores and ratings

There were positive comments on the HSPR front page ‘scorecard’.

Independent review of the HSPR and PMR

Performance scoring

Page 68: PwC Review

5.2

Performance scoring - summary of key issues and recommendations

Performance scoring

Page 69: PwC Review

PwC

1. Remove the performance score as an overall measure of Health Service performance. See page 81

2. Agree an alternative approach to assessing Health Service performance, being either (See pages 82 – 84):

− Option A: an overall domain performance score methodology with interventions triggered by domain performance (See page 83), OR

− Option B: no performance score, with interventions triggered by specific indicator performance (See page 84). This review recommends Option B.

3. Reduce rating options to 3 (highly performing, performing, under performing) using traffic light ‘heat’ maps (red, yellow, green), which is standard throughout the HSPR and PMR. Clear definitions to be provided on what each rating means. See page 85

4. Remove the Health Service ranking pages from the HSPR. See page 86

5. Communicate methodology as to how performance rating and scoring is calculated. See page 87

Scoring - summary of issues and recommendations The review identified a number of issues in relation to the performance scoring, along with suggested recommendations for improvement.


Page 69

April 2015

Performance scoring key issues Recommendations

1. Utilising an overall performance score for a Health Service does not sufficiently distinguish between areas of good and poor performance. See page 71

2. There is limited understanding of how the overall performance score is calculated. See page 72

3. There is a high number of performance rating levels (4) for the HSPR and PMR. See page 73

4. There is a significant focus on comparing Health Services in the HSPR in particular which, given their differences, is not always meaningful. See pages 74 – 75

5. There is currently no weighting undertaken of PIs in determining the score, meaning that the relative importance of each domain is determined only by the numbers of indicators within that domain. See page 76

Performance scoring key issues

For further information

refer to section 5.3

For further information

refer to section 5.5

Performance scoring

Independent review of the HSPR and PMR

Page 70: PwC Review

5.3

Desktop review analysis

Performance scoring

Page 71: PwC Review

PwC

• Having an overall performance score can mask indicators / facilities which are having significant performance issues, as the focus is drawn to the overall score trend.

• Problems are also caused if data is not available for a particular indicator / facility. Currently, if this is the case the indicator / facility is excluded from the overall score comparison. This could incentivise the withholding of data in certain circumstances.

• Performance scores can also adversely influence behaviours in focusing management corrective action on indicators / facilities which are close to the boundary to a higher rating (and therefore a smaller improvement can have a large impact on the score), rather than on areas of performance which are in the most need of performance improvement.

• The performance score was recently removed from the HSPR, pending this review of its suitability as a performance measurement tool; however, it is still reported in the PMR.

• Additionally, variance in the group of PIs at the facility level impairs the comparability of the Performance score at the Health Service level, as discussed below:

Key issue 1: Performance scoring validity The method for calculating the Performance score is clear and simple, however having an overall score can mask specific issues.

Page 71

April 2015

Performance indicators included in the Performance score can be omitted based on the scope of the service or availability of data at the required quality standard. This differs at the facility level, reducing the number of facilities that can be compared like-for-like.

Performance scores for Health Services are an average of PIs reported at the facility level. The inherent risk of this approach is that, at a Health Service level, significantly underperforming facilities can be masked by better performing facilities; as such, there is a lack of visibility around areas requiring more extensive performance review. The Health Service Performance score is also impacted by these omissions, which impairs the accuracy of comparisons at this level.

Independent review of the HSPR and PMR

Performance scoring

Page 72: PwC Review

PwC

Key issue 2: lack of communication of scoring method Health Service stakeholders in general are confused by how the score is calculated, driven by a lack of clear communication on the methodology.

Page 72

April 2015

• The calculation methodology is simple and easy to replicate.

• However, a clear understanding is only gained through the information provided in the Performance score Methodology document.

• There is no clear indication of Performance score or Composite score calculation methodologies in the PMF, PME, or HSPR. This is likely to be the main cause of stakeholder feedback suggesting that the methodology was unclear.

• The misunderstanding may be due in part to the scoring methodology not being included within the ‘ABF/PMR User Guide’, and a lack of sufficient distribution of the ‘ABF and ABM: Performance score Methodology’, which explains the calculation in a clear and concise way.

Screenshot of ‘ABF and ABM: Performance score Methodology’

Independent review of the HSPR and PMR

Performance scoring

Page 73: PwC Review

PwC

Colour coding of ratings

• WA has more performance rating outcomes (4) than other jurisdictions, and this creates some confusion as to range of performance this represents.

[Images: WA, NSW and Queensland performance rating scales]

Key issue 3: Performance rating options In addition to the Performance rating, the HSPR uses an additional 2-threshold rating in the Health Service ranking section, which may cause confusion.

Health Services Performance rating comparison

• The process is further complicated by the inclusion within the HSPR of another 2-level performance rating (referred to as the Health Services comparison). Most stakeholders were unaware that this rating had a different meaning to the more widely used one. As such, a reader is likely to misinterpret the report, assuming the ‘red’ coloured Health Services within the right-hand page are ‘Not Performing’ when in fact they are ‘Below Target’, as depicted below.

Page 73

April 2015 Independent review of the HSPR and PMR

Performance scoring

Page 74: PwC Review

PwC

• The Performance score was developed to provide an indication of how Health Services are performing, and to create benchmarks for comparisons between Health Services. The Performance score is included in the PMR, but was taken out of the HSPR pending a review of the methodology.

• As discussed, movements and exclusions in the PIs that construct the Performance score are equally as important as the overall Performance score itself, as it is sensitive to the underlying performance ratings (actual and potential).

• As such, the capacity to utilise the Performance score as a benchmark comparison between Health Services is impaired. One Health Service will not have exactly the same PIs and performance rating potential as another. Furthermore, comparisons at this high level ignore the critical details that must be interpreted in conjunction with the Performance score; that is, movements in performance ratings and exclusions of PIs.

• Furthermore, utilisation of the Performance score in monthly HSPR reports may not be appropriate given the frequency with which data is reported. For PIs that are reported less than monthly, the data cannot be refreshed and therefore the Performance score is constructed on outdated information, which reduces the accuracy of the measure. This is shown below.

Key issue 4: Health Service comparison The overall Performance score is driven by the PIs that construct it, but this varies between Health Services, reducing the comparability of the measure.

Page 74

April 2015 Independent review of the HSPR and PMR

Performance scoring

Page 75: PwC Review

PwC

Key issue 4: The impact of exclusions Excluded / unavailable information can skew the calculation of the overall performance score.

Page 75

April 2015

• The Performance score is calculated based on the performance ratings for each PI. However, indicators can be excluded if they are out of scope (in regard to the service provider's model of service), or if there are issues with data availability and quality (see the exclusion categories below).

• The Performance score is sensitive to changes in the underlying performance ratings, so exclusions can have a considerable impact on the overall score. Movements in, and exclusions of, performance ratings are equally important when interpreting the Performance score. Excluded indicators are denoted by assigning them to one of four categories (see below).

• While data under the NA category will be updated in the next reporting period (if available), the key issue is that the monthly HSPR does not track movements in performance ratings, and as such changes in previous ratings cannot be identified.

• During the leading practice research it was evident that one jurisdiction has chosen to mitigate the above risk by accounting for “accepted force majeure claims that have impacted on performance”. Where exclusion circumstances arise, the impact on performance ratings can be controlled by using historical data to project the performance rating, and adjusting once data becomes available.

Exclusion categories
Out of Scope (OS): assigned when a particular type of activity does not apply at a facility level
Not Calculable (NC): assigned where a value is divided by zero
Numbers Withheld (NW): assigned if the number of cases is less than the agreed reporting threshold
Not Available (NA): assigned when data or a data quality statement is not provided / available by the reporting deadline

Performance score example
Example 1 (all data available): actual ratings EA4a 3, EA4b 0, EA4c 2 (total 5); maximum ratings 3, 4, 4 (total 11); Performance score 45
Example 2 (some data withheld or not available): actual ratings EA4a 3, EA4b NA, EA4c 2 (total 5); maximum ratings 3, NA, 4 (total 7); Performance score 71
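A minimal sketch of the arithmetic implied by the worked example above is shown below; it is not WA Health's actual calculation code. The score is the sum of actual ratings over the sum of maximum ratings, expressed as a percentage, with excluded indicators dropped from both sides, which is how the exclusion in Example 2 lifts the score from 45 to 71 with no change in underlying performance.

```python
# A minimal sketch (not WA Health's actual code) of the scoring arithmetic implied by
# the worked example above: excluded indicators (OS, NC, NW, NA) are dropped from both
# the actual-rating total and the maximum-rating total before the percentage is taken.

EXCLUSION_CODES = {"OS", "NC", "NW", "NA"}

def performance_score(ratings):
    """ratings: list of (actual, maximum) pairs; either value may be an exclusion code."""
    included = [(a, m) for a, m in ratings
                if a not in EXCLUSION_CODES and m not in EXCLUSION_CODES]
    actual_total = sum(a for a, _ in included)
    max_total = sum(m for _, m in included)
    return round(100 * actual_total / max_total) if max_total else None

# Example 1: all data available -> 5 / 11 -> 45
print(performance_score([(3, 3), (0, 4), (2, 4)]))
# Example 2: EA4b not available -> 5 / 7 -> 71, a higher score with no change in performance
print(performance_score([(3, 3), ("NA", "NA"), (2, 4)]))
```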

Independent review of the HSPR and PMR

Performance scoring

Page 76: PwC Review

PwC

Weighting of PIs

• Performance indicators in the PMR are equally weighted to calculate the Performance score. However, this may not accurately reflect priority areas: as shown in the table below, the number of indicators in each dimension effectively determines the weight of that dimension and domain, regardless of its relative priority.

• Leading practice research has demonstrated that other jurisdictions have incorporated weightings into their performance score to ensure that priority indicators remain a focus for service providers. This model varies the weight of performance ratings to factor in priority areas: for example, priority indicators award additional points for achieving performance.

• In the leading practice example, one jurisdiction has chosen to have a higher weighting for SAB rates than for HiPS, so service providers are given an incentive to perform highly against SAB measures. This could be aligned to strategic objectives to ensure priorities are adequately set in service delivery.

• Applying weightings to PIs supports the link between performance measures and strategic objectives, and provides incentive for service providers to apply concerted effort in line with strategic direction.

Key issue 5: PIs weighting WA PMF weights PIs equally, in contrast to leading practice research where greater importance is placed on priority PIs.

Page 76

April 2015

WA PMR current assessment

Domain | Dimension | % of dimension* | % of domain*
Effectiveness | Access | 37 | 47
Effectiveness | Appropriateness | 15 |
Effectiveness | Quality | 48 |
Efficiency | Inputs per output unit | 100 | 19
Equity | Access | 100 | 9
Sustainability | Workforce | 100 | 11
Sustainability | Facilities & equipment | 0 |
Processes | Coding | 38 | 14
Processes | Finance | 63 |

The number of PIs determines the weight given to each dimension and domain. For example, Quality is the highest-weighted dimension within the Effectiveness domain, and Effectiveness is the highest-weighted domain overall.

Leading practice analyser (national example): this jurisdiction places greater priority on specific PIs by having indicator-specific weightings, which act as additional incentives for service providers to apply concerted effort.
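The sketch below illustrates, under assumed ratings and an assumed 3x weight on the SAB indicator (neither taken from an actual jurisdiction's model), how indicator-level weights shift the overall score relative to the equal weighting currently used in the PMR.

```python
# A minimal sketch of indicator-weighted scoring. The ratings and the 3x weight on the
# SAB indicator are assumed values for illustration, not any jurisdiction's actual model.

def weighted_score(ratings, weights=None):
    """ratings: dict PI -> (actual, maximum) rating; weights: dict PI -> weight (default 1)."""
    weights = weights or {}
    actual = sum(a * weights.get(pi, 1) for pi, (a, m) in ratings.items())
    maximum = sum(m * weights.get(pi, 1) for pi, (a, m) in ratings.items())
    return round(100 * actual / maximum)

ratings = {"SAB rate": (1, 4), "HiPS": (4, 4), "Hand hygiene": (3, 4)}  # hypothetical ratings

print(weighted_score(ratings))                   # equal weighting: 67
print(weighted_score(ratings, {"SAB rate": 3}))  # SAB weighted 3x: 50 - poor SAB performance now dominates
```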

Independent review of the HSPR and PMR

Performance scoring

Page 77: PwC Review

5.4

Stakeholder feedback

Performance scoring

Page 78: PwC Review

PwC

Perception profiling – performance scoring The scoring profiles and key issues raised within the workshops are shown below.

Page 78

April 2015

• Two questions relating to performance scoring were included within the workshops, as part of the wider PME questions.

• As demonstrated within the PI scoring, the views of the Health Services were more negative than DoH, with an average score of 1.4 compared to 3.4.

• The widest divergence of opinion amongst all the questions asked of stakeholders across the 4 focus areas was in relation to the overall score being an appropriate measurement tool. Whilst most within DoH somewhat agreed it was a useful indicator, the Health Services strongly disagreed with its inclusion.

• Further drill down during discussions indicated concern with it being used as comparator tool, with many suggesting comparing Health Services was inappropriate due to the differences between them.

• Additionally, stakeholders were unsure how the score is calculated, with many assuming it was complex in methodology.

• Health Services also raised the issue of an inability to replicate the scoring due to limited availability of data.

[Chart: average workshop scores by statement (scale 1.00 to 5.00), DoH average score vs Health Services average score]

Statements assessed:
• The roles and responsibilities for staff involved in the Health Service PME process are clearly defined
• The process by which Health Services performance is monitored is clear
• The performance score is calculated in a transparent and understandable manner
• Having an overall performance score is an appropriate measure of Health Service performance

It is noted that these views are indicative and only represent the views of the attendees. For full scoring profiles please refer to Appendix D

Independent review of the HSPR and PMR

Performance scoring

Page 79: PwC Review

PwC

Scoring - key issues identified by stakeholders The key issues raised at the workshops and the individual stakeholder interviews are shown below.

Area: Performance scoring

Key issue: The methodology to calculate the performance scoring is complicated
• There is limited understanding of how the performance scoring is calculated
• There is limited understanding of how the performance rating is calculated
• There is limited confidence in replicating the scoring

Key issue: Mixed views on the appropriateness of an overall performance score
• The majority of feedback received indicated that having a performance score is inappropriate, as it does not provide a sufficient view of Health Service performance
• Different views on whether performance scores are an appropriate basis for intervention

Page 79

April 2015 Independent review of the HSPR and PMR

Performance scoring

Page 80: PwC Review

5.5

Recommendations

Performance scoring

Page 81: PwC Review

PwC

Issue(s) addressed:

• There are two main rationales for having an overall Health Services performance score in place – the first is to attempt to provide an overall view of performance, and the second is to use the scores as thresholds for where the DoH should undertake an intervention.

• However, having an overall performance score can hide indicators / facilities which are having specific issues, as the focus is drawn to the overall score trend.

• The performance scoring was recently removed from the HSPR - with this being the more widely used report, the value of the scoring is consequently undermined.

Key recommended steps:

1. Remove the performance score as an overall indicator.

2. Implement an alternative approach to assessing and intervening in Health Services’ performance (see recommendation 2 overleaf).

Additional observations:

• This recommendation is supported through findings of the Mid Staffordshire NHS Foundation Trust Public Inquiry where a key failing noted was around the ability to ‘hide’ performance issues through the overall rating of a balanced scorecard: “The evidence suggests that officials from the Shropshire and Staffordshire Strategic Health Authority were unconcerned at these developments, and thought that the loss of stars was mainly due to poor record keeping as they chose to rely on results of the balanced scorecard.”

Performance scoring: Recommendation 1 Remove the performance score as an overall measure of Health Service performance.

Page 81

April 2015

This recommendation is further supported by QLD and NSW, which also choose to use a performance rating rather than a score as the indication of Health Service / facility performance. Each individual PI is monitored and used to inform intervention mechanisms.

Should another approach to assessing Health Service performance be adopted whereby weighting the PIs is deemed appropriate, please see the additional information in Appendix G

Independent review of the HSPR and PMR

Performance scoring

Page 82: PwC Review

PwC

Performance scoring: Recommendation 2 Agree an alternative approach to assessing and intervening in Health Service performance.

Page 82

April 2015

Based on the key issues identified in Recommendation 1, two alternative scoring approaches are suggested.

Option A: an overall domain performance score

Key recommended steps:

1. Define the domains/ strategies that require performance management and relevant aligned PIs to be measured (see section 3 recommendation 1).

2. Develop an overall domain performance score methodology by applying the current overall performance score methodology to individual domains (see overleaf for further details).

3. Develop intervention trigger-points for each domain score, in consultation with Health Services.

Option A recommendation benefits:

• A summary of performance can be provided at-a-glance.

• It allows domains to be treated individually.

• It allows a tailored overall approach to action performance issues.

Option B: performance ratings to be used

Key recommended steps:

1. Define the suite of PIs to be measured (see section 3 recommendation 1).

2. Update current performance ratings (see recommendation 3).

3. Utilise individual PI performance ratings as intervention triggers (see overleaf for further details).

Option B recommendation benefits:

• Specific PIs can be addressed and tailored action plans created, with the risk of hiding underperformance or manipulating overall results reduced.

Additional observations: advantages and disadvantages of composite scoring

• The WHO report identifies advantages and disadvantages to composite scoring that should be considered when deciding on a preferred option.

Advantages:

• Offer a broad assessment of system performance

• Place system performance at the centre of the policy arena

• Enable judgement and cross-national comparison of health system efficiency

• Offer policy-makers at all levels the freedom to concentrate on areas where improvements are most readily secured, in contrast to piecemeal PIs

• Clearly indicate which systems represent the best overall performance and improvement efforts

• Can stimulate better data collection and analytical efforts across health systems and nations

Disadvantages:

• May disguise failings in specific parts of the health system

• Make it difficult to determine where poor performance is occurring, making policy and planning more difficult and less effective

• Often can lead to double counting, because of high positive correlation

• May use feeble data when seeking to cover many areas, which may make the methodological soundness of the entire indicator questionable

• May make individual measures used contentious and hidden, due to aggregation of the data

• May ignore aspects of performance that are difficult to measure, leading to adverse behavioural effects

Independent review of the HSPR and PMR

Performance scoring

Should another approach to assessing Health Service be adopted whereby weighting the PIs is deemed appropriate, please see additional information in Appendix G

Page 83: PwC Review

PwC

Performance scoring: Recommendation 2 option A Agree an alternative approach to assessing Health Service performance: an overall domain performance score methodology with interventions triggered by domain.

Page 83

April 2015

Option A: an overall domain performance score methodology with interventions triggered by domain performance.

• This is currently the practice in Victoria. The Victorian PMF explains that the performance rankings of each PI within a domain are summed to give the total Performance Assessment Score (PAS). This aggregate score reflects overall performance within that domain.

• Following this, differing levels of intervention are applied based on the range within which the PAS falls. These include, for example, different frequencies of performance meetings between DoH and Health Services and differing requirements for mitigation strategies.

• The advantage of establishing domain aggregate scores is that they can be used to flag areas for deeper analysis, before specific PIs are investigated. Domains also generally align to role responsibilities within the Health Services (e.g. Finance under the responsibility of the Finance Director), so accountability for rectifying performance can easily be established.

Additional observations – what Victoria does well

1. First, each PI is scored based on performance measured against targets and thresholds.

2. The total score for each domain is summed to determine the PAS.

3. Finally, the intensity of monitoring is determined based on the range within which the PAS falls (a minimal sketch of these three steps is shown below).
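To make these three steps concrete, the Python sketch below walks through the mechanics under assumed values; the PI names, rank values and PAS trigger ranges are illustrative assumptions, not Victoria's actual figures.

```python
# Minimal sketch (hypothetical ranks and trigger ranges) of the Victorian-style
# approach: each PI in a domain receives a rank, the ranks are summed into a
# domain Performance Assessment Score (PAS), and the PAS range determines the
# intensity of monitoring.

# Step 1: rank each PI against its target/threshold (illustrative values only,
# where 1 = meets target and higher numbers = further below target).
finance_domain_ranks = {
    "PI-01 operating result": 3,
    "PI-02 cost per unit": 2,
    "PI-03 creditor days": 1,
}

# Step 2: sum the ranks to obtain the domain PAS.
pas = sum(finance_domain_ranks.values())

# Step 3: map the PAS to a monitoring intensity (illustrative trigger points).
if pas <= 4:
    monitoring = "Standard quarterly performance meeting"
elif pas <= 7:
    monitoring = "Monthly performance meeting with mitigation strategy"
else:
    monitoring = "Intensive monitoring and formal intervention"

print(pas, monitoring)  # 6 -> Monthly performance meeting with mitigation strategy
```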


Independent review of the HSPR and PMR

Performance scoring

Should another approach to assessing Health Service performance be adopted whereby weighting the PIs is deemed appropriate, please see the additional information in Appendix G

Page 84: PwC Review

PwC

Performance scoring: Recommendation 2 option B Agree an alternative approach to assessing Health Service performance: No performance score, with interventions triggered by specific indicator performance.

Page 84

April 2015

Option B: no performance score, with interventions triggered by specific indicator performance.

• This option involves monitoring and instigating interventions based on individual PI performance ratings.

• This is currently the practice in Queensland Health and NSW Health. In both of these jurisdictions, the individual performance rating is the trigger for intervention.

• For example, the NSW PMF shows the criteria for assessing performance based on the specific tier of the PIs, as well as strategic priorities and recovery plans (as shown).

• The WA DoH currently applies a Performance Rating to each PI within the HSPR and PMR. If the performance score is removed, the individual performance ratings must be clearly reported and identified as triggers for intervention.

• The advantage of this approach is that specific performance issues are identified and focused on for improvement. There is much less risk that an underperforming PI can be balanced out by an overperforming PI elsewhere, a risk that still exists if the domain composite scoring system is used.

• This review recommends Option B based on the benefits and drawbacks in comparison to the other option.

Additional observations – what NSW does well

Individual performance ratings for each PI are currently already monitored and reported in the HSPR and PMR. Intra- and inter-Health Service comparisons could be based on the specific PI to maximise like-for-like comparability.
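As a rough illustration of how Option B could operate in practice, the Python sketch below maps each individual PI rating directly to an escalation action; the PI names, ratings and escalation rules are hypothetical, not those of QLD, NSW or WA.

```python
# Minimal sketch of Option B: no composite score; each PI's individual
# performance rating is the trigger for intervention.

# Illustrative monthly ratings for one Health Service (names are hypothetical).
pi_ratings = {
    "ED 4-hour rule": "Not Performing",
    "Elective surgery wait times": "Underperforming",
    "Operating result vs budget": "Performing",
}

# Illustrative escalation rules keyed off the individual rating.
escalation = {
    "Performing": "Routine monitoring",
    "Underperforming": "Recovery plan requested from Health Service",
    "Not Performing": "Formal performance meeting with DoH",
}

for pi, rating in pi_ratings.items():
    print(f"{pi}: {rating} -> {escalation[rating]}")
```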

Independent review of the HSPR and PMR

Performance scoring

Should another approach to assessing Health Service performance be adopted whereby weighting the PIs is deemed appropriate, please see the additional information in Appendix G

Page 85: PwC Review

PwC

Performance scoring: Recommendation 3 Reduce rating options to three, using traffic light ‘heat’ maps throughout the HSPR and PMR, with clear definitions provided.

Page 85

April 2015

Issue(s) addressed:

• When compared to the leading practice research, WA has more PI rating options (four) than is deemed optimal.

• Additionally, the HSPR uses a different two-tier rating system for the same PIs in separate sections of the report, which creates further confusion for users.

Key recommended steps:

1. Replace the current four- and two-tier rating systems with a consistent three-tier rating system (a minimal sketch of the threshold logic is shown after these steps) showing:

- Performing, Underperforming, and Not Performing

2. Clearly define what each rating means (see examples to the right).

3. Use traffic light ‘heat’ maps to highlight facilities / Health Services in the HSPR and PMR.
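A minimal Python sketch of the three-tier threshold logic is shown below; the indicator, target and threshold values are illustrative only, and the rule assumes a ‘higher is better’ measure.

```python
# Minimal sketch (hypothetical thresholds) of a consistent three-tier rating:
# a result is compared with its target and a single lower threshold.

def rate(actual: float, target: float, threshold: float) -> str:
    """Return a three-tier rating. Assumes a 'higher is better' indicator;
    the target and threshold values used below are illustrative only."""
    if actual >= target:
        return "Performing"
    if actual >= threshold:
        return "Underperforming"
    return "Not Performing"

# Example: a percentage-style indicator (hypothetical figures).
print(rate(actual=92.0, target=90.0, threshold=85.0))  # Performing
print(rate(actual=87.5, target=90.0, threshold=85.0))  # Underperforming
print(rate(actual=80.0, target=90.0, threshold=85.0))  # Not Performing
```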

Recommendation benefits:

• The advantage of this system is that it simplifies the performance rating methodology, making results more straightforward and reducing the risk of confusion or misinterpretation.

• This also simplifies threshold calculations, making performance results easier to understand and interpret.

• This recommendation was supported in the stakeholder engagement, with many commenting on their preference to adopt a simpler 3 option ‘traffic light’ approach.

Additional observations – what QLD & NSW do well

• Queensland Health and NSW Health currently use the traffic light system to rate performance.

• As shown, there is also a clear definition of what each performance rating means. The colour coding provides a quick indication of the performance rating for a particular PI.

QLD Health use a simple traffic light system to rate performance against each PI.

NSW Health also use a traffic light system to rate performance against each PI. The figure in the PMF (shown above) defines each performance rating and facilitates interpretation of the results.

Independent review of the HSPR and PMR

Performance scoring

Should another approach to assessing Health Service performance be adopted whereby weighting the PIs is deemed appropriate, please see the additional information in Appendix G

Page 86: PwC Review

PwC

Performance scoring: Recommendation 4 Remove the Health Services ranking pages from the HSPR.

Page 86

April 2015

Issue(s) addressed:

• The current Health Services ranking pages in the HSPR (see right) to a large extent duplicate what can be ascertained from the front page, which has the results of Health Services for each indicator side-by-side.

• An excessive focus on presenting comparisons between Health Services is not a fair like-for-like comparison – mainly due to the fact that the different Health Services have different service profiles (WACHS and CAHS in particular).

Key recommended steps:

1. Remove the Health Services ranking page.

2. Replace with information which analyses performance trends at each Health Service / facility.

Additional observations:

• There are motivational benefits from the ‘competition’ factor of rankings – however, this is only appropriate where there is sufficient comparability between the Health Services.

If the domain performance scores are chosen as the recommendation 2 preferred option (Option A), these should be presented on a Health Service by Health Service basis, split down by facility. This is similar to the Health Service scorecard summarised within the PMR detailed reports.

Independent review of the HSPR and PMR

Performance scoring

Should another approach to assessing Health Service performance be adopted whereby weighting the PIs is deemed appropriate, please see the additional information in Appendix G

Page 87: PwC Review

PwC

Performance scoring: Recommendation 5 Communicate the methodology on how performance ratings and scoring are calculated.

Page 87

April 2015

Issue(s) addressed:

• Many stakeholders fed back that the performance scoring methodology is complex.

• However, as the methodology is in essence clear and simple, the key issue is a lack of clear communication with the necessary stakeholders.

Key recommended steps:

1. Agree approach to assessing and intervening in Health Service performance (Recommendation 2).

2. Develop a ‘Performance Score Methodology’ document – similar to the one currently in existence – which outlines the new approach.

3. Circulate the new methodology to all users and key stakeholders. Also include a clear reference to this document in the HSPR.

Recommendation benefits:

• This approach will ensure the performance rating /scoring methodology is well communicated to stakeholders to mitigate further misunderstanding.

• Having clear and transparent performance rating methodologies will help foster buy-in from the Health Services, and upholds the principles of openness and transparency stated in the PMF.

The existing ‘Performance Score Methodology’ document is deemed a good example to use to communicate the changes.

Independent review of the HSPR and PMR

Performance scoring

Should another approach to assessing Health Service performance be adopted whereby weighting the PIs is deemed appropriate, please see the additional information in Appendix G

Page 88: PwC Review

PwC

Focus area: Reporting structure and content

April 2015 Independent review of HSPR and PMR

Page 88

Page 89: PwC Review

PwC

Section Content Pages

Section 6

Reporting structure and content 88

6.1 What works well 90

6.2 Summary of issues and recommendations 93

6.3 Desktop review analysis 95

Key issue 1: The HSPR does not give a clear indication of an organisation’s performance.

96 – 100

Key issue 2: The PMR does not give a clear indication of an organisation’s performance.

101 – 106

Key issue 3: Lack of alignment between the two reports. 107

Key issue 4: No risk section. 108

6.4 Stakeholder feedback 109

6.5 Recommendations 112

Structure of this section

Page 89

April 2015

Reporting structure and content

Independent review of the HSPR and PMR

Page 90: PwC Review

6.1

What works well

Reporting structure and content

Page 91: PwC Review

PwC

PMR structure and content - what works well Whilst the report focuses on key issues/improvements, the review has highlighted a number of aspects that work well in relation to PMR structure and content.


Page 91

April 2015

There were a number of positive aspects identified through the desktop review and stakeholder engagement, in relation to the PMR reporting structure and content, that are deemed to work well:

1. The publication of the detailed PMR reports is a time- and resource-intensive process; however, the production process appears to meet deadlines and provide sufficient review time.

2. The appendices attached to the PMR provide a quick point of reference for detailed technical information, which ensures that the report is interpreted correctly.

3. The visual layout and presentation of the majority of the report works well and aligns with best practice, particularly in regards to the following:

− The formality of the front page acts to highlight its importance as a document (in contrast to the HSPR).

− Use of standard WA Health templates increases the familiarity of the report for stakeholders.

− Inclusion of an introduction and a list of tables.

− Inclusion of a ‘purpose statement’.

− The use of trend analysis within the PMR scorecard.

− The level of trend analysis for each facility included within the detailed reports.

− Combining the detail within the commentary in one report.

The detail included within the PMR is comprehensive with a full profile given to each facility and not just Health Services. Additionally, the PMR focuses more on facility comparison than Health Service comparison, which is also deemed to work well.

Independent review of the HSPR and PMR

Reporting structure and content

Page 92: PwC Review

PwC

HSPR structure and content - what works well

Page 92

April 2015

There were a number of positive aspects identified through the desktop review and stakeholder engagement, in relation to HSPR reporting structure and content, that are deemed to work well:

1. The HSPR is a streamlined report which is easy to navigate.

2. The HSPR scorecard, in particular, is visually impactful and draws attention to poor / high performance (compared to target and/or other Health Services).

3. Senior stakeholders in DoH and WA Health Services clearly use and, in general, understand the report.

4. Facility level performance pages allow peer-to-peer comparisons which stakeholders found very useful.

5. The HSPR has been a widely successful tool in promoting discussions around performance, and whilst it can be improved upon, the level of reliance on it was evident within the stakeholder engagements.

6. Factual commentary is provided on the performance of each indicator.

Independent review of the HSPR and PMR

Whilst the report focuses on key issues/improvements, the review has highlighted a number of aspects that work well in relation to HSPR structure and content.

Reporting structure and content

Page 93: PwC Review

6.2

Reporting structure and content - summary of key issues and recommendations

Reporting structure and content

Page 94: PwC Review

PwC

Reporting - summary of issues and recommendations The review identified a number of issues in relation to the structure and content of the reports, along with suggested recommendations for improvement.

Page 94

April 2015


1. Merge the HSPR and PMR into one report (Combined report), with the HSPR acting as an overview (Dashboard section) and the PMR as the detailed backup (Detailed section). See page 113

2. Determine the structure and reporting frequency, including PIs to be included. See pages 114 – 116

− Option a: Full monthly ‘Combined Report’

− Option b: Dashboard section published monthly, Detailed section published quarterly

− Option c: As per Option b, however certain PIs are reported more frequently if underperforming. This review recommends Option C.

3. Improve signposting and visual layout of the HSPR scorecard (targets / thresholds / variance, clarity on time periods, trends, etc.). See page 117

4. Combine the commentary with the main HSPR and PMR reports. See page 118

5. The Health Services write the HSPR commentary, to explain the drivers for performance and potential performance improvement strategies. See page 118

6. Include data drill down facilities. See pages 119 – 120

7. Include section on risk and mitigation strategies in both reports. See page 121

Recommendations

1. The HSPR does not give a clear indication of an organisation’s performance, because of (See pages 96 – 100):

• Confusing composite scoring of PIs

• Lack of trend analysis

• Lack of target, actual, variance

• Some confusing labelling and signposting

• Factual (rather than explanatory) commentary

• Commentary is produced separately.

2. The PMR does not give a clear indication of an organisation’s performance, because of (See pages 101 – 106):

• Scorecard structure complexity (high volume of information on each page)

• Some confusing labelling and signposting.

3. There is a lack of alignment between the two reports. See page 107

4. There is no risk section in either report. See page 108

Reporting structure and content key issues

For further information

refer to section 6.5

For further information

refer to section 6.3

Reporting structure and content

Independent review of the HSPR and PMR

Page 95: PwC Review

6.3

Desktop review analysis

Reporting structure and content

Page 96: PwC Review

PwC

Key issue 1: HSPR overview observations The key observations on the structure and content of the HSPR are set out below. Further observations are made in later sections.

• The report is heavily weighted towards comparing the service providers; however, no information is provided regarding facilities that have improved or deteriorated in performance and the reasons behind the change. It is this information that will drive improvements in the performance of health providers.

• With the focus on Health Service comparison, the score for the total Health Service is based on an average of all its facilities. As such, significant areas of underperformance at one facility could be hidden within an otherwise highly performing Health Service. This could potentially undermine the validity of the performance score.

• No introduction is provided detailing the purpose or intent of the report to help mitigate reader confusion.

• The commentary report is heavily descriptive with limited trend information discussed.

• Limited trend data is shown, with no ‘actual, target, variance’ to inform the extent of the performance issue.

• The HSPR utilises two performance ratings: the standard (Not Performing – Highly Performing) and a Health Services comparison (Below Target – At or above Target). This adds another level of complication, potentially leading to a misunderstanding of the ratings and heat map shown.

• The review found the report’s signposting has some limitations, with its visual impact undermined when compared with other examples.

Further analysis of specific sections of the HSPR is shown overleaf.

Page 96

April 2015 Independent review of the HSPR and PMR

Reporting structure and content

Page 97: PwC Review

PwC

HSPR ‘scorecard’ Health Services ranking page

Key issue 1: HSPR scorecard The HSPR ‘scorecard’ is well liked by stakeholders and visually impactful; however, it lacks some key information.

Page 97

April 2015

There is a lack of clear description around which time period (i.e. YTD, in month) the information relates to

No contents / introduction is provided which navigates the user through the report

Lack of trend analysis which highlights whether performance is improving / deteriorating

Targets are clearly shown, however threshold levels are not

Limited value in having the Health Services ranking page, as a comparison between Health Services can be obtained from the scorecard page

No commentary is provided with these pages to explain the performance

Independent review of the HSPR and PMR

Reporting structure and content

Page 98: PwC Review

PwC

Key issue 1: HSPR commentary The HSPR commentary is largely fact based, and therefore does not sufficiently explain the drivers of performance results and trends.

Page 98

April 2015

The HSPR shows PI targets and comments on the performance rating, but there is no indication of thresholds for each rating.

The commentary provides details on the achievement against targets, but does not include any information on the key drivers of current performance.

The Better Practice Report highlights the necessity for all metrics in performance reports to be sufficiently explained, to avoid misinterpretation and allow the end-user to put the results into context. While the HSPR lists the targets for various PIs, there is a lack of sufficient explanation in the Commentary Report regarding the target source and the performance thresholds related to performance ratings. This makes it difficult for facilities to set targets for progressive improvement, and impairs the validity of the targets as evidence-based benchmarks.

The actual results achieved by each Health Service are sometimes not shown, particularly if there was no change from last month. This makes it difficult to get a view of a Health Service’s performance.

Independent review of the HSPR and PMR

Reporting structure and content

Page 99: PwC Review

PwC

Key issue 1: HSPR commentary (con’t) Having the HSPR report and the commentary separate makes it difficult to easily assess performance.

• Other jurisdictions and international leading practice examples all combine their commentary and results (graphs, scorecards etc.) into one report. This improves cohesiveness and allows individual indicators to be explored in greater detail.

• A number of stakeholders who use the reports to inform their understanding of Health Service performance may only be interested in specific indicators. By combining the information around that indicator or domain, usability is improved, potentially increasing the success of the report as a driver for improvement.

• Additionally, with no table numbers used within the HSPR, the description pertaining to the graphs / tables within the HSPR Commentary report is at times confusing and hard to follow.

• In contrast, the DoH Annual Report follows good practice in combining commentary and the results clearly.

Page 99

April 2015

The minus sign is difficult to read in the commentary, although it is visually clear in the report. This highlights the risk of misinterpretation when each report is read in isolation.

Independent review of the HSPR and PMR

Reporting structure and content

Page 100: PwC Review

PwC

Key issue 1: Lack of target, actual, variance With no ‘target, actual, variance’ shown within the HSPR, the extent of underperformance / overperformance is not known; as such, key issues could be missed.

Page 100

April 2015

• With no thresholds shown within the HSPR, or ‘target, actual, variance’ explicitly referenced, the actual performance of a facility / Health Service is unclear.

• As such, serious performance issues may be masked through the use of a ‘simple’ rating.

• Additionally, limited information is given within the commentary to suggest level of under / over achievement.

• Furthermore, there should be a focus on explaining the root cause of the rating in terms of individual facilities. With the focus given to Health Service comparison, severely underperforming facilities can be hidden by overperforming ones, based on the inherent ‘averaging’ methodology of the scoring. As such, there should be a mechanism to explicitly call out the worst-performing facilities and the reasons.

Independent review of the HSPR and PMR

Reporting structure and content

Page 101: PwC Review

PwC

Key issue 2: PMR overview observations The key observations on the structure and content of the PMR are set out below. Further observations are made in later sections.

• Both PMR reports (the full PMR and the monthly summary report) provide significant amounts of detail, but do not sufficiently highlight key areas of underperformance.

• There is limited commentary on the key drivers or the reasons behind current performance.

• PIs within the scorecard are referred to only by their code, as such, the reader is required to turn to another page for the PI definitions.

• Whilst detailed, the scorecard is suggested to be overly complicated (36 potential key options), with its detailed presentation undermining its visual impact.

• The focus of the scorecard is on performance ratings as opposed to improvements or deteriorations. Commentary would be useful on how performance was improved and what methods were adopted.

• The commentary also does not include any risks associated with the non-achievement of these targets, or any actions that need to be or have been taken.

• The performance scoring provides a ranking of Health Services, however with differences in the number of PIs in each facility the comparison is not always like-for-like.

• Whilst the PMR assigns PIs to specific domains, no real indication of performance against these is shown by facility or Health Service, as such, their value is undermined.

Further analysis of specific sections of the PMR is shown overleaf.

Page 101

April 2015 Independent review of the HSPR and PMR

Reporting structure and content

Page 102: PwC Review

PwC

Key issue 2: PMR scorecards The key observations on the structure and content of the scorecard are made below.

Overall performance not shown in initial table

Different date ranges shown potentially leading to duplication of information previously reported

Exclusions / additional comments are shown per KPI as opposed to per hospital, thus making it difficult to read commentary associated with underperforming or deteriorating hospitals

The titles of the PIs are not written out in full

No definition of what the arrows mean

Service providers not shown in any chronological order based on ranking

Hard to follow visually (the small, predominantly grey presentation is overwhelming)

A high number of PIs for specific facilities are deemed to be “out of scope”

Small text increases difficulty to read and understand information presented

No commentary or further analysis shown where a hospital is deemed to have a worsening performance

Page 102

April 2015 Independent review of the HSPR and PMR

Reporting structure and content

Page 103: PwC Review

PwC

Key issue 2: PMR detailed reports Key messages can be lost in the detailed PMR reports.

Difficult to read the graphs shown to the right due to size of text without zooming in.

Too much information on one page means the key messages are lost.

Detailed report provided for all KPIs, which, whilst being comprehensive, may mean the important messages around underperforming service providers are lost in the detail.

Small text increases difficulty in reading.

Page 103

April 2015 Independent review of the HSPR and PMR

Reporting structure and content

Page 104: PwC Review

PwC

Key issue 2: PMR performance scoring Key observations from the desktop review of the layout and formatting of the performance scoring table are depicted below.

The performance scoring table shows the total “actual” scores for each Health Service, and the individual facility (i.e. total of all points from all indicators). However, as the total number of PIs measured is different for each facility, the actual scores themselves are not comparable. A percentage calculated from total possible score (for that facility) would be more appropriate.
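As a simple illustration of this suggestion, the Python sketch below normalises each facility’s raw score by the maximum score available to it; the facility names, point totals and scoring scale are hypothetical.

```python
# Minimal sketch (hypothetical scores) of the normalisation suggested above:
# express each facility's score as a percentage of the maximum score possible
# for the indicators actually in scope for that facility.

# (facility, points achieved, number of in-scope PIs, max points per PI) - illustrative.
facilities = [
    ("Facility A", 42, 15, 4),
    ("Facility B", 30, 9, 4),   # fewer in-scope PIs, hence a lower raw total
]

for name, points, in_scope_pis, max_per_pi in facilities:
    max_possible = in_scope_pis * max_per_pi
    pct = 100 * points / max_possible
    print(f"{name}: {points}/{max_possible} points = {pct:.1f}% of possible score")
# Facility A: 42/60 = 70.0%; Facility B: 30/36 = 83.3% - now directly comparable.
```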

The note states “Rankings should be interpreted with care since each facility’s performance score may be calculated from a different number of indicators. Not all indicators are in scope for each facility.” This is fundamental to the comparison and, as such, this may be an unfair method of presenting facility performance.

It is unclear whether the facilities / service providers are ranked for December alone (in this example) or in total across the year.

Performance movements from each month are hard to decipher from the table.

Page 104

April 2015 Independent review of the HSPR and PMR

Reporting structure and content

Page 105: PwC Review

PwC

Key issue 2: PMR performance rating key Whilst the scorecard is comprehensive in the level of detail shown, its presentation is complex with a three tier key.

The colour key is extensive and sits outside best practice guidelines for ‘heat maps’, which run from green (good) to red (bad). Including green as ‘numbers withheld’ is potentially confusing for a new reader and may lead to biased reading. Additionally, the greys are hard to differentiate.

WA performance rating key:

• Highly performing
• Performing
• Underperforming
• Not performing
• Out of scope
• Not calculable
• Numbers withheld
• Not available

WA heat map movement key:

• 3 – increase of three ratings since previous period
• 2 – increase of two ratings since previous period
• 1 – increase of one rating since previous period
• 0 – rating has not changed since previous period
• -1 – decrease of one rating since previous period
• -2 – decrease of two ratings since previous period
• -3 – decrease of three ratings since previous period
• Blank – movement from previous rating not calculable

The focus is on the performance rating as opposed to improvements or deteriorations. This is evident below, where one facility has improved since last month but is still classed as underperforming. Commentary would be useful on how performance was improved and what methods were adopted.

The key is biased in showing zero as unmoved from the previous report: zero suggests a negative outcome, which may not be the case here.

Complicated key (36 potential outcomes).
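For reference, the movement figures in the key appear to be derived as the difference between the ordinal positions of the current and previous ratings; the Python sketch below illustrates this interpretation, which is an assumption based on the key’s wording.

```python
# Minimal sketch of how the movement values in the key above appear to be
# derived: the ordinal position of the current rating minus that of the
# previous rating, with non-scoring categories treated as not calculable.

RATING_ORDER = ["Not performing", "Underperforming", "Performing", "Highly performing"]

def movement(previous, current):
    """Return -3..3, or None (shown blank) if either rating is not scoreable."""
    if previous not in RATING_ORDER or current not in RATING_ORDER:
        return None
    return RATING_ORDER.index(current) - RATING_ORDER.index(previous)

print(movement("Not performing", "Underperforming"))  # 1  (improved by one rating)
print(movement("Performing", "Performing"))           # 0  (unchanged)
print(movement("Highly performing", "Out of scope"))  # None (blank - not calculable)
```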

Page 105

April 2015 Independent review of the HSPR and PMR

Reporting structure and content

Page 106: PwC Review

PwC

Key issue 2: PMR indication of domain performance Whilst the PMR assigns PIs to specific domains, no real indication of performance against these is shown by facility or Health Service.

Page 106

April 2015

UK Health Monitor framework example

• Whilst the PMR assigns PIs to specific domains, and groups them in presentation, no overall domain performance is summarised within the report.

• One leading practice example is the UK, where each of their ‘domains’ are summarised with commentary on the key messages.

• Additionally, as is shown in the example, performance drivers are listed to ensure key stakeholders are aware of priority areas moving forward.

Independent review of the HSPR and PMR

Reporting structure and content

Page 107: PwC Review

PwC

Key issue 3: Alignment between HSPR & PMR There appears to be little alignment between the reports, with both treated as stand-alone documents, despite their similar overarching objectives.

Page 107

April 2015

The HSPR and PMR should have the same objective: to drive performance improvements within WA Health. The HSPR was drafted to summarise key PIs from the PMR. As such, there should be alignment between the two reports, with a consistent message and method of presentation used. However, this appears not to be the case, as summarised below:

1. The use of different visual aids and methods of presenting data acts to separate the documents.

2. There is a lack of synergy between the HSPR and PMR PIs and targets. This has led to confusion over which report is the more accurate measure, a result that is inconsistent with the ‘single source of truth’ data principle.

3. The lack of structural alignment between the reports is further highlighted by the inconsistent use of performance scoring, trend highlighting, and commentary. In particular, the PMR provides a performance scorecard and a high-level internal trend rating to demonstrate movement; these tools are not used in the HSPR. In comparison, the HSPR contains a separate document to provide commentary and explanation for the monthly results – a tool not employed by the PMR.

4. Both the HSPR and PMR provide different information and serve different uses. The PMR summarises each Health Service with a scorecard and allows the user to drill down into the results of each PI for each facility within that category. In comparison, the HSPR has limited drill-down and instead focuses on showing Health Service performance in a comparable manner.

5. The audience to which the reports are distributed, as well as the method by which they are distributed, differ.

PMR scorecard and detailed report

HSPR report

Independent review of the HSPR and PMR

Reporting structure and content

Page 108: PwC Review

PwC

Key issue 4: No risk section within either report No risk section is included within either the PMR or the HSPR, in contrast to leading practice jurisdictions.

Page 108

April 2015

Example risk sections from other jurisdictions’ reports • There is currently no risk section included within the PMR or HSPR. Most leading practice performance reports – particularly in the UK – include a risk section (with mitigation strategies / actions) as part of performance reports.

• WA Health has a clear risk framework which is actively managed by Health Services. However, stakeholder feedback has been that there is little discussion of risk (particularly clinical risk) between Health Services and DoH as part of performance discussions.

• A lack of risk management focus on the part of the organisation and external parties was identified as a key cause of the failures at Mid Staffordshire NHS Trust.

Findings / quotes from the Mid Staffordshire Public Inquiry

“Identifying systems and processes and meeting targets were the main measures of performance. Outcomes-based performance and risk-based, intelligence-informed regulation were still developing concepts.” “Routine and risk-related monitoring, as opposed to acceptance of self-declarations of compliance, is essential.”

Independent review of the HSPR and PMR

Reporting structure and content

Page 109: PwC Review

6.4

Stakeholder feedback

Reporting structure and content

Page 110: PwC Review

PwC

Reporting: key issues identified by stakeholders The key issues raised at the workshops and the individual stakeholder interviews in relation to the reporting structure and content are shown below. Area Key issue Commentary

Reporting layout and content

Some issues with presentation of data in the HSPR

• No trend analysis is shown within the HSPR

• Lack of “target, actual, variance” display

• There is too much focus on Health Service comparison in the HSPR

Some lack of signposting in the PMR

• The presentation of the scorecard in the PMR is visually complex

• Many of the charts / data lack descriptions and explanations

Questionable whether it is of sufficient value to produce both reports

• Mixed views on the value of producing two reports

• Lack of alignment in how the reports are produced (with one not being linked to the other) means that they are reviewed in isolation (with senior executives currently not using the PMR)

Reports produced duplicate other reports

• The HSPR and PMR duplicate other reports (e.g. Whole of Health Dashboard)

Report structures do not align to Health Service Governance structures

• Examples include Area Mental Health Services and Hospital Groups (e.g. SCGOPHCG)

Commentary for the reports is separate and fact-focused

• Current commentary is very fact based, and does not explain root causes behind the performance

• Having the commentary published in a separate report to the main report makes it difficult to review in conjunction with the data

• Commentary is produced by the DoH, rather than by those who are responsible for the performance

Page 110

April 2015 Independent review of the HSPR and PMR

Reporting structure and content

Page 111: PwC Review

PwC

[Chart: workshop perception scores (scale 1.00 – 5.00) for five statements – "The PMR report contains sufficient content and detail to present a full and balanced picture of performance"; "The structure and layout of the PMR report is clear and easy to understand"; "The HSPR report contains sufficient content and detail to present a full and balanced picture of performance"; "The structure, layout and commentary of the HSPR report is clear and easy to understand"; "There is sufficient value in producing both reports in their current format" – comparing the DoH average score with the Health Services average score.]

Perception profiling – reporting content & layout The scoring profiles assessed within the workshops pertaining to the reporting structure are shown below.

Page 111

April 2015

• Similar to the other focus areas, the indicative perception of the Health Services in relation to the reporting content and structure, based on the scoring provided within the workshop, was more negative than that of DoH. This is reflected in an average score of 2.28 compared to 3.34.

• However, in relation to the structure of the HSPR, the Health Services were more positive. This was apparent in the scoring, with questions relating specifically to the HSPR receiving a higher score than PMR related questions. This is likely due to the HSPR being the report of choice for many.

• The largest difference in opinion within this section related to the structure of the PMR, with an average score of 4 amongst DoH compared to 2 within the Health Services. This was supported by comments criticising the PMR’s layout and the complexity of navigating the report.

• Health Services confirmed that the PMR is required for drill down purposes, with a lot of the backup information not currently included within the HSPR.


It is noted that these views are indicative and only represent the views of the attendees. For full scoring profiles please refer to Appendix D

Independent review of the HSPR and PMR

Reporting structure and content

Page 112: PwC Review

6.5

Recommendations

Reporting structure and content

Page 113: PwC Review

PwC

Report structure & content: Recommendation 1 Merge the HSPR and PMR into one ‘combined report’, with the HSPR acting as an overview ‘Dashboard’.

Page 113

April 2015

Issue(s) addressed:

• There is currently a lack of alignment between the two reports.

• Since the introduction of the HSPR, the PMR is used less as a performance management tool (this is demonstrated by the sharp reduction in the number of PMR website hits, and backed up by stakeholder feedback).

• Whilst the HSPR is more frequently used, it does not provide sufficient information to adequately assess all key areas of performance.

Key recommended steps:

1. Combine the HSPR and PMR into a ‘Combined Report’ (with a suitable name developed).

2. This will consist of a ‘Dashboard’ section (similar to a redesigned version of the HSPR) and a ‘Detailed’ section (similar to a redesigned version of the PMR).

3. Later recommendations in this section, and Appendix G, outline what should be included in these 2 new sections.

Recommendation benefits:

• Only one report will be required to be drafted.

• There will be a single source of truth for all stakeholders.

• A more effective reporting layout and structure can be developed to better suit the needs of key stakeholders to inform on performance issues and action required.

The current HSPR should be refined and become the ‘Dashboard’ section of the ‘Combined Report’.

The current PMR should be refined and become the ‘Detailed’ section of the ‘Combined Report’.

The Combined Report will naturally have a structural link between the two sections.

Independent review of the HSPR and PMR

Reporting structure and content

For suggested specific layout recommendations for the ‘combined report’, please see Appendix H

Page 114: PwC Review

PwC

Report structure & content: Recommendation 2

Page 114

April 2015

Based on the previous recommendation to combine the report, the following options are proposed:

Option A: Full ‘Combined Report’ published on a monthly basis

Features

• This option involves publishing the Dashboard section (an expanded and developed version of the HSPR), along with the full Detailed Report (a developed version of the PMR) together to form the Combined Report.

• The Dashboard section consists of performance information for the expanded suite of HSPR PIs that are sourced monthly.

• The Detailed section provides performance information against the full suite of PIs.

Benefits

• There is a clear link between the Dashboard section and the Detailed section.

• The full Combined Report with all information is wholly accessible every reporting period.

Drawbacks

• Data that is sourced less frequently than monthly will not change, therefore parts of the report will be outdated.

• Resource intensive to produce the full report every month.

Option B: Dashboard section published monthly, Detailed section published quarterly

Features

• The Dashboard section is published each month.

• The Detailed section is published quarterly.

Benefits

• This is very similar to the approach currently in place, and requires only an expansion/development of the reports.

• It is not as time or labour intensive to implement as processes are already in place.

Drawbacks

• Significant sections of the report will not be presented every month, and therefore some important performance results may be excluded from a proportion of the performance conversations.

Independent review of the HSPR and PMR

Determine the structure and reporting frequency for each section of the report, including the KPIs to be included in each section.

Reporting structure and content

For suggested specific layout recommendations for the ‘combined report’, please see Appendix H

Page 115: PwC Review

PwC

Report structure & content: Recommendation 2 (con’t) Determine the structure and reporting frequency for each section of the report, including the KPIs to be included in each section.

Page 115

April 2015

Option C: As per Option B, however certain PIs are reported more frequently if they fall below a certain threshold (which also “highlights” underperformance).

Features

• The only difference to Option B is that PIs from the Detailed Report which fall below a certain critical performance threshold will be included and highlighted in the Monthly Summary Report until performance is improved.

Benefits

• Monthly Reports do not lose focus on underperforming areas and keep it on the agenda at performance reviews.

• There is a clear link between the Summary Report and the Detailed Report.

Drawbacks

• It will be a more intensive process to implement this option than Option B, as it will require several changes to existing business rules and processes.

This review recommends Option C based on the benefits and drawbacks in comparison to other options.

Key recommended steps

1. Identify the suite of PIs to be included in the Summary Report.

2. Establish data sourcing frequencies to report against these PIs monthly.

3. Modify business processes to commence monthly data reporting and collection for the Summary Report.

4. In consultation with end users, design the Summary Report dashboard and layout of content.

5. In consultation with end users, design the Detailed Report dashboard and layout of content, including areas to flag consistent underperformance.

6. Establish business rules around Summary Report content for consistently underperforming areas (generally this should be a critical performance threshold).

For an example of how this could look, refer to Appendix H.
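To illustrate the Option C business rule in step 6, the Python sketch below flags Detailed-section PIs for inclusion in the monthly Dashboard while they breach a critical threshold; the PI names, results, thresholds and directions are hypothetical.

```python
# Minimal sketch (hypothetical data and threshold rule) of Option C: PIs from
# the quarterly Detailed section are pulled into the monthly Dashboard while
# they sit on the wrong side of a critical performance threshold.

# (PI name, latest result, critical threshold, direction) - all values illustrative.
detailed_section_pis = [
    ("Unplanned readmission rate (%)", 11.5, 10.0, "lower_is_better"),
    ("Hand hygiene compliance (%)", 82.0, 75.0, "higher_is_better"),
]

def breaches(result, threshold, direction):
    """Return True when the result breaches the critical threshold."""
    if direction == "lower_is_better":
        return result > threshold
    return result < threshold

escalated = [name for name, result, threshold, direction in detailed_section_pis
             if breaches(result, threshold, direction)]

print(escalated)  # ['Unplanned readmission rate (%)'] - escalated into the monthly section
```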

Independent review of the HSPR and PMR

Reporting structure and content

For suggested specific layout recommendations for the ‘combined report’, please see Appendix H

Page 116: PwC Review

PwC

Report structure & content: Recommendation 2 (con’t)

Page 116

April 2015

Additional observations – what NSW does well

• A good performance report should contain all the information necessary to inform a senior stakeholder of the current position of the facility and to initiate a chain of action where necessary. As such, timely information is critical.

• This is an approach taken by NSW who publish their full performance reports on a quarterly basis, with a dashboard released to inform of monthly PIs.

• In these cases, it is suggested that it may be necessary to rely on appropriateness measures or trends, using an indicative rating of performance on the basis of previous evidence.

Independent review of the HSPR and PMR

Determine the structure and reporting frequency for each section of the report, including the KPIs to be included in each section.

Reporting structure and content

For suggested specific layout recommendations for the ‘combined report’, please see Appendix H

Page 117: PwC Review

PwC

Report structure & content: Recommendation 3 Improve signposting and visual layout of HSPR dashboard (targets / thresholds / variance, clarity on time periods, trends, etc.).

Page 117

April 2015

Issue(s) addressed:

• Although the HSPR dashboard is visually impactful and clear to review in many ways, there are some issues with how it is presented:

• Lack of trend analysis

• Lack of target, actual, variance

• Some confusing labelling and signposting

• Lack of clear grouping into domains

Key recommended steps:

1. Add an introduction section which explains each page and navigates the user through the report.

2. Provide clear signposting for each indicator, explaining the precise timeframe the results cover.

3. Include target, actual and variance information.

4. Group indicators into domains.

5. Include trend analysis (for full rolling previous year, unless specific reasons to use a different timeframe).

• This should be a simple “increasing or decreasing” trend arrow (more detail can be shown in subsequent pages).

6. Drill-down links provided:

• for domains to view further indicators.

• for specific indicators to view detailed trends and facility level information.

*Suggested Dashboard presentation (indicative Health Service ratings shown)

Independent review of the HSPR and PMR

Reporting structure and content

*For suggested specific layout recommendations for the ‘combined report’, please see Appendix H

Domain 1 (example layout)

Indicator 1 – Frequency: Monthly; Target: 15; Actual: 16; Variance: 1
Trend, 4 periods previous to current period (Nov-14, Dec-14, Jan-15, Feb-15, Mar-15): 9, 12, 15, 15, 16

Indicator 2 – Frequency: Quarterly; Target: 8; Actual: 9; Variance: 1
Trend, 4 periods previous to current period (Mar-14, Jun-14, Sep-14, Dec-14, Mar-15): 10, 8, 8, 9, 9

Health Service commentary: Performance results for this domain continue to exceed targets. This is demonstrated at the indicator level above. Performance continues to improve for Indicator 1, and has remained steady for Indicator 2. Performance results are reflective of improvements in X activity and the increased availability of data. X activity is expected to improve, which should continue to be reflected in performance results.

Department of Health commentary: (space provided in the suggested layout)

The example above demonstrates the content for Domain 1; this format would be replicated for Domain 2. DoH and Health Service commentary are at the domain level only.
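As an illustration of the simple trend arrow proposed in recommendation 3, the Python sketch below derives an arrow from the example period values above; the comparison rule (latest period versus the previous one) and the arrow symbols are assumptions, and the full rolling-year view suggested in the recommendation could equally be used.

```python
# Minimal sketch of the simple 'increasing or decreasing' trend arrow,
# using the illustrative Indicator 1 and 2 figures from the example layout.

def trend_arrow(history):
    """Compare the latest period with the one before it; symbols are illustrative."""
    if len(history) < 2 or history[-1] == history[-2]:
        return "→"  # steady / not calculable
    return "↑" if history[-1] > history[-2] else "↓"

indicator_1 = [9, 12, 15, 15, 16]   # Nov-14 to Mar-15 (from the example layout)
indicator_2 = [10, 8, 8, 9, 9]      # Mar-14 to Mar-15 (from the example layout)

print(trend_arrow(indicator_1))  # ↑  (improving, consistent with the commentary)
print(trend_arrow(indicator_2))  # →  (steady)
```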

Page 118: PwC Review

PwC

Reporting structure & content: Recommendation 4 & 5 Include data drill down facilities (from HSPR to PMR) and Health Services to write the HSPR commentary, highlighting reasons for performance and action plans.

Page 118

April 2015

Recommendation 4: Include data drill down facilities (particularly from HSPR to PMR).

Issue(s) addressed:

• The PMR currently includes a drill down facility for each indicator to sections which present further information, which stakeholders found useful. However, this is currently not available within the HSPR.

• Combining the HSPR and the PMR will require links between the two to ensure that they can be navigated easily.

Key recommended steps:

1. Provide the facility to drill down (electronically) from the HSPR to the PMR, including:

• for domains to view further indicators.

• for specific indicators to view detailed trends and facility level information.

2. Each indicator should also have drill-down into source data where feasible.

Recommendation 5: Health Services to write HSPR commentary, which explains the drivers and reasons for performance and potential performance improvement strategies.

Issue(s) addressed:

• The current content of the commentary does not explain the reasons behind performance levels and trends.

• This makes it difficult for a reader to assess the degree to which results represent a significant concern, or determine specific steps which can be taken to improve performance.

Key recommended steps:

1. Health Services – who should be best placed to understand the reasons behind their performance – should write commentary in the following sections of the HSPR:

a. Overview – commentary at the beginning of the report which states the main performance messages for the month.

b. Alongside each indicator – this should focus on:

- Explaining the reasons behind under/over/changing performance (including specific drivers, e.g. particular facilities / specialties, etc.).

- Identifying actions which are being taken or proposed to address any performance challenges.

2. DoH should continue to write commentary in each of the sections above – however, this should be concise and focus on trends and highlighting key performance issues.

Independent review of the HSPR and PMR

Reporting structure and content

For suggested specific layout recommendations for the ‘combined report’, please see Appendix H

Page 119: PwC Review

PwC

Report structure & content: Recommendation 6 Combine the commentary and main HSPR and PMR reports.

Page 119

April 2015

Issue(s) addressed:

• The fact that the commentary and the reports are separate documents (circulated at different times) means that they are difficult to review effectively in conjunction.

• Additionally, some stakeholder feedback was received that respective distribution channels for the two documents were unclear, with some stakeholders receiving the HSPR but not the commentary.

Key recommended steps:

1. The HSPR should be restructured to provide space for the DoH and Health Services to write commentary in the Overview and alongside each indicator (see recommendation 4 on page 118).

2. The blank commentary should be circulated to the Health Services as part of the newly developed Monthly Summary Report.

3. DoH and the Health Services complete their commentary separately and concurrently.

Additional observations:

• This is a widely accepted practice shown within other jurisdictions and the UK, with other examples showing the visual analysis (graphs) and then the detailed commentary as shown to the right.

• Having the system manager and the Health Services write commentary simultaneously should limit any additional time required.

NSW example of their combined commentary and analysis (see overleaf for more information)

UK Monitor reporting example of their combined commentary and analysis

Independent review of the HSPR and PMR

Reporting structure and content

For suggested specific layout recommendations for the ‘combined report’, please see Appendix H

Page 120: PwC Review

PwC

Report structure & content: Recommendation 6 (con’t) Combine the commentary and main HSPR and PMR reports.

Page 120

April 2015

Additional observations – what NSW does well

• NSW is shown to be the leading practice example in terms of structure and content.

• Each PI is provided space within the report, with the data shown graphically, and commentary provided that has the following successful attributes:

1. All graphs are recognised as a ‘figure’ with the commentary using this as a reference point. This is currently not used within the HSPR.

2. Trend data is discussed within the commentary.

3. Tools such as font size and colour are used to highlight the PI measure to clearly signpost the section. This provides a reference point and improves reader useability.

4. The commentary is less data descriptive and more aligned with reasons and justifications of the performance rating.

5. Additionally, the PI results at times align back to other PIs, citing the relationship between a number of them.

• Another leading practice tool of the NSW report shown to the right is the use of summary pages at the start of each chapter to summarise the key messages. This does not show all PIs, however selects the key ones to highlight findings from. This is a tool currently used within the WA Annual Report.

• This is a quick reference point for readers and a visually powerful tool.

Independent review of the HSPR and PMR

Reporting structure and content

For suggested specific layout recommendations for the ‘combined report’, please see Appendix H

Page 121: PwC Review

PwC

Reporting structure & content: Recommendation 7 Include analysis on risk and mitigation strategies in the report.

Page 121

April 2015

Issue(s) addressed:

• There is currently no risk section included within the PMR or HSPR.

• WA Health has a clear risk framework which is actively managed by Health Services. However, stakeholder feedback has been that there is little discussion of risk (particularly clinical risk) between Health Services and DoH as part of performance discussions.

• A lack of focus on risk management – both by the organisations and external organisations – was identified as a key cause of the failures at Mid Staffordshire NHS Trust.

Key recommended steps:

1. Confirm the suite of PIs (see recommendations in section 3).

2. Define roles and responsibilities within DoH and the Health Services (see recommendations in section 6).

3. Define intervention mechanism (see recommendations in section 6).

4. Confirm key risks to performance based on consistent under performing facilities.

5. Assign custodian of risks and action plans.

6. Include a risk section as an appendix to the performance report, that is confidentially circulated to key internal stakeholders.

Additional observations – what the UK does well

• Whilst this is not a practice utilised widely within Australia, UK examples generally display a risk section on a domain basis.

• Research shows that reporting on risks is a key step towards risk management. The Project Management Body of Knowledge standards recommend risk reporting to ensure risks are adequately identified and tracked, and to allow mitigation strategies to be developed to treat risks and minimise the likelihood of underperformance.

• Partially in reaction to the public inquiry into Mid Staffordshire, the inclusion of risk sections within performance reporting is becoming best practice within the UK. The example shown is summarised per domain (financial depicted here), with key performance risks detailed in the second column and relevant actions shown in the last column.

• Crucially, the risk owner is also identified to increase accountability and ownership.

Independent review of the HSPR and PMR

Reporting structure and content

For a suggested specific layout recommendations to the ‘combined report’ please see Appendix H

Page 122: PwC Review

“Performance reporting is a means to an end, never an end in itself. The purpose of information is to promote

action.”

Source: NSW Spotlight on Performance Management Framework

Page 123: PwC Review

PwC

Focus area: Performance monitoring and evaluation

April 2015 Independent review of HSPR and PMR

Page 123

Page 124: PwC Review

PwC

Section Content Pages

Section 7

Performance monitoring and evaluation 123

7.1 What works well 125

7.2 Summary of issues and recommendations 127

7.3 Desktop review analysis 129

Key issue 1: Lack of clarity and documented structure of PME meetings. 130

Key issue 2: Lack of clarity of roles and responsibilities between Health Services and DoH staff in the PME process.

131

Key issue 3: Distribution list and process of the HSPR report is unclear and undocumented.

132

Key issue 4: No documented mechanism for highlighting and addressing key areas of underperformance.

133 – 134

7.4 Stakeholder feedback 135

7.5 Recommendations 138

Structure of this section

Page 124

April 2015

Performance Monitoring and Evaluation (PME)

Independent review of the HSPR and PMR

Page 125: PwC Review

7.1

What works well

Performance monitoring and evaluation

Page 126: PwC Review

PwC

Whilst the report focuses on identifying the key issues and improvements, a number of aspects of the PME processes work well.


PME (process) - what works well

Page 126

April 2015

There were a number of positive comments received through the desktop review and stakeholder engagement, in relation to PME, that are deemed to work well:

1. Reports are circulated to Health Services within a reasonable timeframe, with good collaboration with Information Development and Management Branch.

2. Meetings are held at a senior level (DG and Health Service Chief Executives) where performance is actively discussed, and actions arising from these meetings are followed up.

3. Stakeholders within DoH acknowledge that good relationships have been formed with data custodians at each of the Health Services.

4. The current PME process has quality assurance checks (including the requirement for Data Quality Statements) that provide Health Services the opportunity to validate and comment on data.

5. The online portal for the PMR provides an interactive way to view the report and drill-down into specific areas and performance measures.

There are sufficient opportunities for data validation and commentary to ensure the accuracy of the data as an indicator of performance.

The standardised and automated process (particularly for the PMR) is an effective way to minimise errors and streamline report production.

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 127: PwC Review

7.2

Performance monitoring and evaluation - summary of key issues and recommendations

Performance monitoring and evaluation

Page 128: PwC Review

PwC

PME - summary of issues and recommendations The review identified a number of issues in relation to the PME, as summarised below, along with suggested recommendations for improvement.

Page 128

April 2015

1. Inconsistent and fragmented structure of PME meetings

2. No defined accountability

3. Distribution of the HSPR report is unclear and undocumented

4. No mechanism for highlighting consistent underperforming or sustained performance improvements

Performance monitoring and evaluation key issues

1. Implement a structured and consistent approach to discuss / address underperforming areas within PME meetings. See page 139

2. Clarify and document roles and responsibilities in the PME process. See pages 140 – 141

3. Clarify (and expand) stakeholder lists and distribution channels for HSPR and PMR. See page 142

4. Develop an intervention system tied to the domain composite score or specific indicator performance (depending on whether scoring option A or B is chosen). See pages 143 – 144

Recommendations

1. Lack of clarity and documented structure of PME meetings. See page 130

2. Lack of clarity of roles and responsibilities between Health Services and DoH staff in the PME process. See page 131

3. The distribution list and process of the HSPR report is unclear and undocumented. See page 132

4. No documented mechanism for highlighting and addressing key areas of underperformance. See pages 133 – 134

PME key issues

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 129: PwC Review

7.3

Desktop review analysis

Performance monitoring and evaluation

Page 130: PwC Review

PwC

Current PME Processes

• The current PME process commenced following the introduction of the HSPR with the new report tabled at the Budget Steering Committee and discussed at SHEF and the monthly Board meeting.

• Although there are many advantages with the new process (see page 126), there are some issues which should be addressed:

− Limited documentation on how the HSPR and Health Service performance overall is discussed at these meetings, such as set agendas, terms of reference, etc.

− Limited time allocated within the meetings to discuss performance in detail

− The discussions focus largely around Health Services level performance as opposed to facility level

• Most of these issues arise because the use of the HSPR is still in a transitional period, with precise processes and agendas for these meetings yet to be determined.

Key issue 1: PME processes There is limited documentation on how the HSPR and Health Service performance overall is discussed at current PME meetings

Page 130

April 2015

Previous PME Processes

• As discussed, the PME process was changed with the introduction of the HSPR monthly report. However, the previous PME process was well-structured and aligned with our leading practice research and good practice examples in other jurisdictions.

• Monthly performance review meetings were held between the Chief Executives of the Health Services, and DoH represented by senior management from:

− Safety & Quality (now Patient Safety and Clinical Quality)

− Performance (now part of Resourcing and Performance)

− Finance (now part of Resourcing and Performance)

• The meetings had a clear structure with a set agenda and time to discuss specific performance issues.

• Intervention levels were discussed and decided upon based on the Performance score, which set the actions for the Health Service based on performance. ‘Performance watch’ interventions also provided opportunity to monitor misrepresentation of performance or non-action, which protected the accuracy of results and reinforced accountability.

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 131: PwC Review

PwC

Key issue 2: Unclear roles and responsibilities The current PME process does not clearly identify roles and responsibilities in associated documentation, leading to a lack of clarity on accountabilities.

Page 131

April 2015

Roles and responsibilities are lacking in the PMF

Roles and responsibilities are not clearly defined

• The roles and responsibilities towards report production are clearly defined in the Automation Roles and Responsibilities manuals for the HSPR and PMR.

• However, respective DoH / Health Services’ roles and responsibilities from a performance monitoring and evaluation perspective are not clearly defined or sufficiently detailed within the PMF.

• This raises the risk of confusion and mixed expectations between the DoH and Health Services, which could lead to disputes and to certain key PME activities not being undertaken.

• A lack of clarity on where responsibilities and accountabilities lie also makes it difficult to undertake effective individual performance management.

State governance arrangements are detailed in the PMF, but there is insufficient detail about the specific roles and responsibilities of the DoH and the Health Services. As such, it is unclear what forums should hold performance conversations and who should be involved.

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 132: PwC Review

PwC

Key issue 3: Communicating performance results Currently the HSPR and PMR results are not sufficiently well communicated throughout the Health Services.

Page 132

April 2015

Use of HSPR and PMR in Health Services’ internal meetings

• Research into leading practice suggests that effective performance meetings must have a clear purpose and structure to align with the principles of performance management within the Australian Public Sector Commission’s report ‘Strengthening the Performance Framework’ (APSC Report).

• Stakeholder feedback from the Health Services themselves indicates that they only use the HSPR and PMR to a limited degree internally to monitor performance. Instead, only some PIs from the PMR are used, and other measures that the Health Services can collect are included as deemed relevant.

• The consequence is that there is some lack of alignment between performance discussions being held between the DoH and Health Services, and performance discussions being held internally within Health Services.

Distribution of HSPR and PMR within Health Services

• The HSPR and PMR are produced to report on Health Service performance based on a suite of PIs and a defined approach to performance rating and overall scoring.

• Leading practice recommends a clear purpose and clarity for performance management to be effective, which must be driven by relevant performance data. As such, the HSPR and PMR should inform discussions around performance management.

• However, there is a lack of awareness in Health Services about the availability and content of the HSPR and PMR. Dissemination methods include the online portal (PMR only), emails, and newsletters.

• The distribution of the HSPR is not clearly defined or enforced. Health Services vary in the degree to which the report is circulated within their organisations. As the report contains comparative information between Health Services, the degree to which it can be widely distributed should be clarified.

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 133: PwC Review

PwC

Key issue 4: Lack of clear intervention processes With the removal of the performance score from the HSPR, the intervention processes for Health Service performance are unclear.

Page 133

April 2015

DoH WA PMF 2013-14 intervention policy


Intervention

• Leading practice research discussed previously demonstrates the importance of accountability in the performance management framework as a driver of high performance, coupled with a clear process for intervention if required.

• Previously there was a clear process for DoH intervention which was based on the performance score.

• However, this score has been removed from the HSPR. There is therefore a lack of clear documentation on how interventions are triggered under the current PME processes. This raises the following risks:

− A significant deterioration in performance is not sufficiently addressed.

− Disputes between Health Services and DoH, as the criteria for intervention are not clearly defined and agreed.

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 134: PwC Review

PwC

Key issue 4: Performance issues Current intervention level documentation is based on the performance score; however, this score is not discussed as part of the PME process.

Page 134

April 2015

DoH WA PMF 2013-14 intervention levels DoH WA PMF 2013-14 intervention criteria

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 135: PwC Review

7.4

Stakeholder feedback

Performance monitoring and evaluation

Page 136: PwC Review

PwC

Perception profiling – PME The scoring profiles and key issues raised at the workshops pertaining to the performance monitoring and evaluation processes are shown below.

Page 136

April 2015

• Two questions relating to PME processes were included within the workshops.

• Both DoH and Health Services’ perceptions of the PME were poor in relation to the questions asked, with most respondents ‘somewhat disagreeing’ that roles and responsibilities are clearly defined and that the process by which Health Service performance is monitored is clear.

• As shown overleaf, this is reflected in the comments received, with many stakeholders unclear on the report review process.

• Additionally, stakeholders were unsure who receives the reports; a number were not even aware of the HSPR commentary report, or of how to access the PMR.

• Stakeholders within the Health Services workshop also suggested the PME meetings were unstructured, with a focus on Health Service comparisons and overall performance as opposed to individual facilities and the reasons behind the ratings.

[Chart: average workshop scores (scale of 1 to 5) from DoH and Health Services attendees against four statements: ‘The roles and responsibilities for staff involved in the Health Service PME process are clearly defined’; ‘The process by which Health Services performance is monitored is clear’; ‘The performance score is calculated in a transparent and understandable manner’; ‘Having an overall performance score is an appropriate measure of Health Service performance’. Series: DoH average score; Health Services average score.]

It is noted that these views are indicative and only represent the views of the attendees. For full scoring profiles please refer to Appendix D

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 137: PwC Review

PwC

PME - key issues identified by stakeholders The table below summarises the key issues raised at the workshops and in the individual stakeholder interviews in relation to PME. Columns: Area, Key issue, Commentary.

Performance monitoring and evaluation

Lack of documented and clear process for PME meetings

• The review process of the reports is unclear

• There is limited time allocated to discuss performance ratings

• Discussion is focused on the Health Service performance with limited discussion on individual facility performance

• Focus is on the performance rating, with the HSPR not providing any evidence behind the thresholds

Some views that the timeframe for review is insufficient

• Mixed views on whether the timeframe for review is sufficient – both within the DoH and Health Services

The HSPR distribution list is unclear

• It has not been clearly decided who should receive the HSPR, and how /where the results should be discussed

The PMR is not widely used as a performance monitoring tool

• A large number of stakeholders are currently not utilising the PMR

• Access to the report has significantly declined since the introduction of the HSPR

• The performance score is not widely used as an indication of facility performance due to limited access through the PMR

Page 137

April 2015 Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 138: PwC Review

7.5

Recommendations

Performance monitoring and evaluation

Page 139: PwC Review

PwC

PME: Recommendation 1 Implement a structured and consistent approach to discuss / address areas of underperformance within PME meetings.

Page 139

April 2015

Key issue(s):

• Prior to the introduction of the HSPR, there were structured performance review meetings between DoH and Health Services.

• Performance meetings have continued to be held since the introduction of the HSPR; however, they have no clear documented structure for how the HSPR should be discussed.

Key recommended steps:

1. Determine who needs to be involved in performance review meetings – identify the audience.

2. Develop a structured meeting agenda, similar to that used prior to the HSPR (see right).

3. Identify communication avenues between performance review meetings, to continue performance conversations as required.

Recommendation benefits:

• A structured approach towards PME meetings can help ensure that:

- Conversations are focused and all necessary areas of performance are covered.

- There are clear expectations and accountability for these discussions, which stakeholders can prepare for.

- All Health Services are consistently measured, and that performance reviews and interventions are applied impartially.

Suggested meeting agenda

Agenda comments

In discussing actions to address underperformance in certain PIs, it should be acknowledged that improving one indicator may have an adverse impact on another indicator for which the target is a two-way (+/-) range (for example, improving NEST may cause an organisation to over-perform against budgeted activity). In these circumstances, a balance may need to be struck between these conflicting priorities.

Additionally, when discussing action plans (see agenda item 8 above), it is noted that some indicators are not fully within the control of the Health Services and, as such, the suite of actions decided on may not be exclusively the responsibility of the Health Service.

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

For further leading practice research regarding PME processes used in other jurisdictions in addition to ‘best practice’ principles, please see Appendix G

The previous PME meeting process (prior to the HSPR) was more structured, including clear sections where performance (at indicator and facility level), interventions and actions were discussed. This is in line with our leading practice research and should be reintroduced.

Page 140: PwC Review

PwC

PME: Recommendation 2 Clarify and document roles and responsibilities in the PME process.

Page 140

April 2015

Issue(s) addressed:

• The current PMF does not clearly identify or sufficiently detail roles and responsibilities from a performance management perspective.

• The result of this is that engagement in performance conversations is limited, and there is no clear ownership of performance issues, which impairs accountability.

Key recommended steps:

1. Identify all stakeholders involved in the PME process.

2. Identify / agree roles and responsibilities for stakeholders, ensuring all duties are covered with no duplication.

3. Clearly document roles and responsibilities, using a template similar to QLD or NSW Health (see right for example).

4. Include roles and responsibilities in the PMF, PMR, and HSPR.

Recommendation benefits:

• Well-defined roles and responsibilities serve two purposes within the report:

1. To establish accountability

2. To ensure actions are addressed by the appropriate stakeholders.

• Using this approach helps to define the purpose of the reports for the audience, allowing the content to be specifically tailored and ensuring that results are communicated and discussed with the right stakeholders.

Additional observations - what QLD and NSW do well

QLD and NSW Health present good examples of designated and sufficiently defined roles and responsibilities. NSW Health also details performance review meetings and performance response meetings (see overleaf for further information).

For further leading practice research regarding PME processes used in other jurisdictions in addition to ‘best practice’ principles, please see Appendix G

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

Page 141: PwC Review

PwC

PME: Recommendation 2 (con’t) Clarify and document roles and responsibilities in the PME process.

Page 141

April 2015

Additional observations – what NSW does well

• NSW Health demonstrates leading practice through clear delegation and documentation of roles and responsibilities.

• The table to the right is taken from the NSW Health PMF. The table shows the specific roles and responsibilities of stakeholders involved in the reporting process.

• The WA PME should look to learn from the following leading practice attributes:

− A clear list of activities is defined

− Activities are then delegated accordingly

− The inclusion of timing to indicate the frequency and routine of each activity

− The clear stakeholder identification within the ‘Responsibility’ column promotes accountability and ensures that the right people are involved

• WA Health currently outlines roles and responsibilities in the Automation Roles and Responsibilities documents for the HSPR and PMR. However, roles and responsibilities around performance review meetings are only vaguely defined.

• It is recommended that a table similar to that used in the NSW Health PMF is included in the WA Health PMF. This would serve to clearly demonstrate roles, responsibilities, and timing for PME activities, and help to promote accountability. An illustrative sketch of such a table follows below.
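As a purely illustrative sketch, one way to hold and render such a table consistently is as simple activity / responsibility / timing rows; the activities, responsible parties and timings below are placeholder assumptions, not the NSW Health or WA Health allocations.

from typing import List, Tuple

# (activity, responsibility, timing) - placeholder rows for illustration only.
PME_ROLES: List[Tuple[str, str, str]] = [
    ("Produce and distribute the HSPR", "DoH Performance Directorate", "Monthly"),
    ("Validate data and provide commentary", "Health Service data custodians", "Monthly"),
    ("Chair performance review meetings", "DoH Director General (or delegate)", "Monthly"),
    ("Develop and deliver action plans", "Health Service Chief Executive", "As required"),
]

def render_roles_table(rows: List[Tuple[str, str, str]]) -> str:
    # Render an aligned plain-text roles and responsibilities table.
    header = ("Activity", "Responsibility", "Timing")
    widths = [max(len(row[i]) for row in rows + [header]) for i in range(3)]
    lines = [" | ".join(h.ljust(w) for h, w in zip(header, widths))]
    lines.append("-+-".join("-" * w for w in widths))
    lines += [" | ".join(cell.ljust(w) for cell, w in zip(row, widths)) for row in rows]
    return "\n".join(lines)

print(render_roles_table(PME_ROLES))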

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

For further leading practice research regarding PME processes used in other jurisdictions in addition to ‘best practice’ principles, please see Appendix G

Page 142: PwC Review

PwC

PME: Recommendation 3 Clarify (and expand) stakeholder lists and distribution channels for HSPR and PMR.

Page 142

April 2015

Recommendation benefits:

• More focussed distribution alerts will ensure that all key stakeholders are:

− Aware of publication; and

− Aware that the report will be discussed at the upcoming BSC and SHEF ORC meetings.

• Furthermore, announcing the publication of each report more broadly will ensure that all key stakeholders are aware that the latest HSPR and PMR have been published, reducing the risk of using outdated information.

• It will also improve public accessibility of the report in line with transparency principles of the PMF.

Issue(s) addressed:

• Access to the HSPR is limited, as distribution occurs primarily through draft report review meetings and tabling at Budget Steering Committee and SHEF meetings.

• Additionally, some stakeholders have provided feedback that they are unclear how to access the PMR.

Key recommended steps:

1. Identify the key stakeholders who should receive a copy of the HSPR for performance management or decision-making purposes. Include alternate contacts in case of out-of-office notifications.

2. Create a distribution list of these key stakeholders.

3. Establish automated email notifications upon HSPR / PMR publication, ensuring these align with meeting requirements (an illustrative notification sketch follows this list).

4. Include contact details for stakeholders to seek data clarification.

5. Identify broader distribution channels to announce publication to a wider audience (e.g. Health circulars, announcement alerts on WA Health intranet homepage).

6. List distribution channels in the HSPR / PMR for reference.
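The sketch below is a minimal, hypothetical illustration of how a publication alert could be generated from a documented distribution list; the stakeholder roles, email addresses, URL and contact point are placeholders, and the notification is simply printed rather than sent through an actual mail system.

from dataclasses import dataclass
from typing import List

@dataclass
class Stakeholder:
    name: str
    email: str
    alternate_email: str  # alternate contact to cover out-of-office periods

# Placeholder distribution list - the real list would be agreed and documented (step 2).
DISTRIBUTION_LIST: List[Stakeholder] = [
    Stakeholder("Health Service Chief Executive", "ce@example.health.wa.gov.au",
                "deputy-ce@example.health.wa.gov.au"),
    Stakeholder("DoH performance team", "performance@example.health.wa.gov.au",
                "performance-alt@example.health.wa.gov.au"),
]

def publication_alert(report_name: str, period: str, portal_url: str, contact: str) -> str:
    # Compose the notification body issued when a new HSPR / PMR is published (step 3).
    return (
        f"The {report_name} for {period} has now been published.\n"
        f"Access the report at {portal_url}.\n"
        f"The report will be discussed at the upcoming performance review meetings.\n"
        f"For data clarification, please contact {contact}."
    )

body = publication_alert("PMR", "March 2015", "https://intranet.example/pmr",
                         "pme-queries@example.health.wa.gov.au")
for stakeholder in DISTRIBUTION_LIST:
    print(f"To: {stakeholder.email} (alternate: {stakeholder.alternate_email})\n{body}\n")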

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

For further leading practice research regarding PME processes used in other jurisdictions in addition to ‘best practice’ principles, please see Appendix G

Page 143: PwC Review

PwC

PME: Recommendation 4 Reintroduce ‘Intervention levels and performance watch’ that is triggered by performance ratings instead of performance scores.

Page 143

April 2015

Issue(s) addressed:

• The current PME process lacks rigorous and standardised intervention strategies to address underperformance.

• The previous PME process had well-defined intervention strategies that were triggered by the level of overall performance.

Key recommended steps:

1. Reintroduce the intervention strategies for discussion.

2. Consult with key stakeholders to review and update the intervention levels and strategies, in line with recommended changes around Health Service assessment and intervention (see page 144).

3. Clearly identify how intervention actions will be followed up, and how intervention levels will be escalated / de-escalated.

4. Develop action-plan templates for use by Health Services towards addressing performance issues.

5. Implement intervention strategies at performance review meetings (see recommendation 1) and follow-up at subsequent meetings.

Recommendation benefits:

• This process aligns with leading practice examples from other jurisdictions: see the examples to the right and overleaf.

• This approach will reintroduce accountability and ensure that Health Services are committed to improving service delivery by taking steps to improve performance.

NSW Health applies a decision logic for identification of performance issues through the use of individual KPI results.

VIC Health uses intervention levels based on the overall Performance Assessment score

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

For further leading practice research regarding PME processes used in other jurisdictions in addition to ‘best practice’ principles, please see Appendix G

Page 144: PwC Review

PwC

PME: Recommendation 4 (con’t) Reintroduce ‘Intervention levels and performance watch’ that is triggered by performance ratings instead of performance scores.

Page 144

April 2015

Additional observations – what NSW Health does well

• The figures shown to the right demonstrate the approach by NSW Health to intervention levels, based on performance ratings.

• Each PI is rated on performance. Following this, the assessment criteria are applied to determine the necessity of a performance review, as shown in Table 1. To this end, the PI ratings are treated as performance review triggers.

• The NSW approach also takes escalation and de-escalation into account, based not solely on the PI performance rating but also on the agreed turnaround and / or recovery plans (an illustrative decision-logic sketch follows this list).

• The escalation process is then determined by the performance escalation levels (see example in Table 2). This details the point of escalation / de-escalation, the response, and the stakeholders involved.

• With the reintroduction of intervention levels, it is recommended that WA Health incorporates clear details of the intervention ‘gateways’ within the PMF, including:

− The performance triggers for intervention;

− The intensity of and process for performance monitoring for each intervention level;

− The roles and responsibilities of DoH, Health Services, and senior stakeholders in each level of intervention;

− The process for escalating / de-escalating interventions.
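By way of illustration only, the decision logic described above can be expressed as a simple mapping from individual PI ratings to an intervention level. The rating labels follow the WA heat-map categories, but the weights, thresholds and level names in this sketch are assumptions for the example, not WA Health or NSW Health policy.

from typing import Dict

# Assumed weights for the WA-style rating categories (illustrative only).
RATING_WEIGHT = {"Highly performing": 0, "Performing": 0,
                 "Underperforming": 1, "Not performing": 2}

def intervention_level(pi_ratings: Dict[str, str], recovery_plan_on_track: bool) -> int:
    """Return an indicative intervention level from individual PI ratings.

    0 = no intervention, 1 = performance watch, 2 = performance review,
    3 = formal intervention. The thresholds are assumptions - any real
    'gateway' would be agreed with stakeholders and documented in the PMF.
    """
    weights = [RATING_WEIGHT.get(rating, 0) for rating in pi_ratings.values()]
    worst = max(weights, default=0)
    breaches = sum(1 for w in weights if w > 0)

    if worst == 2 or breaches >= 3:
        level = 3
    elif breaches == 2:
        level = 2
    elif breaches == 1:
        level = 1
    else:
        level = 0

    # De-escalate one level if an agreed turnaround / recovery plan is on track.
    return max(level - 1, 0) if recovery_plan_on_track and level > 0 else level

ratings = {"EA2 NEAT": "Underperforming", "EQ3 SAB rate": "Performing"}
print(intervention_level(ratings, recovery_plan_on_track=False))  # -> 1 (performance watch)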

Independent review of the HSPR and PMR

Performance Monitoring and Evaluation (PME)

For further leading practice research regarding PME processes used in other jurisdictions in addition to ‘best practice’ principles, please see Appendix G

Page 145: PwC Review

“Assessment of performance should go beyond describing ‘what was done’ to providing systematic and rigorous information about ‘how well things were done’ ”.

Source: Advance Performance Management Institute Healthcare spotlight

Page 146: PwC Review

PwC

Appendices

April 2015 Independent review of HSPR and PMR

Page 146

Page 147: PwC Review

WA Performance management overview

Appendix A

Page 148: PwC Review

PwC

An indicative roadmap for drafting the PMF is shown below. These ‘milestones’ will be discussed later within the review to ascertain appropriateness.

Defining the objectives

What areas do you want to measure (KPQ)?

Defining the PIs

Develop heat map (WA example)

Strategic initiatives

WA PMF principles To undertake a review of performance management it is necessary to take a step back to understand the current state of the WA framework and reporting tools.

Domain Dimension

Effectiveness Access

Appropriateness

Quality

Efficiency Inputs per output unit

Equity Access

Sustainability Workforce

Facilities & equipment

Processes Coding

Finance

22 PIs (outcome measures) and 35 supporting Health Service measures

Performance Rating

Heat map

Highly performing

Performing

Underperforming

Not performing

Out of scope

Not calculable

Numbers withheld

Not available

Drive improvements

Actions: PME

Balanced scorecard

Three components should work together to give a balanced scorecard.

1. Consolidate alignment with State and National policies

2. Enhance performance measurements
3. Expand public disclosure of performance
4. Improve performance reporting
5. Strengthen performance management
6. Foster workforce engagement.

Page 148

April 2015 Independent review of the HSPR and PMR

Page 149: PwC Review

PwC

WA PMR process map The diagram below describes the process undertaken to publish the PMR, highlighting the QA burden required.

Page 149

April 2015 Independent review of the HSPR and PMR

Page 150: PwC Review

Desktop review – PI additional analysis

Appendix B

Page 151: PwC Review

PwC

Comparison of finance indicators (1 of 4) The PIs below are used in other jurisdictions, and could be included in the WA PMR to represent a more balanced measure of financial performance.

Page 151

April 2015

Area Performance Indicator Description Target

Victoria

Operating result Operating result as a % of total operating revenue Health service specific

Creditors Trade creditor days 60 days

Debtors Patient debtor days 60 days

PP WIES Public and private Weighted Inlier Equivalent Separations activity performance to target

100%

Basic asset management plan Submission by health services of a basic asset management plan Full compliance

NSW

Expenditure matched to budget (General Fund)

a) Year to date - General Fund (%)

b) June projection – General fund (%) On budget or favourable

Own Source Revenue Matched to budget (General Fund)

a) Year to date - General Fund (%)

b) June projection – General fund (%) On budget or favourable

Recurrent Trade Creditors Recurrent Trade Creditors > 45 days correct and ready for payment ($) 0

Small Business Creditors Small Business Creditors paid within 30 days from receipt of a correctly rendered invoice (%)

100

Variation against purchased volume (%)

Acute Inpatient Services (NWAU) Unclear

Emergency Department Services (NWAU) Unclear

Sub and Non Acute Inpatient Services (NWAU) Unclear

Non Admitted Patient Services – Tier 2 Clinics (NWAU) Unclear

Mental Health Inpatient Activity Acute Inpatients (NWAU) Unclear

Mental Health Inpatient Activity Non Acute Inpatients (NWAU) Unclear

Mental Health Non Admitted occasions of service (Service Events) Unclear

Public Dental Clinical Service (DWAU) 100

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 152: PwC Review

PwC

Comparison of finance indicators (2 of 4) The PIs below are used in other jurisdictions and could be included in the PMR to present a more balanced measure of financial performance.

Page 152

April 2015

Area Performance Indicator Description Target

Queensland

Full-year forecast operating position The Hospital and Health Service (HHS) full-year forecast operating position

Balanced, surplus or an agreed non-recurrent deficit

Funded and average cost per WAU Year to date funded and cost per Jurisdiction-specific Weighted Activity Unit (WAU)

At or below the Hospital and Health Service specific funded price per WAU

Length of stay in public hospitals The average (mean) length of stay for a given Australian Refined Diagnosis Related Group (AR-DRG) for patients who stay one or more nights in hospital

At or below AR-DRG target

Patient Travel Subsidy Scheme (PTSS) reimbursements

Patient Travel Subsidy Scheme (PTSS) reimbursements 98% of claims paid within 30 days or less from invoice date

Area Performance Indicator

NHS TDA

Trust turnover rate

Temporary costs and overtime as % total paybill

Bottom line I&E position – Forecast compared to plan

Bottom line I&E position – Year to date actual compared to plan

Actual efficiency recurring/non-recurring compared to plan – Year to date actual compared to plan

Actual efficiency recurring/non-recurring compared to plan – Forecast compared to plan

Forecast underlying surplus/deficit compared to plan

Forecast year end charge to capital resource limit

Is the Trust forecasting permanent PDC for liquidity purposes?

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 153: PwC Review

PwC

Comparison of finance indicators (3 of 4) The PIs below are used in other jurisdictions and could be included in the PMR to present a more balanced measure of financial performance.

Page 153

April 2015

Similar PIs used in WA

Area Focus Area Performance Indicator

NHS Foundation Trust (Monitor)

Income and expenditure

Operating revenue for EBITDA

EBITDA

Net surplus

Net surplus after impairments and transfers by absorption

Revenue analysis

Ambulance

Community

Mental health

Elective in patients

Elective day cases

Outpatients

Non elective in patients

A&E

Maternity

Diagnostic tests and imaging

Critical care

Drugs revenue

Direct access and OP

Focus Area Performance Indicator

Revenue analysis continued

Unbundled chemotherapy

Unbundled beam radiotherapy

CQUIN revenue

NHS contract penalties or adjustments

Non NHS clinical revenues

Total clinical revenue

Total non-clinical revenue

Total operating revenue for EBITDA

Operating expenses

Year to date pay

Pay expense

Non pay expense

Total operating expenses for EBITDA

Cost improvement programs

Cost improvement programs as a % of operating expenditure:

Pay

Drugs

Clinical supplies

Non-clinical supplies

Total cost improvement programmes

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 154: PwC Review

PwC

Comparison of finance indicators (4 of 4) The PIs below are used in other jurisdictions and could be included in the PMR to present a more balanced measure of financial performance.

Page 154

April 2015

Area Focus Area Performance Indicator

NHS Foundation Trust (Monitor)

EBITDA margin

EBITDA %: teaching acute

EBITDA %: large acute

EBITDA %: medium acute

EBITDA %: small acute

EBITDA %: mental health

EBITDA %: specialist

EBITDA %: ambulance

‘S’ curve YTD surplus / deficit margin by trust

Capital expenditure Actual capital expenditure by type

Capital expenditure as % of depreciation

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 155: PwC Review

PwC

Comparison of workforce indicators The PIs below are used in other jurisdictions and could be included in the PMR to represent a more balanced measure of performance.

Page 155

April 2015

Area Performance Indicator Description Target

NSW Performance reviews Staff who have had a performance review (%) 100

Area Performance measure

NSW Workplace injuries (%)

Premium staff usage - average paid hours per FTE (Hours):

• Medical

• Nursing

Reduction in the number of employees with accrued annual leave balances of more than 30 days

Recruitment: improvement on baseline average time taken from request to recruit to decision to approve/decline recruitment (days)

Aboriginal Workforce as a proportion of total workforce

Your say survey:

• Estimated Response Rate

• Engagement Index

• Workplace Culture Index

NSW also collects the measures listed above from Hospitals and Health Services, but these measures do not have specified targets.

Area Performance Indicator

UK NHS TDA

NHS Staff Survey: Percentage of staff who would recommend the trust as a place of work

NHS Staff Survey: Percentage of staff who would recommend the trust as a place to receive treatment

Trust level total sickness rate

Total trust vacancy rate

Percentage of staff with annual appraisal

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 156: PwC Review

PwC

Comparison of patient experience indicators (1 of 3) The PIs below are used in other jurisdictions and could be included in the WA PMR to show an indication of patient care.

Page 156

April 2015

Area Performance Indicator Description Target

Victoria

Healthcare Experience Survey Participation in the healthcare experience survey Full

compliance

Surgical site infection Submission of infection surveillance data for nominated surgical procedures No outliers

ICU central-line infection Submission of infection surveillance data for ICU central lines. No outliers

SAB Staphylococcus aureus bacteraemia (SAB) rate per occupied bed day ≤ 2/10,000

Postnatal care % of women who have given birth and on discharge have been offered prearranged postnatal care

100%

Newborn Screening % of eligible newborns screened for hearing deficit before one month of age ≥ 97%

MH28Day % of adult general acute psychiatric inpatients readmitted within 28 days of Separation 14%

Post discharge % of mental health patients with a post-discharge follow-up within seven days (child and adolescent, adult, aged)

75%

Seclusion Rate of total mental health seclusions (child and adolescent, adult, aged) ≤ 15/1,000

Patient satisfaction % of patients satisfied or very satisfied with quality of care provided by Paramedics 95%

Pain reduction – Adult % of adult patients experiencing severe cardiac and traumatic pain whose level of pain is reduced significantly

90%

Pain reduction – paediatric % of paediatric patients experiencing severe traumatic pain whose level of pain is reduced significantly.

90%

Stroke patients transported % of adult patients suspected of having a stroke who were transported to a stroke unit with thrombolysis facilities within 60 Minutes

80%

Cardiac survival on hospital discharge

% of adult VF/VT cardiac patients surviving to hospital discharge 20%

People Matter patient safety culture

% score on patient safety culture section of People Matter survey 80%

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 157: PwC Review

PwC

Comparison of patient experience indicators (2 of 3) The PIs below are used in other jurisdictions and could be included in the WA PMR to show an indication of patient care.

Page 157

April 2015

Area Performance Indicator Description Target

NSW

Incorrect procedures Incorrect procedures: Operating Theatre- resulting in death or major loss of function (number)

0

CLAB infections ICU Central Line Associated Bloodstream (CLAB) Infections (number)

0

SA-BSI Staphylococcus aureus bloodstream infections (per 10,000 occupied bed days)

≤ 2/10,000

28-day mental health readmissions Mental Health: Acute readmission within 28 days (%) 13%

Post discharge mental health care Mental Health: Acute Post-Discharge Community Care - follow up within seven days (%)

70%

Queensland

Measures of patient experience Measures of patient experience with Maternity services and Small hospitals

Under development

In hospital mortality VLAD indicators In hospital mortality rates for Acute myocardial infarction, Stroke, Fractured neck of femur, and Pneumonia

Upper level flags or no lower level

flags

Unplanned Hospital Readmission VLAD Indicators

Unplanned hospital readmission rates for patients discharged following management of Acute myocardial infarction, Heart failure, Knee replacements, Hip replacements, Depression, Schizophrenia, Paediatric Tonsillectomy and adenoidectomy

Upper level flags or no lower level

flags

Healthcare–associated infections Healthcare associated staphylococcus aureus (including MRSA) bacteraemia

≤ 2/10,000

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 158: PwC Review

PwC

Comparison of patient experience indicators (3 of 3) The PIs below are used in other jurisdictions and could be included in the WA PMF to represent a more balanced measure of performance.

Page 158

April 2015

Area Performance Indicator Description

UK NHS (Outcome framework)

Patient experience of primary care

i GP services

ii GP out-of-hours services

iii NHS dental services

Patient experience of hospital care N/A

Friends and family test N/A

Patient experience characterised as poor or worse: Primary care

Hospital care

Improving people’s experience of outpatient care Patient experience of outpatient services

Improving hospitals’ responsiveness to personal needs Responsiveness to in-patients’ personal needs

Improving people’s experience of accident and emergency services

Patient experience of A&E services

Improving access to primary care services Access to i GP services and ii NHS dental services

Improving women and their families’ experience of maternity services

Women’s experience of maternity services

Improving the experience of care for people at the end of their lives

Bereaved carers’ views on the quality of care in the last 3 months of life

Improving experience of healthcare for people with mental illness

Patient experience of community mental health services

Improving children and young people’s experience of healthcare

Children and young people’s experience of inpatient services

Improving people’s experience of integrated care People’s experience of integrated care (ASCOF 3E**)

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 159: PwC Review

PwC

Comparison of quality indicators (1 of 3) The PIs below are used in other jurisdictions and could be included in the WA PMF to represent a more balanced measure of performance.

Page 159

April 2015

Area Performance Indicator

UK NHS Contract Conditions Framework

Percentage of admitted Service Users starting treatment within a maximum of 18 weeks from Referral

Percentage of non-admitted Service Users starting treatment within a maximum of 18 weeks from Referral

Percentage of Service Users on incomplete RTT pathways (yet to start treatment) waiting no more than 18 weeks from Referral

Percentage of Service Users waiting less than 6 weeks from Referral for a diagnostic test

Percentage of A & E attendances where the Service User was admitted, transferred or discharged within 4 hours of their arrival at an A&E department

Percentage of Service Users referred urgently with suspected cancer by a GP waiting no more than two weeks for first outpatient appointment

Percentage of Service Users referred urgently with breast symptoms (where cancer was not initially suspected) waiting no more than two weeks for first outpatient appointment

Percentage of Service Users waiting no more than one month (31 days) from diagnosis to first definitive treatment for all cancers

Percentage of Service Users waiting no more than 31 days for subsequent treatment where that treatment is surgery

Percentage of Service Users waiting no more than 31 days for subsequent treatment where that treatment is an anti-cancer drug regimen

Percentage of Service Users waiting no more than 31 days for subsequent treatment where the treatment is a course of radiotherapy

Percentage of Service Users waiting no more than two months (62 days) from urgent GP referral to first definitive treatment for cancer

Percentage of Service Users waiting no more than 62 days from referral from an NHS screening service to first definitive treatment for all cancers

Percentage of Service Users waiting no more than 62 days for first definitive treatment following a consultant’s decision to upgrade the priority of the Service User (all cancers)

Percentage of Category A Red 1 ambulance calls resulting in an emergency response arriving within 8 minutes

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 160: PwC Review

PwC

Comparison of quality indicators (2 of 3) The PIs below are used in other jurisdictions and could be included in the WA PMF to represent a more balanced measure of performance.

Page 160

April 2015

Area Performance Indicator

UK NHS Contract Conditions Framework

Percentage of Category A Red 2 ambulance calls resulting in an emergency response arriving within 8 minutes

Percentage of Category A calls resulting in an ambulance arriving at the scene within 19 minutes

Sleeping Accommodation Breach

All Service Users who have operations cancelled, on or after the day of admission (including the day of surgery), for non-clinical reasons to be offered another binding date within 28 days, or the Service User’s treatment to be funded at the time and hospital of the Service User’s choice

Care Programme Approach (CPA): The percentage of Service Users under adult mental illness specialties on CPA who were followed up within 7 days of discharge from psychiatric in-patient care

Zero tolerance MRSA

Minimise rates of Clostridium difficile

Zero tolerance RTT waits over 52 weeks for incomplete pathways

All handovers between ambulance and A & E must take place within 15 minutes with none waiting more than 30 minutes

All handovers between ambulance and A & E must take place within 15 minutes with none waiting more than 60 minutes

Following handover between ambulance and A & E, ambulance crew should be ready to accept new calls within 15 minutes


Trolley waits in A&E not longer than 12 hours

No urgent operation should be cancelled for a second time

VTE risk assessment: all inpatient Service Users undergoing risk assessment for VTE, as defined in Contract Technical Guidance

Publication of Formulary

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 161: PwC Review

PwC

Comparison of quality indicators (3 of 3) The PIs below are used in other jurisdictions and could be included in the WA PMF to represent a more balanced measure of performance.

Page 161

April 2015

Area Performance Indicator

UK NHS Contract Conditions Framework

Duty of candour

Completion of a valid NHS Number field in mental health and acute commissioning data sets submitted via SUS, as defined in Contract Technical Guidance

Completion of a valid NHS Number field in A&E commissioning data sets submitted via SUS, as defined in Contract Technical Guidance

Completion of Mental Health Minimum Data Set ethnicity coding for all detained and informal Service Users, as defined in Contract Technical Guidance

Completion of IAPT Minimum Data Set outcome data for all appropriate Service Users, as defined in Contract Technical Guidance

Similar PIs used in PMR Similar PIs used in HSPR and PMR

Independent review of the HSPR and PMR

Page 162: PwC Review

PwC

The following indicators are key indicator types that are tracked in other jurisdictions but are not included in the HSPR.

Page 162

April 2015

HSPR PIs

Area Indicator type Example indicator descriptions Rationale for indicator Similar

indicator in PMR

Safety & Quality

Ambulance handover

Handover time between Ambulance and ED

• Key quality indicator measured in other jurisdictions

• Main ambulatory PI which is controllable by Health Services

N

Safety & Quality

Incorrect procedures / complications

Incorrect procedures: Operating Theatre- resulting in death or major loss of function (number), number of complications arising from procedures

• Key quality indicator measured in other jurisdictions,

• Significant cause of harm / death in patients N

Safety & Quality

Mental health

Care Programme Approach (CPA): The percentage of Service Users under adult mental illness specialties on CPA who were followed up within 7 days of discharge from psychiatric in-patient care

• Key mental health quality indicator measured in other jurisdictions

Y

Safety & Quality

Survival rates Survival rates for key conditions (e.g. stroke, cardiac)

• Key quality indicator measured in other jurisdictions

• Key Annual Report requirement N

Safety & Quality

Serious incidents reporting

Serious Incidents followed up within set timeframe

• Key "lead" quality indicator measured in other jurisdictions

• Deficiency shown to be a key indicator of poor quality services (e.g. - Mid Staffordshire)

Y

Safety & Quality

VTE risk assessment

Proportion of inpatient Service Users undergoing risk assessment for VTE

• Key quality indicator measured in other jurisdictions,

• Significant cause of harm / death in patients N

Independent review of the HSPR and PMR

Page 163: PwC Review

PwC

The following indicators are key indicator types that are tracked in other jurisdictions but are not included in the HSPR.

Page 163

April 2015

HSPR PIs (con’t)

Area Indicator type Example indicator descriptions Rationale for indicator Similar

indicator in PMR

Patient Experience

Patient survey Satisfaction rates, participation rates

• Key "lead" quality indicator measured in other jurisdictions

• Deficiency shown to be a key indicator of poor quality services (e.g. - Mid Staffordshire)

Y

Patient Experience

Patient complaints

Number of complaints, number of complaints addressed within set timeframe

• Key "lead" quality indicator measured in other jurisdictions

• High number of complaints, and/or failure to address, can be a key warning of poor quality management (e.g. - Mid Staffordshire)

N

Workforce Staff appraisals Percentage of staff with annual appraisal / performance review

• Training / staff performance management shown to be key component in managing a high quality service

N

Workforce Staff survey Percentage of staff who would recommend organisation to staff and patients

• Key "lead" indicator of quality of service, and also acts as a forecast of future staff turnover and/or dissatisfaction/deteriorating performance

N

Workforce Staff complaints Number of complaints, number of complaints addressed within set timeframe

• Key "lead" quality indicator measured in other jurisdictions

• High number of complaints, and/or failure to address, can be a key warning of poor quality management (e.g. - Mid Staffordshire)

N

Access Diagnostic tests Percentage of Service Users waiting less than 6 weeks from Referral for a diagnostic test

• Access to diagnostic tests can have a significant impact on health outcomes for patients

N

Independent review of the HSPR and PMR

Page 164: PwC Review

PwC

The following indicators are key indicator types that are tracked in other jurisdictions but are not included in the HSPR.

Page 164

April 2015

HSPR PIs (con’t)

Area Indicator type Example indicator descriptions

Rationale for indicator Similar

indicator in PMR

Finance / Efficiency

Capital spend Capital spend to plan, forecast capital spend to plan

• Capital spend is a key reporting requirement to Treasury and as part of the Annual Report, and is currently not tracked in the HSPR or PMR

N

Finance / Efficiency

Length of stay

Overall length of stay, multi-day length of stay, daycase rate, pre-admission length of stay, day-of-surgery-admission rate

• Key indicator of efficiency - perhaps the biggest driver of hospital costs

• High LOS has also been shown to be detrimental to patient care

Y

Finance / Efficiency

Net cost of services

Net cost of services to plan

• Provides a holistic view of financial performance (more so than ABF Unit Cost to Price)

• Key reporting requirement to Treasury and the Annual Report

Y

Finance / Efficiency

Staff levels / costs

FTE to plan, staff costs to plan, Nursing hours per patient day

• Main driver of cost in all Health Services

• Significant reductions in staffing levels can signal an increased risk to patient care

Y

Finance / Efficiency

Savings schemes YTD savings delivered to plan

• Key indicator of actions being taken by Health Services to address financial challenges (NB: requires scrutiny and sense-checking to validate results)

• Key reporting requirement to Treasury

N

Finance / Efficiency

Private patient revenue

Private patient revenue to plan

• As a key focus area for WA Health, this indicator should be tracked specifically in the HSPR

N

Finance / Efficiency

Forecast

Forecast net cost of surplus to plan, forecast cost to plan, forecast revenue to plan, forecast savings to plan

• Forecasts are a critical element of all leading practice financial reporting, and are central to the monitoring process in other jurisdictions

• Forecasts are key reporting requirements to Treasury on a monthly basis

N

Independent review of the HSPR and PMR

Page 165: PwC Review

PwC

PI alignment to national standards (1 of 2) The table below shows the alignment between the PIs in the PMR and existing national targets and standards.

Page 165

April 2015

Domain code

Performance measure Source

EA1 Proportion of emergency department patients seen within recommended times NHRA

EA2 NEAT % of ED Attendances with LOE <=4 hours NPA

EA3 Average overdue wait time of elective surgery cases waiting beyond the clinically recommended time, by urgency category

NPA

EA4 Elective surgery patients treated within boundary times NPA

EA5 Percentage of selected elective cancer surgery cases treated within boundary time NHRPAF

EAP1 Rate of selected potentially preventable chronic condition hospitalisations (for specified chronic conditions)

WA Chronic Health Conditions Framework 2011-2016 WA Health Promotion Strategic Framework 2012-2016

EQ1 Age-adjusted rate (AAR) of avoidable deaths No source linkage

EQ3 Staphylococcus aureus bacteraemia infections per 10,000 patient days NHRPAF

EQ5 Hospital standardised mortality ratio ACSQHC

EQ7 Death in low-mortality DRGs NHRPAF

EQ8 In hospital mortality rates (for acute myocardial infarction, stroke, fractured neck of femur & pneumonia)

ACSQHC

EQ10 Rate of total hospital readmissions within 28 days to an acute designated mental health inpatient unit

Fourth National Mental Health Plan

There is no alignment with national targets or standards in 5 of the 22 (23%) PMR PIs.

NPA: National Partnership Agreement ACSQHC: Australian Commission on Safety and Quality in Health Care

NHRA: National Health Reform Agreement NHRPAF: National Health Reform Performance and Accountability Framework

Independent review of the HSPR and PMR

Page 166: PwC Review

PwC

PI alignment to national standards (2 of 2) The table below shows the alignment between the PIs in the PMR and existing national targets and standards.

Page 166

April 2015

Domain code

Performance measure Source

EI1 Volume of weighted activity year-to-date No source linkage

EI3 Average cost per test panel for PathWest No source linkage

EI6 YTD distance of net cost of service to budget NHRPAF

EI8 Ratio of actual cost of specified public hospital services compared with the ‘state efficient price’ NHRPAF

EQA1 Standardised Mortality Ratio (SMR) of deaths among Aboriginal children (0-4 years) and non-Aboriginal children (0-4 years)

NPA

EQA4 Proportion of eligible population receiving dental services from subsidised dental programs by group

NPA

SW3 Staff turnover No source linkage

PC2 Percentage of cases coded and available for reporting within Department of Health WA Operation Directive 0137/08

PF2 Manually corrected payroll errors (underpayments) HCN Service Level Agreement

PF3 Availability of Information Communication Technology (ICT) services: percentage of Service calls

No source linkage

Independent review of the HSPR and PMR

NPA: National Partnership Agreement ACSQHC: Australian Commission on Safety and Quality in Health Care

NHRA: National Health Reform Agreement NHRPAF: National Health Reform Performance and Accountability Framework

There is no alignment with national targets or standards in 5 of the 22 (23%) PMR PIs.
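As a quick arithmetic check on the proportion quoted above, the share of PMR PIs with no source linkage can be tallied from the Source column of the two tables; the sketch below is a hypothetical helper, with True meaning the PI is linked to a national target, standard or other framework.

# Linkage status for the 22 PMR PIs, taken from the Source column above
# (True = linked to a national target / standard or other framework, False = 'No source linkage').
pmr_source_linked = {
    "EA1": True, "EA2": True, "EA3": True, "EA4": True, "EA5": True, "EAP1": True,
    "EQ1": False, "EQ3": True, "EQ5": True, "EQ7": True, "EQ8": True, "EQ10": True,
    "EI1": False, "EI3": False, "EI6": True, "EI8": True, "EQA1": True, "EQA4": True,
    "SW3": False, "PC2": True, "PF2": True, "PF3": False,
}

unlinked = sum(1 for linked in pmr_source_linked.values() if not linked)
total = len(pmr_source_linked)
print(f"{unlinked} of {total} ({unlinked / total:.0%}) PMR PIs have no source linkage")
# -> 5 of 22 (23%) PMR PIs have no source linkage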

Page 167: PwC Review

PwC

PI alignment to national standards The table below shows the alignment between the PIs in the HSPR and existing national targets and standards.

Page 167

April 2015

Domain code

Performance measure Source

EA1

Proportion of emergency department patients not seen within recommended times NPA

EA2 NEAT % of ED Attendances with LOE <=4 hours NPA

EA3 Average overdue wait time of elective surgery cases waiting beyond the clinically recommended time, by urgency category

NPA

EA4 Elective surgery patients treated within boundary times NPA

EA5 Percentage of selected elective cancer surgery cases treated within boundary time NHRPAF

EQ3 Staphylococcus aureus bacteraemia infections per 10,000 patient days NHRPAF

EQ7 Death in low-mortality DRGs NHRPAF

EQ10 Rate of total hospital readmissions within 28 days to an acute designated mental health inpatient unit Fourth National Mental Health Plan

EQ14 Hand Hygiene Compliance National benchmark

EI1 Volume of weighted activity year-to-date Established for HSPR

EI8 Ratio of actual cost of specified public hospital services compared with the ‘state efficient price’ Established for HSPR

EI4 YTD Distance of Expenditure to Budget Established for HSPR

EI5 YTD Distance of Own Sourced Revenue to Budget Established for HSPR

N/A YTD controlled own source revenue to budget Established for HSPR

There is no alignment with national targets or standards in 5 of the 14 (36%) HSPR PIs.

Independent review of the HSPR and PMR

NPA: National Partnership Agreement

NHRPAF: National Health Reform Performance and Accountability Framework

Page 168: PwC Review

Lead and lag indicators (1 of 4)
The table below shows the full suite of PIs categorised as lead or lag indicators.

Domain code | Performance measure | Lead (P) or Lag (R)
EA1 | Proportion of emergency department patients seen within recommended times | R
EA2 | NEAT % of ED Attendances with LOE <=4 hours | R
EA3 | Average overdue wait time of elective surgery cases waiting beyond the clinically recommended time, by urgency category | R
EA4 | Elective surgery patients treated within boundary times | R
EA5 | Percentage of selected elective cancer surgery cases treated within boundary time | R
EA7 | Percentage of ED Mental Health patients admitted within 8 hrs | R
EA8 | Theatre activity | R
EA10 | Access Block | R
EA11 | Admissions from ED | R
EA12 | Percentage of SJAA patients with Off Stretcher time within 20 minutes | R
EAP1 | Rate of selected potentially preventable chronic condition hospitalisations (for specified chronic conditions) | R
EAP2 | Adult immunisation: percentage of people aged 65 years and over immunised against Influenza | P
EAP3 | Obesity: percentage of population who are overweight or obese: a) Adults b) Children | P
EAP4 | Tobacco: percentage of adults who are current smokers | P


Page 169: PwC Review

Lead and lag indicators (2 of 4)
The table below shows the full suite of PIs categorised as lead or lag indicators.

Domain code | Performance measure | Lead (P) or Lag (R)
EQ1 | Age-adjusted rate (AAR) of avoidable deaths | R
EQ3 | Staphylococcus aureus bacteraemia infections per 10,000 patient days | R
EQ5 | Hospital standardised mortality ratio | R
EQ7 | Death in low-mortality DRGs | R
EQ8 | In hospital mortality rates (for acute myocardial infarction, stroke, fractured neck of femur & pneumonia) | R
EQ10 | Rate of total hospital readmissions within 28 days to an acute designated mental health inpatient unit | R
EQ2 | Percentage of Emergency Department Attendances which are unplanned re-attendances in less than or equal to 48 hours of previous attendance | R
EQ4 | Rate of Severity Assessment Code (SAC) 1 clinical incident investigation reports received by Patient Safety Surveillance Unit within 45 working days of the event notification date | R
EQ6 | Hospital accreditation | R
EQ9 | Unplanned hospital readmissions of patients discharged following management of (knee replacement, hip replacement, tonsillectomy & adenoidectomy, hysterectomy, prostatectomy, cataract surgery and appendicectomy) | R
EQ12 | Rate of community follow up within first 7 days of discharge from psychiatric admission | R
EQ13 | Measures of patient experience (including satisfaction) with hospital services | P
EQ14 | Hand Hygiene Compliance | R


Page 170: PwC Review

Lead and lag indicators (3 of 4)
The table below shows the full suite of PIs categorised as lead or lag indicators.

Domain code | Performance measure | Lead (P) or Lag (R)
EI1 | Volume of weighted activity year-to-date | R
EI3 | Average cost per test panel for PathWest | R
EI6 | YTD distance of net cost of service to budget | R
EI8 | Ratio of actual cost of specified public hospital services compared with the ‘state efficient price’ | R
EI2 | Elective surgery day of surgery admission rates | R
EI4 | YTD Distance of Expenditure to Budget | R
EI5 | YTD Distance of Own Sourced Revenue to Budget | R
EI7 | School Dental Service ratio of examinations to enrolments | R
EI9 | Number of separations (unweighted) | R
EI10 | Coded acute multiday average length of stay | R
EI11 | YTD Distance of Salaries Expenditure to Budget | R
EQA1 | Standardised Mortality Ratio (SMR) of deaths among Aboriginal children (0-4 years) and non-Aboriginal children (0-4 years) | R
EQA4 | Proportion of eligible population receiving dental services from subsidised dental programs by group | R
EQA2 | Standardised Rate Ratio of Hospitalisations of: a) Aboriginal People compared to non-Aboriginal People b) Aboriginal children (0-4 years) compared to non-Aboriginal children (0-4 years) | R
EQA3 | Childhood immunisation: percentage of children fully immunised at 12-15 months: a) Aboriginal b) Total | P
EQA5 | WA Health Aboriginal employment headcount | R


Page 171: PwC Review

Lead and lag indicators (4 of 4)
The table below shows the full suite of PIs categorised as lead or lag indicators.

Domain code | Performance measure | Lead (P) or Lag (R)
SW3 | Staff turnover | R
SW1 | Proportion of medical graduates (and other categories of medical staff) to total medical staff: a) Interns (graduate) b) Resident Medical Officers c) Registrars d) Consultants e) Other | R
SW2 | Proportion of nursing graduates (and other categories of nursing staff) to total nursing staff: a) Graduate b) Junior c) Experienced d) Senior e) SRN and above f) Other | R
SW4 | Injury management: a) Lost time injury severity rate b) Percentage of managers and supervisors trained in occupational safety and health (OSH) and injury management responsibilities | R
SW5 | Leave Liability | P
SW6 | Actual and Budget FTE | R
PC2 | Percentage of cases coded and available for reporting within | R
PC1 | Percentage of cases coded by end of month closing date | R
PC3 | DRG Accuracy in Clinical Information Audit Program | R
PF2 | Manually corrected payroll errors (underpayments) | R
PF3 | Availability of Information Communication Technology (ICT) services: percentage of Service calls | R
PF1 | Patient fee debtors | R
PF4 | NurseWest shifts filled | R
PF5 | Accounts payable - payment within terms | R


Page 172: PwC Review

Reporting frequencies in the PMR (1 of 4)
The table below shows the reporting frequencies within the PMR.

In the PMR, 51% of data is sourced monthly.

Domain code | Performance measure | Freq
EA1 | Proportion of emergency department patients seen within recommended times | M
EA2 | NEAT % of ED Attendances with LOE <=4 hours | M
EA3 | Average overdue wait time of elective surgery cases waiting beyond the clinically recommended time, by urgency category | M
EA4 | Elective surgery patients treated within boundary times | M
EA5 | Percentage of selected elective cancer surgery cases treated within boundary time | Q
EA7 | Percentage of ED Mental Health patients admitted within 8 hrs | M
EA8 | Theatre activity | M
EA10 | Access Block | M
EA11 | Admissions from ED | M
EA12 | Percentage of SJAA patients with Off Stretcher time within 20 minutes | M
EAP1 | Rate of selected potentially preventable chronic condition hospitalisations (for specified chronic conditions) | A
EAP2 | Adult immunisation: percentage of people aged 65 years and over immunised against Influenza | A
EAP3 | Obesity: percentage of population who are overweight or obese: a) Adults b) Children | A
EAP4 | Tobacco: percentage of adults who are current smokers | A

M: Monthly; Q: Quarterly; T: Tri-annually; 6: Six-monthly; A: Annually


Page 173: PwC Review

Reporting frequencies in the PMR (2 of 4)
The table below shows the reporting frequencies within the PMR.

Domain code | Performance measure | Freq
EQ1 | Age-adjusted rate (AAR) of avoidable deaths | A
EQ3 | Staphylococcus aureus bacteraemia infections per 10,000 patient days | A
EQ5 | Hospital standardised mortality ratio | A
EQ7 | Death in low-mortality DRGs | A
EQ8 | In hospital mortality rates (for acute myocardial infarction, stroke, fractured neck of femur & pneumonia) | A
EQ10 | Rate of total hospital readmissions within 28 days to an acute designated mental health inpatient unit | Q
EQ2 | Percentage of Emergency Department Attendances which are unplanned re-attendances in less than or equal to 48 hours of previous attendance | M
EQ4 | Rate of Severity Assessment Code (SAC) 1 clinical incident investigation reports received by Patient Safety Surveillance Unit within 45 working days of the event notification date | Q
EQ6 | Hospital accreditation | A
EQ9 | Unplanned hospital readmissions of patients discharged following management of (knee replacement, hip replacement, tonsillectomy & adenoidectomy, hysterectomy, prostatectomy, cataract surgery and appendicectomy) | Q
EQ12 | Rate of community follow up within first 7 days of discharge from psychiatric admission | Q
EQ13 | Measures of patient experience (including satisfaction) with hospital services | A
EQ14 | Hand Hygiene Compliance | T

M: Monthly; Q: Quarterly; T: Tri-annually; 6: Six-monthly; A: Annually

In the PMR, 51% of data is sourced monthly.

Page 174: PwC Review

Reporting frequencies in the PMR (3 of 4)
The table below shows the reporting frequencies within the PMR.

Domain code | Performance measure | Freq
EI1 | Volume of weighted activity year-to-date | M
EI3 | Average cost per test panel for PathWest | M
EI6 | YTD distance of net cost of service to budget | M
EI8 | Ratio of actual cost of specified public hospital services compared with the ‘state efficient price’ | A
EI2 | Elective surgery day of surgery admission rates | M
EI4 | YTD Distance of Expenditure to Budget | M
EI5 | YTD Distance of Own Sourced Revenue to Budget | M
EI7 | School Dental Service ratio of examinations to enrolments | A
EI9 | Number of separations (unweighted) | M
EI10 | Coded acute multiday average length of stay | Q
EI11 | YTD Distance of Salaries Expenditure to Budget | M
EQA1 | Standardised Mortality Ratio (SMR) of deaths among Aboriginal children (0-4 years) and non-Aboriginal children (0-4 years) | A
EQA4 | Proportion of eligible population receiving dental services from subsidised dental programs by group | Q
EQA2 | Standardised Rate Ratio of Hospitalisations of: a) Aboriginal People compared to non-Aboriginal People b) Aboriginal children (0-4 years) compared to non-Aboriginal children (0-4 years) | A
EQA3 | Childhood immunisation: percentage of children fully immunised at 12-15 months: a) Aboriginal b) Total | Q
EQA5 | WA Health Aboriginal employment headcount | M

M: Monthly; Q: Quarterly; T: Tri-annually; 6: Six-monthly; A: Annually

In the PMR, 51% of data is sourced monthly.

Page 175: PwC Review

Reporting frequencies in the PMR (4 of 4)
The table below shows the reporting frequencies within the PMR.

Domain code | Performance measure | Freq
SW3 | Staff turnover | M
SW1 | Proportion of medical graduates (and other categories of medical staff) to total medical staff: a) Interns (graduate) b) Resident Medical Officers c) Registrars d) Consultants e) Other | Q
SW2 | Proportion of nursing graduates (and other categories of nursing staff) to total nursing staff: a) Graduate b) Junior c) Experienced d) Senior e) SRN and above f) Other | Q
SW4 | Injury management: a) Lost time injury severity rate b) Percentage of managers and supervisors trained in occupational safety and health (OSH) and injury management responsibilities | 6
SW5 | Leave Liability | M
SW6 | Actual and Budget FTE | M
PC2 | Percentage of cases coded and available for reporting within | M
PC1 | Percentage of cases coded by end of month closing date | M
PC3 | DRG Accuracy in Clinical Information Audit Program | Q
PF2 | Manually corrected payroll errors (underpayments) | M
PF3 | Availability of Information Communication Technology (ICT) services: percentage of Service calls | M
PF1 | Patient fee debtors | M
PF4 | NurseWest shifts filled | M
PF5 | Accounts payable - payment within terms | M

M: Monthly; Q: Quarterly; T: Tri-annually; 6: Six-monthly; A: Annually

In the PMR, 51% of data is sourced monthly.
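As a rough cross-check of the 51% figure, the sketch below assumes it is simply the share of PMR PIs with a monthly reporting frequency; the counts are tallied from the four PMR frequency tables above (29 monthly PIs out of the 57 listed), and the report's own derivation may differ.

```python
# Illustrative cross-check only: frequency counts tallied from the four PMR tables above.
# Assumes "51% of data is sourced monthly" means the share of PIs reported monthly.
from collections import Counter

freq_counts = Counter({"M": 29, "Q": 11, "A": 15, "T": 1, "6": 1})

total_pis = sum(freq_counts.values())              # 57 PIs listed in the PMR tables
monthly_share = freq_counts["M"] / total_pis       # 29 / 57
print(f"{monthly_share:.0%} of PMR PIs are reported monthly")  # ~51%
```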

Page 176: PwC Review

Reporting frequencies in the HSPR
The table below shows the reporting frequencies within the HSPR.

In the HSPR, 52% of data is sourced monthly.

Domain code | Performance measure | Freq
EA1 | Proportion of emergency department patients not seen within recommended times | M
EA2 | NEAT % of ED Attendances with LOE <=4 hours | M
EA3 | Average overdue wait time of elective surgery cases waiting beyond the clinically recommended time, by urgency category | M
EA4 | Elective surgery patients treated within boundary times | M
EA5 | Percentage of selected elective cancer surgery cases treated within boundary time | Q
EQ3 | Staphylococcus aureus bacteraemia infections per 10,000 patient days | A
EQ7 | Death in low-mortality DRGs | A
EQ10 | Rate of total hospital readmissions within 28 days to an acute designated mental health inpatient unit | Q
EQ14 | Hand Hygiene Compliance | T
EI1 | Volume of weighted activity year-to-date | M
EI8 | Ratio of actual cost of specified public hospital services compared with the ‘state efficient price’ | A
EI4 | YTD Distance of Expenditure to Budget | M
EI5 | YTD Distance of Own Sourced Revenue to Budget | M
N/A | YTD controlled own source revenue to budget | M

M: Monthly; Q: Quarterly; T: Tri-annually; 6: Six-monthly; A: Annually

Page 177: PwC Review

Stakeholder and interview attendee list

Appendix C

Page 178: PwC Review

Workshop participants
The table below lists the participants for each of the three workshops.

Area | Invitees | Position title | Attended

Workshop 1
Department of Health | Andrew Joseph | A/Group Director, Resources | Y
Department of Health | Brendon McMullen | – | Y
Department of Health | Colin D'Cunha | Manager, Workforce Modelling & Data | Y
Department of Health | Gerard Montague | A/Director, Health System Economic Modelling | Y
Department of Health | Karen Lennon | Assistant Director, Business Intelligence & Information Strategy | Y
Department of Health | Karen Lopez | A/Director, Performance | Y
Department of Health | Lexie Morton | – | Y
Department of Health | Olly Campbell | A/Executive Director, Patient Safety and Clinical Quality | Y
Department of Health | Peter May | Director, Financial Reform | Y
Department of Health | Robert Kleinfelder | Assistant Director, Performance Reporting | Y
Department of Health | Tim Reid | A/Group Director, Performance | Y
Department of Health | Tony Satti | A/Director, Data Integrity | Y

Workshop 2
CAHS | Dayle Bryant | A/Executive Director, Finance & Business, CAHS | N
CAHS | Debbie Bryan | Executive Director, Governance & Performance, CAHS | Y
NMHS | Alain St Flour | Executive Director, Finance, NMHS | Y
NMHS | Sandra Miller | Executive Director, Safety Quality & Finance, NMHS | N
SMHS | Adam Lloyd | Director, Business Intelligence, SMHS | Y
SMHS | Aresh Anwar | A/Executive Director, Royal Perth Group, SMHS | N
SMHS | Colin Holland | Group General Manager, Finance & Performance, SMHS | N
WACHS | Jordan Kelly | A/Executive Director, Corporate Services, WACHS | Y
WACHS | Tim O’Brien | A/Director, Business & Performance Analysis, WACHS | Y

Workshop 3
Department of Health | Andrew Joseph | A/Group Director, Resources, Resourcing & Performance | Y
Department of Health | Graeme Jones | Group Director, Finance, Resourcing & Performance | N
Department of Health | Tim Reid | A/Group Director, Performance | Y
CAHS | Debbie Bryan | Executive Director, Governance & Performance, CAHS | Y
NMHS | Alain St Flour | Executive Director, Finance, NMHS | Y
SMHS | Colin Holland | Group General Manager, Finance & Performance, SMHS | N (Diana Carlson proxy)
WACHS | Jordan Kelly | A/Executive Director, Corporate Services, WACHS | N


Page 179: PwC Review

Interview participants
The table below lists the participants for the interviews that were conducted.

Date | Attendees | Position title
3-Mar | Frank Daly | A/Chief Executive, South Metropolitan Health Service
10-Mar | Graeme Jones | Group Director, Finance
11-Mar | David Russell-Weisz | Chief Executive, Fiona Stanley Hospital
12-Mar | Shane Kelly | Chief Executive, North Metropolitan Health Service
12-Mar | Alain St Flour | Executive Director, Finance, North Metropolitan Health Service
12-Mar | Sandra Miller | Executive Director, Safety Quality & Performance, North Metropolitan Health Service
16-Mar | Rebecca Brown | Deputy Director General
17-Mar | Jeff Moffet | Chief Executive Officer, WA Country Health Service
17-Mar | Jordan Kelly | A/Executive Director, Corporate Services, WA Country Health Service
23-Mar | Angela Kelly | A/Executive Director, Resourcing and Performance Division
2-Apr | Philip Aylward | Chief Executive, Child and Adolescent Health Service


Page 180: PwC Review

Stakeholder interview and workshop scoring

Appendix D

Page 181: PwC Review

Workshop responses
The table below lists the workshop participants' responses for each of the questions. For each question, the figures show the number of responses recorded on a 1–5 scale in Workshop 1 and Workshop 2; the final figure in each group is the total number of responses.

Performance indicators | Workshop 1 scoring | Workshop 2 scoring
The suite of PIs within the PMR clearly align to the strategic objectives of WA Health | 3 3 2 8 | 2 3 5
The PIs within the PMR are comprehensive, and do not leave any significant area of performance unmeasured | 1 1 5 1 8 | 2 3 5
The suite of PIs within the HSPR clearly align to the strategic objectives of WA Health | 1 4 2 1 8 | 1 2 2 5
The PIs within the HSPR are comprehensive, and do not leave any significant area of performance unmeasured | 3 1 1 2 7 | 2 2 4
The targets and thresholds align to realistic and sufficiently challenging expectations of Health Service performance | 3 2 3 1 9 | 3 1 1 5
Data sources from which PIs are calculated are clear and accessible to Health Services, and capable of drill down for further analysis | 1 1 4 2 8 | 4 1 5
Data used is accurate and comparable across organisations | 1 1 6 8 | 3 1 1 5
Data used is as timely as it needs to be | 3 1 3 1 8 | 3 2 5

Reporting layout and content | Workshop 1 scoring | Workshop 2 scoring
The PMR report contains sufficient content and detail to present a full and balanced picture of performance | 1 1 4 2 8 | 1 2 2 5
The structure and layout of the PMR report is clear and easy to understand | 2 4 2 8 | 2 2 1 5
The HSPR report contains sufficient content and detail to present a full and balanced picture of performance | 1 2 2 3 8 | 2 3 5
The structure and layout of the HSPR report is clear and easy to understand | 1 4 2 1 8 | 1 2 2 5
There is sufficient value in producing both reports in their current format | 2 3 3 8 | 4 1 5

Performance monitoring and evaluation | Workshop 1 scoring | Workshop 2 scoring
The roles and responsibilities for staff involved in the Health Service PME process are clearly defined | 2 2 5 1 10 | 2 3 5
The process by which Health Services performance is monitored is clear | 2 3 3 2 10 | 2 2 1 5
The performance score is calculated in a transparent and understandable manner | 1 2 1 5 1 10 | 3 2 5
Having an overall performance score is an appropriate measure of Health Service performance | 1 2 7 10 | 3 2 5


Page 182: PwC Review

Stakeholder interview and workshop feedback

Appendix E

Page 183: PwC Review

DoH feedback - Performance indicators (PMR)

What works well:
1. Each PMR indicator is put through a rigorous and consultative process to define targets
2. National standards are used in the majority of cases to set the targets
3. A reasonably comprehensive set of indicators is included

Key issues / opportunities and root causes:
1. The PMR does not include sufficient indicators around finance processes as opposed to outcomes
2. The indicators are more heavily weighted towards measuring finance rather than patient care
(The above comments were from two different team discussions and highlight the difference in opinion)
3. Not every DoH member is made aware of the existence of the PMR, with many lacking sufficient knowledge of the PMR PIs to provide an informed opinion
4. PMR indicators are marked as confidential, but are then published
5. There is confusion over how the PIs are calculated and how this influences the performance scoring
6. There are numerous indicators tracked within the PMR that are also measured within other reports (duplicating work)
7. Insufficient number of PIs measuring patient experience
8. Insufficient number of PIs measuring workforce metrics
9. Some of the indicators and / or targets are different between the HSPR and PMR

1. There is some perception that there has been a lack of consultation with Finance and Workforce departments in setting the indicators

Page 184: PwC Review

DoH feedback - Performance indicators (HSPR)

What works well:
1. DoH team members have greater awareness of the HSPR indicators than the PMR indicators

Key issues / opportunities and root causes:
1. Numerous changes to the PIs measured, making comparisons over the long term / trend analysis difficult
2. The hand hygiene indicator is manually collected and based on very small sample numbers
3. Some lack of clarity on how the targets have been set for certain indicators
4. Some of the target thresholds may be unrealistic
5. Insufficient number of indicators on finance processes
6. Grouping of the three ED category indicators makes it difficult to ascertain how each category is performing
7. Missing some key indicators relating to Safety & Quality (eg patient experience, unplanned return to theatres, ICU access and investigations into adverse events)
8. The Death in low-mortality DRGs indicator is sourced externally, has a significant time lag, and is difficult to drill down into
9. With many of the PIs measured less than monthly, concerns were raised over the appropriateness of including these PIs
10. The benefit of having a two-way activity threshold target was questioned, and addressing variances can conflict with other targets (eg reducing activity can adversely affect NEST)
11. There is a lack of key lead indicators (rather than lag) in the HSPR

1. There was limited consultation on determining the 14 PIs to be included
2. There has been no formal process to align the PIs to the four strategic pillars of WA Health
3. The process for reviewing and amending these PIs is unclear
4. No consultation was held to set targets
5. There is no standardised approach to defining HSPR targets

Page 185: PwC Review

DoH feedback - PMR content and layout

What works well:
1. The PMR contains detailed information around individual facility performance

Key issues / opportunities and root causes:
1. Timing to complete the reports is tight
2. Whilst data for the indicators within the PMR is prepared monthly (for the Governing Councils report), the PMR is only published once a quarter, due to complex publication requirements and the resources being utilised in producing the HSPR
3. There is a lack of understanding among some stakeholders in DoH and Health Services around the existence and use of the report

1. There is no drill down available in the summary report, due to the report being published in PDF
2. Both reports are hard to read and understand in black and white
3. Limited commentary explaining the reason for a performance rating (good or bad)
4. The layout is confusing, with the structure difficult to follow
5. Limited cohesion between the two reports (HSPR and PMR): both present different information, which makes it difficult to know which one to go to for the required data

Page 186: PwC Review

DoH feedback - HSPR content and layout

What works well:
1. Participants who had read the commentary felt it provided a good level of detail, with many using this to source information
2. Whilst it has issues, many acknowledged the HSPR was a good tool when compared to what was available previously

Key issues / opportunities and root causes:
1. DoH writes the commentary for the HSPR, which may be inappropriate. However, DoH has previously tried to direct Health Services to write their own commentary, but this was unsuccessful (due principally to commentary not being updated frequently)

1. Commentary which accompanies the reports does not appear to be frequently read
2. Commentary produced is very fact based and therefore may be of limited value
3. Some views that the ranking of Health Services may be inappropriate
4. Concerns were raised as to why the commentary and main report were separate
5. Limited circulation of the HSPR is an issue – only the Chief Executives and FPPG receive it officially
6. Mixed views on the value of producing both the HSPR and PMR
7. Lack of trend analysis in the HSPR

Page 187: PwC Review

DoH feedback - PME

What works well:
1. Senior executives now regularly review and discuss the HSPR report

Key issues / opportunities and root causes:
1. The introduction of the HSPR has, to some extent, shifted focus away from the PMR

1. There is limited understanding of how the performance score is calculated
2. Giving equal ratings to each of the performance indicators in calculating the overall performance score may not always be appropriate
3. Some lack of clarity on how the performance score is / should be used
4. Since the introduction of the HSPR, website hits have shown that the PMR is not used as widely as a tool to report on performance, undermining its value
5. Time spent discussing poor performing Health Services within the meetings is limited, meaning there is inadequate time to drill down into reasons
6. Performance is discussed at Health Service level, not at facility level
7. Mixed views on the appropriateness of ranking and performance scoring the Health Services

Page 188: PwC Review

Health Services - PMR Performance indicators (1 of 2)

What works well:
• The PMR gives a comprehensive indication of performance
• No significant data accuracy issues with the reporting process, as data extraction is automatic. Accuracy is more of an issue at the operational, data input stage.

Key issues / opportunities and root causes:
• Currently too many service and quality indicators are in fact access type metrics
• The PIs do not clearly align to each organisation's objectives
• The PIs do not recognise each individual organisation's potential targets and special circumstances
• The performance indicators are focused on hospital performance and lack attention to: country health, allied health, mental health and community health
• WACHS is having to produce its own reports, as the PMR is not reflective of its performance needs
• It is unclear what the purpose / objectives of the different reports are
• There is limited alignment to the WA Health four strategic pillars and the Annual Report
• Most indicators are lag performance indicators, rather than lead performance indicators
• Some HSPR and PMR indicators are calculated differently, leading to confusion and a lack of confidence in data
• There is some perception that indicators were developed with only adult patients in mind, not child patients
• There is some perception that there is limited communication when PI targets and definitions change
• People lose interest in indicators that are updated less frequently than monthly because the data is considered irrelevant

Page 189: PwC Review

Health Services - PMR Performance indicators (2 of 2)

Key issues / opportunities and root causes:
• There are gaps in the suite of PIs measured, including:
− Safety & Quality: key indicators missing, such as incident investigation, patient experience, complication rates and ambulatory care
− Productivity: key indicators missing, such as NHPPD, LOS, etc.
− Finance: need to align to governmental reporting (e.g. expenditure / revenue to budget, net cost of services, staffing costs, capital, etc.)
− Workforce: no indicators at all
− Non-hospital metrics: very few included (e.g. Mental Health, Ambulatory Care, Community, Prevention, Diagnostic, etc.)
• Health Services find it difficult to drill into the data driving the results and identify the causes
• Health Services do not receive the backing data
• Some targets are unrealistic (e.g. SABSI - half the national target, and zero rate targets)
• The calculation methodology of some indicators (such as the unit cost to price) is confusing

Page 190: PwC Review

Health Services - HSPR - Performance indicators

What works well:
• HSPR indicators offer a broad reflection of performance

Key issues / opportunities and root causes:
• The HSPR does not include some key workforce indicators and some efficiency measures that are included in the PMR
• Timeliness issues with the data, both in relation to the delay in some of the information being reported (some over 6 months old), and the fact that some of the data is reported only quarterly / annually
• Most indicators are lag performance indicators, rather than lead performance indicators
• Some targets are unrealistic (e.g. SABSI - half the national target, and zero rate targets)
• Some of the targets change between months with no notice
• The gaps identified within the PMR PI suite also apply to the HSPR
• There is some lack of understanding of how the metrics are calculated and analysed
• There is poor communication when PI targets change
• There is poor communication when PI definitions change
• People lose interest in indicators that are updated less frequently than monthly because the data is considered irrelevant

Page 191: PwC Review

Health Services - PMR content and layout

What works well:
• Online access allows for drill down functionality
• The PMR is accessible to anyone and allows any person to drill down to hospital level
• The trend arrows are useful in highlighting the direction of performance

Key issues / opportunities and root causes:
• The PMR is only published quarterly
• The presentation of the scorecard in the PMR is visually complex
• The PMR does not align to the organisational structure
• The PMR is used at Governing Councils, but otherwise is not frequently used (only CAHS uses the PMR at Governing Councils)
• The HSPR and PMR do not include information tracking clinical risk
• The functionality of the PMR is needed as it provides the drill down, and the HSPR’s summary structure is easier to review. However, the general feeling was that there is insufficient value in producing both reports, and perhaps they could be combined in some way. The HSPR and PMR also duplicate other reports (e.g. Whole of Health Dashboard, Whole of Health Summary on Financial Performance, WA Health Operational Plan Reporting, A/DG's Performance Assessment Reporting)

Page 192: PwC Review

Health Services - HSPR content and layout

What works well:
• The layout (visually) is good, and much better than the PMR
• The site based comparison sheets are also very useful

Key issues / opportunities and root causes:
• Current performance rating scoring methods are not clearly understood by all
• No trend analysis is shown within the HSPR
• Lack of “target, actual, variance” display
• There is too much focus on Health Services’ comparison in the HSPR and not enough on individual facility performance
• Current commentary is very fact based, and does not explain the root causes behind the performance
• Having the commentary published in a separate report to the main report makes it difficult to review in conjunction with the data
• Commentary is produced by the DoH, rather than by those who are responsible for the performance
• The ABF Dashboard and Governing Council report duplicate a number of performance indicators

Page 193: PwC Review

Health Services – PME

Key issues / opportunities and root causes:
• The process for reviewing performance changed with the introduction of the HSPR, and has not yet been embedded
• There is limited understanding of how the performance scoring is calculated
• There is limited understanding of how the performance rating is calculated
• Most feedback received (although not all) was that having a performance score is inappropriate, as it does not provide a sufficient view of Health Service performance
• Different views on whether performance scores are an appropriate basis for intervention
• Having a performance score meant that there was a shift in focus to improving the performance score, rather than improving overall performance
• Lack of clarity around who the report is / should be sent to
• Limited structure in current performance meetings, and limited time to discuss the information
• Time to review the information is very tight

Page 194: PwC Review

Assessment criteria

Appendix F

Page 195: PwC Review

Assessment criteria

Performance indicators

Page 196: PwC Review

Assessment criteria – PIs (1 of 4)
The review focussed on assessing against the below criteria.

Sub-section 1 (Aligned): Performance indicators align to the purpose of the organisation.

Leading Practice Principle: Performance indicators (PIs) measure the achievement of the organisation's strategic objectives.
Desktop review findings: There is limited alignment in both the HSPR and PMR to the strategic objectives outlined within the Annual Report, the PMF, or the WA Health Four Pillars (defined within the strategic intent).
Stakeholder comments: Stakeholders identified that a root cause of some of the issues outlined in workshops was that there has been no formal process to align the PIs to the four strategic pillars of WA Health.
Ref pages: 40 – 41 | Key issue: 1 | PMR / HSPR meets criteria: Partially meets criteria

Leading Practice Principle: Targets and thresholds are robust and align with national standards. Considerations include past performance, the performance of other similar authorities, and the resources available.
Desktop review findings: Whilst the majority of the PMR indicators do align with national standards and peer comparisons, only a third of the HSPR indicators demonstrate alignment to national standards. In addition, the rationale for some of the targets which do not align to national standards is unclear.
Stakeholder comments: There was a wide view amongst the stakeholders that they understood how the PMR PIs were set, but there was limited transparency within the HSPR PIs. Further comments were made as to the process behind changing the targets and the difficulty with these being changed during the reporting period (one year).
Ref pages: 42 – 44 | Key issue: 2 | PMR / HSPR meets criteria: Partially meets criteria

Page 197: PwC Review

Assessment criteria – PIs (2 of 4)
The review focussed on assessing against the below criteria.

Sub-section 2 (Relevant, cont.): Information is relevant to the target audience and sufficiently detailed to inform decision-making.

Leading Practice Principle: PIs measured are actionable, with the results easily reproducible by the Health Services.
Desktop review findings: Most of the indicators within the HSPR and PMR relate to areas of performance which the Health Services can control, in part or fully. However, the PI results are not always easily reproducible by the Health Services, as for certain PIs they are not easily able to access source data.
Stakeholder comments: There is confusion over how the PIs are calculated and how they influence performance scoring. There are numerous indicators tracked within the PMR that are also measured within other reports (duplicating work for Health Services).
Ref pages: 51 | Key issue: 6 | PMR / HSPR meets criteria: Partially meets criteria

Leading Practice Principle: The PIs, and how they are calculated, are sufficiently clear and easy to understand.
Desktop review findings: The calculations used are sufficiently clear and easy to understand. The method of calculation and units of measure used are all comparable to calculation methods used in other jurisdictions, and should be easily replicated by facilities.
Stakeholder comments: Most of the calculations of the indicators are easy to understand – the main exceptions being the unit cost to price and the composite scoring for some of the HSPR indicators (e.g. ED access). For the latter, the lack of access in this report to the sub-indicators which make up the composite score makes analysis more difficult.
Ref pages: 42 – 44 | Key issue: 2 | PMR / HSPR meets criteria: Meets criteria

Leading Practice Principle: The data sources from which PIs are calculated are clear and accessible to Health Services, and capable of drill down for further analysis.
Desktop review findings: Discussions with the Health Services highlighted data availability as a real issue. When the initial findings within the reports are circulated, Health Services are then required to undertake their own analysis to understand / replicate the rating in order to then justify it. At present, Health Services are unable to access data provided by external sources which is used at DoH to measure performance, having to seek a number of executive approvals for access. This was suggested to take too long, and as such, they cannot replicate performance ratings as they do not have access to the full suite of data on their own performance.
Stakeholder comments: There is no drill down available in the PMR summary report, due to the report being published in PDF. There is no ability to drill down into externally provided data, due to access restrictions.
Ref pages: 51 | Key issue: 6 | PMR / HSPR meets criteria: Does not meet criteria

Page 198: PwC Review

Assessment criteria – PIs (3 of 4)
The review focussed on assessing against the below criteria.

Sub-section 3 (Timely): Information is readily available and accessible, and does not delay decision-making.

Leading Practice Principle: Results against the PIs can be collected and reported in a timely manner.
Desktop review findings: With nearly half of the PIs included within the HSPR measured less than monthly, including these indicators does not add value to the HSPR, as the performance ratings cannot change until updated data is available. As such, this unnecessarily increases the administrative burden of producing the reports. Whilst less of an issue, a significant proportion of PIs within the PMR are also updated infrequently. Including these PIs not only undermines the scoring but, due to the lag in updating these measures, they are potentially reported too late to action.
Stakeholder comments: There are timeliness issues with the data, both in relation to the delay in some of the information being reported (some over 6 months old), and the fact that some of the data is reported only quarterly / annually.
Ref pages: 49 – 50 | Key issue: 5 | PMR / HSPR meets criteria: Does not meet criteria

Leading Practice Principle: PIs measure recent performance with an appropriate level of frequency.
Desktop review findings: The monthly release of the HSPR and completion of the PMR (although the PMR is not published monthly) is deemed appropriate, as performance issues need to be flagged as soon as possible to ensure risks are mitigated and resolved. However, many of the PIs are only calculated quarterly or annually, which can reduce the indicators' successful use as a driver of performance. Additionally, 45% of the indicators within the HSPR are updated less than monthly; almost half of the report cannot act as a driver of performance for parts of the year.
Stakeholder comments: With many of the PIs measured less than monthly, concerns were raised over the appropriateness of including these PIs. People lose interest in the indicators that are updated less than monthly because the data is considered irrelevant.
Ref pages: 49 – 50 | Key issue: 5 | PMR / HSPR meets criteria: Does not meet criteria

Sub-section 4 (Accurate): Information is accurate and consistent to be a trustworthy indicator of performance.

Leading Practice Principle: The PIs' statistical calculation methodologies are robust and not open to misinterpretation / miscalculation.
Desktop review findings: The methodology used to calculate the performance rating and performance scoring is simple and easy to understand. However, a number of the indicator targets and calculation methodologies are slightly different between the HSPR and the PMR.
Stakeholder comments: Some HSPR and PMR indicators are calculated differently, leading to confusion and a lack of confidence in data. It then becomes unclear what the purpose / objectives of the different reports are.
Ref pages: 42 – 44 | Key issue: 2 | PMR / HSPR meets criteria: Partially meets criteria

Page 199: PwC Review

Assessment criteria – PIs (4 of 4)
The review focussed on assessing against the below criteria.

Sub-section 4 (Accurate, cont.): Information is accurate and consistent to be a trustworthy indicator of performance.

Leading Practice Principle: PIs, and how they are measured, are comparable across organisations.
Desktop review findings: Units of measure used for the suite of PIs are the same for each Health Service.
PMR / HSPR meets criteria: Meets criteria

Complete and comprehensive: Information provides a complete and comprehensive representation of performance.

Leading Practice Principle: The suite of PIs covers all necessary measures to assess that area of performance.
Desktop review findings: In comparison to other jurisdictions, the HSPR lacks a number of key indicators of performance in the following areas: Safety & Quality; patient experience; workforce; and proactive ‘lead as opposed to lag’ indicators. The PMR has a number of essential PIs, however there are some which could be more appropriately reported elsewhere, to focus attention between Health Services and DoH on the key issues.
Stakeholder comments: There are many gaps in the suite of PIs. In particular, the workforce area lacks any indicators. In addition, too many service and quality indicators are in fact access indicators.
Ref pages: 45 – 47, 48 | Key issues: 3 and 4 | PMR / HSPR meets criteria: Does not meet criteria

Leading Practice Principle: The number of PIs is not overly excessive.
Desktop review findings: The number of PIs included in the WA framework is far in excess of other jurisdictions (54 PIs are employed in the WA framework, with a maximum of 38 in all other jurisdictions). Although the number of PIs is potentially excessive, additional metrics may need to be created, or existing metrics used, in aligning strategic objectives to the PIs.
Stakeholder comments: There are numerous indicators tracked within the PMR that are also measured in other reports, creating excessive duplication of PIs.
Ref pages: 48 | Key issue: 4 | PMR / HSPR meets criteria: Partially meets criteria

Page 200: PwC Review

Assessment criteria

Performance scoring

Page 201: PwC Review

Assessment criteria – performance scoring
The review focussed on assessing against the below criteria.

Sub-section 1 (Scoring and performance rating): Is the method of scoring performance appropriate and understandable?

Leading Practice Principle: The performance score uses statistical validity and an appropriate weighting of performance for all contributing PIs.
Desktop review findings: Priority objectives may not be accurately reflected because all PIs in the PMR are equally weighted to calculate the performance score. Most other jurisdictions incorporate weightings into their performance score to ensure priority indicators remain a focus; for example, priority indicators award additional points for achieving performance. Applying weightings to PIs supports the link between performance measures and strategic objectives, and provides an incentive for service providers to apply concerted effort in line with strategic direction.
Stakeholder comments: Giving equal ratings to each of the performance indicators in calculating the overall performance score may not always be appropriate.
Ref pages: 76 | Key issue: 5 | PMR / HSPR meets criteria: Partially meets criteria
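To make the weighting point concrete, the minimal sketch below contrasts an equally weighted average of indicator ratings with a priority-weighted version. The indicator names, rating points and weights are hypothetical assumptions for illustration; they are not the scoring methodology actually used in the PMR or HSPR.

```python
# Illustrative sketch only: equal weighting versus priority weighting.
# Rating points and weights are hypothetical, not WA Health's actual methodology.

ratings = {              # rating points per PI (0 = underperforming, 2 = on target)
    "ED access": 0,      # a priority indicator performing poorly
    "Elective surgery": 2,
    "Finance to budget": 2,
    "Hand hygiene": 2,
}

equal_weights = {pi: 1.0 for pi in ratings}
priority_weights = {"ED access": 3.0, "Elective surgery": 2.0,
                    "Finance to budget": 1.0, "Hand hygiene": 1.0}

def weighted_score(ratings, weights):
    """Weighted average of rating points, scaled to a percentage of the maximum (2 points per PI)."""
    total_weight = sum(weights.values())
    achieved = sum(ratings[pi] * weights[pi] for pi in ratings)
    return 100 * achieved / (2 * total_weight)

print(weighted_score(ratings, equal_weights))     # 75.0 - the poor priority indicator is diluted
print(weighted_score(ratings, priority_weights))  # ~57.1 - the priority indicator pulls the score down
```

Under equal weighting the failing priority indicator barely moves the overall score; the weighted version keeps attention on it, which is the behaviour the leading practice principle is describing.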

Leading Practice Principle: The performance score is calculated in a transparent and understandable manner.
Desktop review findings: The performance score calculation methodology is simple and easy to replicate. However, explanations of this are only contained in the Performance Score Methodology document. It is not clear whether the performance score methodology was developed with key stakeholders, or how widely and regularly it is communicated. Clear communication of the performance ratings and overall performance score calculation will facilitate better stakeholder understanding. Having two types of performance rating in the HSPR risks causing confusion for users.
Stakeholder comments: There is limited understanding, among Health Services staff in particular, of how the performance scoring and performance ratings are calculated.
Ref pages: 72, 73 | Key issues: 2 and 3 | PMR / HSPR meets criteria: Meets criteria

Leading Practice Principle: The performance score is an appropriate measure of overall Health Service performance.
Desktop review findings: Having an overall performance score can mask indicators / facilities which are having significant performance issues, as the focus is drawn to the overall score trend. Performance scores can also adversely influence behaviours by focusing management corrective action on indicators / facilities which are close to the boundary of a higher rating (where a smaller improvement can have a large impact on the score), rather than on areas of performance which are most in need of improvement.
Stakeholder comments: There are different views on whether performance scores are an appropriate basis for intervention. Significant feedback that having a performance score means there is a shift in focus to improving the performance score, rather than improving overall performance.
Ref pages: 71, 74 – 75 | Key issues: 1 and 4 | PMR / HSPR meets criteria: Does not meet criteria
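As a further illustration of the masking risk described above, the toy example below shows two facilities arriving at the same overall score from very different rating profiles; the figures are invented for illustration only.

```python
# Toy example: the same average score can hide very different performance profiles.
# Rating points per indicator (0-2) are invented for illustration.

facility_a = [2, 2, 2, 0]   # three indicators on target, one failing badly
facility_b = [1, 2, 1, 2]   # all indicators middling to good, none failing

score_a = sum(facility_a) / len(facility_a)   # 1.5
score_b = sum(facility_b) / len(facility_b)   # 1.5

# Both facilities report the same overall score (1.5 of a possible 2),
# even though facility A has a significant issue on one indicator.
print(score_a == score_b)  # True
```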


Page 202: PwC Review

Assessment criteria

Reporting structure and content

Page 203: PwC Review

Assessment criteria – reporting (1 of 2)
The review focussed on assessing against the below criteria.

Sub-section 1 (Structure): The structure of the report is logical, and easy to understand and analyse.

Leading Practice Principle: Reports are structured in a logical way that facilitates clear understanding of the data presented.
Desktop review findings (PMR): Both the monthly summary (now released quarterly) and the full data backup of the PMR provide a lot of detail. The PMR backup detailed reports are structured within each Health Service and provide all collated data on a per facility basis, citing trends and movements from previous months. However, there is a significant lack of clear signposting throughout.
Stakeholder comments (PMR): The presentation of the scorecard in the PMR is visually complex; however, the trend arrows are useful in highlighting the direction of performance. Additionally, there is insufficient value in producing both reports.
Desktop review findings (HSPR): The front dashboard is simple and impactful, and the colour coding draws the user (to an extent) towards key areas of good and poor performance. There are some issues in relation to a lack of clear labelling and signposting – in particular, having the commentary separate to the main report makes the full picture difficult to review.
Stakeholder comments (HSPR): The HSPR is a good tool and an improvement on what was there before. There is some confusion in relation to the timeframe and the thresholds that the indicator results relate to.
Ref pages: 96 – 100, 101 – 106 | Key issues: 1 and 2 | PMR / HSPR meets criteria: Partially meets criteria

Leading Practice Principle: Reports appropriately highlight particular areas of strong / weak performance.
Desktop review findings: The HSPR colour coding does, to an extent, draw users towards key areas of strong / weak performance. However, although targets are shown, the thresholds for each of the performance ratings are not, sometimes making it difficult to assess how strong / weak performance is. The facility performance breakdown sheets do help to highlight the specific strong / weak performance of different facilities. The PMR dashboard does highlight strong / poor performance, but some of the detailed sheets do not guide the user clearly to specific performance issues.
Stakeholder comments: The PMR contains detailed information around individual facility performance. However, no trend analysis is shown within the HSPR, and there is a lack of “target, actual, variance” display.
Ref pages: 96 – 100, 101 – 106, 108 | Key issues: 1, 2 and 4 | PMR / HSPR meets criteria: Partially meets criteria

Sub-section 2 (Content):

Leading Practice Principle: The reports contain trends and forecasts of performance.
Desktop review findings: Some trend data is shown within the PMR, through the scorecard and the detailed sections. There are no trends or forecasts shown within the HSPR; whilst some trend commentary is provided, no visual aids citing performance movements are shown.
Stakeholder comments: The PMR trend arrows are useful in highlighting the direction of performance. However, the lack of trend analysis in the HSPR does not provide a clear idea of the direction of performance.
Ref pages: 96 – 100, 101 – 106 | Key issues: 1 and 2 | PMR / HSPR meets criteria: Partially meets criteria

Page 204: PwC Review

Assessment criteria – reporting (2 of 2)
The review focussed on assessing against the below criteria.

Sub-section 2 (Content): Report content provides a complete and comprehensive representation of performance.

Leading Practice Principle: Sufficient and meaningful commentary accompanies the reports to present a full and balanced picture of performance.
Desktop review findings (PMR): Commentary provided within the PMR is summarised to a page of stand-alone commentary at the front, which is difficult to analyse in conjunction with the results themselves. Commentary often refers to specific PIs only by their code (i.e. EA3), requiring users to look up the title in a separate definitions document.
Desktop review findings (HSPR): The current content of the commentary does not explain the reasons behind performance levels and trends. This makes it difficult for a reader to assess the degree to which results represent a significant concern, or to determine specific steps which can be taken to improve performance.
Stakeholder comments (HSPR & PMR): Commentary which is separate to the reports is not frequently read. Additionally, the commentary is very fact based and does not explain root causes behind performance.
Ref pages: 96 – 100, 101 – 106 | Key issues: 1 and 2 | PMR / HSPR meets criteria: Does not meet criteria

Leading Practice Principle: Reports identify the key risks to performance.
Desktop review findings: Key risks to performance are not explicitly referenced; the underlying tone is that any PI with underperformance highlighted means the facility is at risk in terms of performance. Additionally, the commentary does not include any risks associated with the non-achievement of targets, or any actions that need to be, or have been, taken.
Stakeholder comments: The lack of a risk section in the reports was highlighted as a key concern, in particular as there is an apparent lack of regular and comprehensive discussion of key risks between DoH and Health Services.
Ref pages: 108 | Key issue: 4 | PMR / HSPR meets criteria: Does not meet criteria

Leading Practice Principle: The HSPR and PMR satisfy sufficiently different user needs.
Desktop review findings: The HSPR and PMR have similar purposes, meaning that there is no clear evident value in having the reports separately. Having the reports separately risks a lack of alignment between them, duplicated effort in producing separate reports, and users reading only one report rather than both.
Stakeholder comments: There is no clear benefit to having the reports separate.
Ref pages: 96 – 100, 101 – 106 | Key issues: 1 and 2 | PMR / HSPR meets criteria: Partially meets criteria

Leading Practice Principle: The reports align to the organisational and governance structures of WA Health.
Desktop review findings: There is a lack of alignment to Health Services' governance structures, and in addition to key reporting requirements of the DoH (e.g. the Annual Report).
Stakeholder comments: The HSPR and PMR do not align to Health Services' organisational structures – in particular services such as Area Mental Health and Community Services.
Ref pages: 107 | Key issue: 3 | PMR / HSPR meets criteria: Partially meets criteria

Page 205: PwC Review

Assessment criteria

Performance monitoring and evaluation

Page 206: PwC Review

Assessment criteria – PME (1 of 2)
The review focussed on assessing against the below criteria.

Sub-section 1 (PME process): Is the method of monitoring and evaluating performance appropriate and successful?

Leading Practice Principle: PME processes are streamlined and efficient, and there is sufficient time in the PME process for all parties to appropriately review performance.
Desktop review findings: Reports are circulated to Health Services within a reasonable timeframe, with good collaboration with the Information Development and Management Branch. However, the timeframe to review performance is tight.
Stakeholder comments: Mixed views on whether the timeframe for review is sufficient – both within the DoH and Health Services.
Ref pages: 132 | Key issue: 3 | PMR / HSPR meets criteria: Partially meets criteria

Leading Practice Principle: Appropriate individuals are involved in the PME process.
Desktop review findings: Senior members from the DoH and Health Services regularly discuss Health Service performance. However, the involvement of individuals further down the organisational structure in Health Services is inconsistent and not clearly defined.
Stakeholder comments: The right people are involved in performance discussions between DoH and Health Services, but involvement of Health Services' staff is not comprehensive enough.
Ref pages: 132 | Key issue: 3 | PMR / HSPR meets criteria: Partially meets criteria

Leading Practice Principle: Sufficient checks are performed to validate the accuracy of reported performance.
Desktop review findings: The lack of drill-down access to data makes it difficult for DoH and Health Services staff to validate results for certain indicators.
Stakeholder comments: There is no drill down available in the PMR summary report, due to the report being published in PDF. There is no ability to drill down into externally provided data, due to access restrictions.
Ref pages: 51 | Key issue: 6 (Performance Indicators) | PMR / HSPR meets criteria: Meets criteria

Page 207: PwC Review

Assessment criteria – PME (2 of 2)
The review focussed on assessing against the below criteria.

Sub-section 1 (PME process): Is the method of monitoring and evaluating performance appropriate and successful?

Leading Practice Principle: Roles and responsibilities in the PME process are clearly defined.
Desktop review findings: Respective DoH / Health Services' roles and responsibilities from a performance monitoring and evaluation perspective are not clearly defined or sufficiently detailed within the PMF. This raises the risk of confusion and mixed expectations between the DoH and Health Services, which could lead to disputes and certain key PME activities not being undertaken.
Stakeholder comments: No accountability or roles and responsibilities are defined within either the PMR or HSPR, meaning further drill down is needed to ascertain those responsible for improvements.
Ref pages: 131 | Key issue: 2 | PMR / HSPR meets criteria: Does not meet criteria

Leading Practice Principle: PME meetings cover all key elements required to discuss and assess performance.
Desktop review findings: Although there are many advantages with the new process (see page x), there are some issues which should be addressed:
• Limited documentation on how the HSPR and Health Service performance overall are discussed at these meetings, such as set agendas, terms of reference, etc.
• Limited time allocated within the meetings to discuss performance in detail.
• The discussions focus largely on Health Service level performance as opposed to facility level.
Stakeholder comments: There is insufficient time allowed within meetings to discuss underperforming facilities or the likely drivers of the performance ratings.
Ref pages: 130 | Key issue: 1 | PMR / HSPR meets criteria: Does not meet criteria

Leading Practice Principle: There is clear accountability for who is going to take actions arising from the PME process.
Desktop review findings: Action plans arise from performance meetings, and are followed up at subsequent meetings. However, there is a lack of clear documentation on how interventions are triggered under the current PME processes.
Stakeholder comments: Action plans do arise from meetings, and are followed up.
Ref pages: 133 – 134 | Key issue: 4 | PMR / HSPR meets criteria: Partially meets criteria

Page 208: PwC Review

Supporting leading practice research

Appendix G

Page 209: PwC Review

PwC

KPIs were established to provide information (either qualitative or quantitative) on the effectiveness of programs in achieving objectives in support of respective outcomes. Performance indicators are just that: “an indication of organisational achievement”. They are not an exact measure, and individual indicators should not be taken to provide a conclusive picture on the facility’s achievements. A suite of relevant indicators is usually required, and even then an interpretation of their results is needed to make sense of the indicators. The focus should be on selecting a robust set of value-adding indicators that serve as the beginning of a rich performance discussion focused on the delivery and improvement of the services.

The National Health Reform Performance and Accountability Framework (NHRPAF) was designed to facilitate the achievement of key national health policy objectives through clear and transparent performance reporting. Based on our leading practice research and the NHRPAF, the criteria developed to evaluate the appropriateness of an entity’s KPIs are included below:

1. Translates the hospital’s performance in quantifiable metrics (measurable and comparable)

2. Informs different stakeholders on the hospital’s performance (ease of access and stakeholder relevance)

3. Enables stakeholders to follow up, coordinate, control and improve (aspects of) the hospital’s performance (accountability)

4. Relevant to strategic objectives and national priorities (strategic alignment)

5. Scientifically sound, using valid and reliable data (statistically appropriate)

6. Administratively simple and cost effective to collect and report against (efficient)

To achieve agency-level accountability, indicators must not only be accurate and reliable, but must also be relevant. The relevance of indicators should be reviewed and endorsed independently in conjunction with stakeholders and key audience members to ensure buy in and objective reasoning. Relevant and up-to-date KPIs are more likely to be valued by the agency and the staff collecting data for them. If KPIs aren’t providing useful and meaningful information to decision makers they will not be assisting the agency to measure the achievement of its goals.

Defining the performance indicators (1 of 2) Ensuring an appropriate suite of PIs is defined is paramount to the success of the PMF: without measuring the right data, you cannot manage the right problems.

Page 209

April 2015 Independent review of the HSPR and PMR

Page 210: PwC Review

PwC

Performance indicators that are useful exhibit a number of characteristics. They should be aligned to strategic objectives and intent, be relevant, be timely, be accurate, and be complete and comprehensive. These are discussed further below, with key additional attributes.

Defining the performance indicators (2 of 2) Ensuring appropriate performance indicators are used is essential to validate the performance score of the organisation.

[Diagram: Progress KPIs and fit-for-purpose performance KPIs, summarised by the five attributes below]

Relevant
• Relevant and specific.
• Specific: clear and focused to avoid misinterpretation or ambiguity.

Aligned
• Aligned to the strategic objectives and vision of the organisation.

Accurate
• Measurable.
• Accurate enough to be used to identify trends.
• Credible.
• Goals are attainable.

Timely
• Evaluated over an appropriate time frame.
• Sufficient time given for the goal set.

Complete & comprehensive
• Realistic: fits within the agency's constraints and is cost effective.
• Understood by the audience.
• Agreed by the audience: all contributors agree and share responsibility.

Page 210

April 2015 Independent review of the HSPR and PMR

Page 211: PwC Review

PwC

Drivers of appropriate targets

Leading practice research suggests targets should be set using the following drivers (a simple illustrative sketch follows this list):

1. Use existing information and review trends and history.

2. Consider variations in performance, e.g. peaks, troughs and seasonal factors.

3. Take account of national targets, best practice benchmarks, etc.

4. Take into account cause-and-effect relationships, e.g. don't set top-level outcome targets before you have set appropriate targets for the enablers and inputs.

5. Take into account time lags (consider the Balanced Scorecard and the time lags between the objectives).

6. Take account of any dependence on others, such as partner bodies.
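By way of illustration only, the sketch below shows one way the first three drivers could inform a draft target for a single 'higher is better' indicator. It is a minimal sketch under assumed inputs: the function name, the 12-month history, the benchmark and the weightings are all hypothetical and are not drawn from the PMF.

```python
# Minimal illustrative sketch (not part of the PMF): a draft target informed by drivers 1-3.
# All names, weightings and example figures are hypothetical.
from statistics import mean


def draft_target(history, national_benchmark=None, cycle_length=12):
    """Suggest a draft target for a 'higher is better' indicator from its monthly history."""
    recent = history[-cycle_length:]                 # driver 1: existing information, trends, history
    baseline = mean(recent)
    seasonal_spread = max(recent) - min(recent)      # driver 2: peaks, troughs, seasonal variation
    proposed = baseline + 0.25 * seasonal_spread     # aspirational, but grounded in history
    if national_benchmark is not None:               # driver 3: national targets / best practice benchmarks
        proposed = max(proposed, 0.95 * national_benchmark)
    return round(proposed, 1)


# Example: 12 months of a hypothetical compliance rate (%), with a notional national benchmark of 90%
print(draft_target([78, 80, 79, 82, 81, 83, 84, 82, 85, 84, 86, 85], national_benchmark=90))  # 85.5
```

Drivers 4 to 6 (cause-and-effect relationships, time lags and dependence on partner bodies) are judgement-based and would sit outside a simple calculation like this.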

Setting good performance targets and thresholds Ensuring appropriate targets and thresholds that are both achievable but ambitious is necessary to ensure improvements can be driven.

Page 211

April 2015

Consequences of inappropriate targets

It is important to carefully consider targets and thresholds to ensure they are appropriate: they should be achievable yet aspirational. A report published by the Australian Healthcare and Hospitals Association demonstrates the potential negative consequences that can result from setting inappropriate targets:

• Tunnel vision: concentrating on clinical areas measured to the detriment of other important areas

• Sub-optimisation: pursuing narrow objectives which have more immediate impact on the performance rating, at the expense of broader strategic coordination

• Myopia: concentrating on short-term issues while neglecting long-term criteria

• Gaming: altering behaviour to gain strategic advantage

• Misrepresentation: including creative accounting and fraud

• Toxification: increasing pressure on employees, poisoning the workplace environment as organisations try to meet targets

• Overload: an overload of unused information

As such, the Better Practice Report suggests the use of the SMART model to ensure that targets are appropriate.

[Illustrative chart: target versus actual performance, and the resulting gap, across three example categories]

Independent review of the HSPR and PMR

Page 212: PwC Review

PwC

Setting SMART targets

• Our leading practice research has shown that appropriate PIs provide a fair and balanced representation of each outcome, output or activity.

• With this, it is also important to set achievable and aspirational targets that provide a benchmark for minimum quality standards and establish a goal for improvement towards the realisation of potential.

• As such, PIs should have appropriate targets to achieve their purpose. The Better Practice Report recommends the application of the SMART model to assess the appropriateness of each PI. The model is summarised below, followed by a simple illustrative check.

• It is acknowledged that not all PIs can be quantified in a way that completely adheres to SMART standards. In such instances, organisations should have sufficient information to explain why this is the case, to support the credibility of the target.

Setting good performance targets and thresholds Targets and thresholds are necessary benchmarks for performance, but must be credible and evidence-based to truly motivate performance.

Page 212

April 2015

• Specific: describes exactly what needs to be done.
• Measurable: achievement or progress can be measured.
• Achievable: the target is achievable, but aspirational.
• Realistic: the target is possible to attain (motivational).
• Timed: the time period for achievement is clearly stated.
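As a purely illustrative aid, and not something prescribed by the Better Practice Report, the sketch below shows how a SMART assessment could be recorded against each PI target and any gaps flagged for explanation; the class and field names are hypothetical.

```python
# Illustrative only: recording a SMART assessment against a PI target.
# Class and field names are hypothetical, not drawn from the PMF or the Better Practice Report.
from dataclasses import dataclass, fields


@dataclass
class SmartAssessment:
    indicator: str
    specific: bool      # describes exactly what needs to be done
    measurable: bool    # achievement or progress can be measured
    achievable: bool    # achievable, but aspirational
    realistic: bool     # possible to attain (motivational)
    timed: bool         # time period for achievement is clearly stated

    def gaps(self):
        """Return the SMART attributes this target does not yet satisfy."""
        return [f.name for f in fields(self) if f.type is bool and not getattr(self, f.name)]


example = SmartAssessment("Hypothetical indicator", specific=True, measurable=True,
                          achievable=True, realistic=True, timed=False)
print(example.gaps())  # ['timed'] - per the bullet above, the reason for any gap should be documented
```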

Independent review of the HSPR and PMR

Page 213: PwC Review

PwC


Performance Monitoring & Evaluation Producing a good PMF, PIs and reporting tools is not enough. For performance improvements to be successfully driven, an effective PME is needed.

Page 213

April 2015

• For performance management to be clear and meaningful to service providers, performance monitoring and evaluation (PME) processes must be robust and have appropriate levels of accountability.

• The Australian Public Service Commission's guidelines for performance management demonstrate that an effective PME process underpins high performance.

• As shown in the figure to the right, the four principles of effective performance management frameworks are:

1. Purpose and clarity

2. Alignment and integration

3. Mutuality and motivation

4. Adaptability and progress.

• These principles are underpinned by three foundation elements required in a PME Process to drive high performance:

1. Capabilities

2. Evidence and data

3. Pragmatism.

[Figure: Performance management framework and PME process, centred on High Performance]
• Purpose and clarity: creates clarity in what high performance represents, and a clear purpose.
• Alignment and integration: alignment of performance measures with high-level strategic objectives.
• Mutuality and motivation: promotes accountability and drives motivation towards high performance.
• Adaptability and progress: performance measures are adaptable in a changing environment and progress towards government outcomes.

Independent review of the HSPR and PMR

Page 214: PwC Review

PwC

Reporting structure and content best practice Looking at other national examples of performance management reporting, it is clear that effective presentation is critical to ensuring key messages are conveyed.

Performance ratings / scorecards depicted should have the following attributes:
• Easy to read
• Colour coded, with an easily recognisable key (green is good, red is bad)
• Data included within the rating to increase reader understanding
• Simple layout wherever possible, to limit the amount of detail shown and avoid the key messages being lost
• No information overload, to ensure information can be accessed and utilised
• Clear explanations accompanying composite scoring, to reduce the risk of misinterpretation.

Page 214

April 2015

Commentary provided should have the following attributes:
• Clearly laid out
• Use of colour for easily locating relevant sections
• A key findings section that summarises the report
• Use of bullet points to ensure critical points are highlighted
• Commentary supported with data
• Easily readable text (both on screen and printed)
• Consistent layout / formatting each month
• Key messages clearly highlighted and separated to draw the reader's attention.

Independent review of the HSPR and PMR

Page 215: PwC Review

Suggested HSPR / PMR combined report structure overview

Appendix H

Page 216: PwC Review

PwC

Further to the recommendations made in Section 6 (Reporting structure and content), the suggested components for the new report are set out below.

Page 216

April 2015

Reporting structure and content

This section provides an indicative structure for some sections of the new report. These are examples only, and further consultation, adaptation and design will be required to ensure they are fit for purpose.

Recommendation 2 Option C: Combined report

This option merges the HSPR and PMR reports to produce a single “Combined Report”.

Key features:

• The “Summary Report” is published each month, and the “Detailed Report” is published quarterly.

• The Summary Report presents results for a set of Priority PIs, which are a refined set of the PIs currently reported in the HSPR

• This report also includes additional PIs from the Detailed Report that have been consistently underperforming, to ensure that key areas of underperformance are highlighted and addressed.

• The Summary Report provides trend analyses for these PIs to better illustrate how performance is changing

• The Detailed Report presents performance results for a more detailed set of PIs, which are grouped by domain.

• Both reports have integrated commentary written by both the DoH and the Health Services.

Independent review of the HSPR and PMR

Key recommended steps to implement this option:

1. Identify the suite of PIs to be included in the Summary Report.

2. Establish data sourcing frequencies to report against these PIs monthly.

3. Modify business processes to commence monthly data reporting and collection for the Summary Report.

4. In consultation with end users, design the Summary Report dashboard and layout of content.

5. In consultation with end users, design the Detailed Report, including areas to flag consistent underperformance.

6. Establish business rules around Summary Report content for consistently underperforming areas (generally this should be a critical performance threshold).

The following pages provide examples of how some sections of the new report could be structured. Please note that these are illustrative examples of some changes which could be made, and are not meant to represent a comprehensive structure and design. A simple sketch of how the components could fit together is also provided at the end of this page.

Combined report: Overview of potential components
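To bring these components together, the sketch below outlines one possible shape for the combined report's content. It is indicative only: the type and field names are hypothetical, not an agreed DoH design, and it simply mirrors the Summary Report / Detailed Report split described above.

```python
# Indicative sketch only: one possible shape for the combined report's content.
# Type and field names are hypothetical, not an agreed design.
from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class PIResult:
    indicator: str
    frequency: str                              # e.g. "Monthly", "Quarterly", "Annually"
    target: float
    actual: float
    trend: List[float]                          # last five reported periods
    intervention_level: Optional[int] = None    # shown only where intervention is triggered


@dataclass
class SummaryReport:                            # published each month
    health_service: str
    priority_pis: List[PIResult]                # refined set of the PIs currently in the HSPR
    exception_pis: List[PIResult]               # Detailed Report PIs below intervention threshold
    doh_commentary: str = ""
    hs_commentary: str = ""


@dataclass
class DetailedReport:                           # published quarterly
    health_service: str
    pis_by_domain: Dict[str, List[PIResult]] = field(default_factory=dict)
    domain_commentary: Dict[str, str] = field(default_factory=dict)   # commentary at domain level only
```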

Page 217: PwC Review

PwC

Example components: Front page

Page 217

April 2015

Front page (Section 1): Health Service summary at a glance

The Health Service summary at a glance highlights key areas of performance and underperformance specific to the Health Service.

Key features:

• Similar to the ‘at a glance’ graphic utilised for the WA Health Annual Report.

• Provides high level summary for Health Service, with drill-down data in following sections.

• Highlights areas of consistent high performance across facilities.

• Highlights areas of consistent underperformance across facilities.

• Can be used to focus high-level performance conversations.

Independent review of the HSPR and PMR

Includes overview commentary from DoH and the Health Service.


The front page should include a summary of the main performance results, and contain key commentary messages from DoH and Health Services.

Page 218: PwC Review

PwC

Example components: Priority PIs

Page 218

April 2015

Section 2: Health Service priority PIs components

This section tracks performance against a consistent set of Priority PIs on a monthly basis.

Key features:

• Priority PIs are reported for each Health Service (rather than all Health Services).

• Targets and variances are included to place current performance in context.

• Trend analysis is included to demonstrate changes in performance over the reporting periods.

• Reporting month is included for each trend period, so that it is clear which timeframe the results relate to.

• Commentary is provided against each PI by the DoH and Health Services. Where intervention is triggered, the level of intervention is shown (an illustrative calculation of the variance and trend columns is sketched after the example tables below).

Independent review of the HSPR and PMR

The reporting period month is clearly shown, linking results with data frequency.

Commentary from DoH and the Health Service is provided against each PI to ensure meaningful interpretation of results.

Priority indicators are grouped by domain.

Example table: priority PI results (Domain 1)

Indicator | Frequency | Target | Actual | Variance | 4 periods previous | 3 periods previous | 2 periods previous | 1 period previous | Current period
Indicator 1 | Monthly | 5 | 5 | - | Nov-14: 6 | Dec-14: 5 | Jan-15: 3 | Feb-15: 2 | Mar-15: 5
Indicator 2 | Quarterly | 10 | 7 | (3) | Mar-14: 10 | Jun-14: 12 | Sep-14: 10 | Dec-14: 6 | Mar-15: 7
Indicator 3 | Annually | 15 | 16 | 1 | Jun-11: 9 | Jun-12: 12 | Jun-13: 15 | Jun-14: 15 | Jun-15: 16

Example table: commentary by PI (Domain 1), showing the intervention level (where triggered), Department of Health commentary and Health Service commentary

Indicator 1 | Intervention level: 2 | DoH commentary: Indicator 1 is back on target following a decline over the last two report periods. | Health Service commentary: Greater availability of data on X has increased the performance rating for this indicator.
Indicator 2 | Intervention level: - | DoH commentary: Indicator 2 is below target, continuing the downward trend from Jun 2014. | Health Service commentary: Reduced activity in X is associated with lower performance. Activity has decreased.
Indicator 3 | Intervention level: - | DoH commentary: Indicator 3 is exceeding target and improved from the previous report period. | Health Service commentary: Improved performance against the X savings scheme contributed to higher performance this period.

The priority PIs section should show performance results for each of the priority PIs, specific to each Health Service.
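As a purely illustrative aid, the sketch below shows how the Variance column and a simple trend flag in the example above might be calculated. The sign convention (actual minus target, with negatives shown in brackets and zero shown as a dash) is inferred from the mock-up rather than a confirmed business rule, and the trend flag assumes a 'higher is better' indicator.

```python
# Illustrative sketch only: variance and a crude trend flag for a priority PI.
# The bracketed-negative convention is inferred from the example table above.

def variance(target, actual):
    """Actual minus target, with zero shown as '-' and negatives in brackets, as in the mock-up."""
    diff = actual - target
    if diff == 0:
        return "-"
    return f"({abs(diff)})" if diff < 0 else str(diff)


def trend_direction(history):
    """Crude flag comparing the latest two periods; assumes higher values are better."""
    if len(history) < 2 or history[-1] == history[-2]:
        return "steady"
    return "improving" if history[-1] > history[-2] else "declining"


# Indicator 2 from the example: target 10, actual 7, last five periods 10, 12, 10, 6, 7
print(variance(10, 7), trend_direction([10, 12, 10, 6, 7]))  # (3) improving
```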

Page 219: PwC Review

PwC

Example components: Exception-based PIs

Page 219

April 2015

Section 3: Exception-based reported PIs components

The Exception-based reported PIs section reports on the performance of additional PIs (which are usually only shown in the detailed section) where performance is below the designated intervention threshold, as per PME Recommendation 4.

Key features:

• Similar structure to section 2 (Priority PIs).

• Indicators shown are Detailed Report PIs where performance is below the intervention threshold (an illustrative filter is sketched after the example table below).

Independent review of the HSPR and PMR

Example table: exception-based PI results (Domain 1)

Indicator | Frequency | Target | Actual | Variance | 4 periods previous | 3 periods previous | 2 periods previous | 1 period previous | Current period
Indicator 22 | Monthly | 5 | 3 | (2) | Nov-14: 6 | Dec-14: 5 | Jan-15: 5 | Feb-15: 2 | Mar-15: 3
Indicator 37 | Quarterly | 10 | 7 | (3) | Mar-14: 10 | Jun-14: 8 | Sep-14: 10 | Dec-14: 6 | Mar-15: 7
Indicator 43 | Annually | 15 | 8 | (7) | Jun-11: 9 | Jun-12: 12 | Jun-13: 10 | Jun-14: 10 | Jun-15: 8

Example table: commentary by PI (Domain 1), showing the intervention level, Department of Health commentary and Health Service commentary

Indicator 22 | Intervention level: 2 | DoH commentary: Indicator 22 continues to underperform, despite a slight increase in performance. | Health Service commentary: Greater availability of data on X has increased the performance rating for this indicator.
Indicator 37 | Intervention level: 2 | DoH commentary: Indicator 37 continues to underperform, despite a slight increase in performance. | Health Service commentary: Reduced activity in X is associated with lower performance. Activity has decreased.
Indicator 43 | Intervention level: 3 | DoH commentary: Indicator 43 has declined, which has triggered intervention level 3. | Health Service commentary: Action plans are being revised in line with the new intervention level.

Only PIs which are below intervention threshold are shown.

The exception-based PIs section should show performance results for additional PIs that are below a particular threshold which triggers an intervention.
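The exception logic described above could be implemented as a simple filter over the Detailed Report results, as sketched below. The indicator names, results, cut-off values and the way intervention levels escalate are all hypothetical; in practice they would follow the business rules established for consistently underperforming areas (see the implementation steps on page 216).

```python
# Illustrative sketch only: selecting Detailed Report PIs for exception-based reporting.
# Indicator names, results, cut-offs and the escalation rule are hypothetical.

def exception_pis(detailed_results, intervention_cutoffs):
    """Return (indicator, intervention level) for PIs below their intervention threshold.

    detailed_results: indicator -> latest result (higher assumed better)
    intervention_cutoffs: indicator -> cut-offs ordered from level 1 upwards
    """
    flagged = []
    for indicator, actual in detailed_results.items():
        level = sum(1 for cutoff in intervention_cutoffs.get(indicator, []) if actual < cutoff)
        if level > 0:                     # only PIs below a threshold appear in this section
            flagged.append((indicator, level))
    return flagged


results = {"Indicator 22": 3, "Indicator 37": 7, "Indicator 43": 8, "Indicator 50": 12}
cutoffs = {"Indicator 22": [5, 4],            # below 5 -> level 1, below 4 -> level 2
           "Indicator 37": [10, 8],
           "Indicator 43": [15, 12, 10],
           "Indicator 50": [10]}
print(exception_pis(results, cutoffs))        # [('Indicator 22', 2), ('Indicator 37', 2), ('Indicator 43', 3)]
```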

Page 220: PwC Review

PwC

Example components: Health Service Comparison

Page 220

April 2015

Section 3: Health Service Comparison components

The Health Service comparisons section allows performance results among Health Services to be compared for each of the priority PIs. This section can provide a Health Service with an indication of how it compares to other Health Services, and puts performance results in context.

Key features:

• Similar to current HSPR dashboard.

• Contains only latest period data, without trends.

Independent review of the HSPR and PMR

Example table: Health Service comparison (Domain 1)

Indicator | Frequency | Target | CAHS | NMHS | SMHS | WACHS
Indicator 1 | Monthly | 5 | 6 | 5 | 3 | 2
Indicator 2 | Quarterly | 10 | 10 | 12 | 10 | 6
Indicator 3 | Annually | 15 | 9 | 12 | 15 | 15

The reporting period month is clearly shown, linking results with data frequency.

Priority indicators are grouped by domain.

Health Service results are displayed side-by-side for easy comparisons.

The Health Service comparison section should be similar to the current HSPR dashboard.

Page 221: PwC Review

PwC

Example components: Site-based comparison

Page 221

April 2015

Section 4: Site-based Comparison components

The site-based comparisons section allows performance results among facilities to be compared for each of the priority PIs. This is the same as already presented in the HSPR, and can provide the Health Service with an indication of how their facilities compare to others, putting performance results in context.

Key features:

• Exactly the same as current HSPR site-based pages.

• Should have a page for each priority PI shown in section 2.

• Should be used to inform commentary in section 2.

Independent review of the HSPR and PMR

This is the same as currently included in the HSPR.

Targets are shown, indicated by the red line, to easily depict over- or under-performance.

A facility comparison graph is prepared for each priority PI.

The site-based comparison section should be similar to the existing site-based comparisons in the HSPR.

Page 222: PwC Review

PwC

Example components: Detailed dashboard Like the PMR, the detailed report section should include performance results for the full suite of PIs by domain, specific to the Health Service.

Page 222

April 2015

Section 5: Detailed Report Dashboard components

The Detailed Report should replace the current PMR dashboard, and show performance results for the full suite of PIs by domain, specific to the Health Service.

Key features:

• Similar in structure to the Summary Report sections.

• PI results are reported by domain. Each domain will have a separate section for PI drill-down.

• Similar site-based charts for facility-based performance, as currently used in the PMR, should be retained.

• Commentary is only written at domain level.

Independent review of the HSPR and PMR

Example table: Detailed Report dashboard (Domain 1)

Indicator | Frequency | Target | Actual | Variance | 4 periods previous | 3 periods previous | 2 periods previous | 1 period previous | Current period
Indicator 1 | Monthly | 15 | 16 | 1 | Nov-14: 9 | Dec-14: 12 | Jan-15: 15 | Feb-15: 15 | Mar-15: 16
Indicator 2 | Quarterly | 8 | 9 | 1 | Mar-14: 10 | Jun-14: 8 | Sep-14: 8 | Dec-14: 9 | Mar-15: 9

Department of Health commentary: Performance results for this domain continue to exceed targets. This is demonstrated at the indicator level above. Performance continues to improve for Indicator 1, and has remained steady for Indicator 2.

Health Service commentary: Performance results are reflective of improvements in X activity and the increased availability of data. X activity is expected to improve, which should continue to be reflected in performance results.

The example below demonstrates the content for Domain 1, and this format will be replicated for Domain 2.

DoH and Health Service commentary are at the domain level only.

Page 223: PwC Review

PwC

Combined report: Backing information The combined report should include backing information to guide the users’ understanding of the results presented.

Page 223

April 2015

Backing information to include:

1. Introduction: provide background information on the PMF and how this report sits within the framework.

2. Purpose / objectives of the report: describe the purpose of the report (and intended audience), and how this links with the PMF.

3. High-level information about the data, covering:
• Data sources: internal and external data sources.
• Storage and security: how the data received is stored at DoH.
• Exclusions: any data exclusions that impact upon the results in the current report.

4. Contact information: provide details of the appropriate contact within DoH to query the report or seek further information.

Independent review of the HSPR and PMR

Page 224: PwC Review

Glossary

Appendix I

Page 225: PwC Review

PwC

Final report – glossary

Page 225

April 2015

Abbreviation: Description

ABF/M: Activity Based Funding/Management
CAHS: Child and Adolescent Health Service
DG: Director General
DoH: Department of Health
Domain: The term domain is used throughout the report to refer to the 5 domains in the case of the PMR (Effectiveness, Efficiency, Equity, Sustainability and Processes) and the 3 domains in the case of the HSPR (Access, Quality and Efficiency)
DRGs: Diagnosis Related Groups
EA1e: NEAT targets
ED: Emergency department
EI1: Volume of weighted activity
FTE: Full time equivalent
HiPs: Hospital initiated postponements
HSPR: Health Service Performance Report
KPI: Key performance indicator
KPQ: Key performance question
NA: Rating: not available
NC: Rating: not calculable
NHRPAF: National Health Reform Performance and Accountability Framework
NMHS: North Metropolitan Health Service
NW: Rating: numbers withheld
OS: Rating: out of scope
PC2: Percentage of cases coded
PI: Performance indicator
PME: Performance monitoring and evaluation
PMF: Performance Management Framework
PMR: Performance Management Report
SAB: Staphylococcus aureus bacteraemia
SHEF: State Health Executive Forum
SMART: Specific, measurable, achievable, realistic, timed
SMHS: South Metropolitan Health Service
SW3: Staff turnover
WACHS: WA Country Health Service
WAU: Weighted activity units
WHO: World Health Organisation

Independent review of the HSPR and PMR

Page 226: PwC Review

Document list and sources

Appendix J

Page 227: PwC Review

PwC

Documents issued from DoH

Page 227

April 2015

Document Title

2012-13 Activity Based Funding/Management (ABF/M) KPI Stress Test Report
2013/14 Proposed Intervention Levels
2014-15 Activity Based Funding/Management Performance Management Monthly Report December 2014
2014-15 Health Service Performance Report (HSPR) Quality Composite score KPI Stress Test Report
Activity Based Funding / Management Performance Management Report: Performance Indicator Definitions Manual (Health Service Measures) 2014-2015
Activity Based Funding / Management Performance Management Report: Performance Indicator Definitions Manual (Outcome Measures) 2014-2015
Activity Based Funding and Activity Based Management: 'Performance score'
Activity Based Funding/Management Performance Management Report: 2014-15 Automation Roles and Responsibilities
Activity Based Funding/Management Performance Management Report: 2014-15 User Guide version 1.0
Activity Based Funding/Management Performance Management Report: PMR Indicator Review 2013/14
Data Quality Statement: EA2 NEAT % of ED Attendees with LOE <= 4 hours
Health Reform Project Meeting - Minutes: CAHS Safety Quality and Performance 27/01/2015
Health Reform Project Meeting - Minutes: North Metropolitan Health Service 23/01/2015
Health Reform Project Meeting - Minutes: South Metropolitan Health Service 03/02/2015
Health Reform Project Meeting - Minutes: WACHS 19/01/2015
Health Service Performance Report (HSPR) December 2014
Health Service Performance Report (HSPR) December 2014 Commentary
Health Service Performance Report (HSPR) January 2015
Health Service Performance Report (HSPR) January 2015 Commentary
Health Service Performance Report (HSPR) September 2014
Health Service Performance Report v2.0 – Indicator Overviews
Health Service Performance Report: 2014/15 Roles, Responsibilities and Timelines
HSPR and PMR Stress Test Report: Historical Data
Performance Management Report scorecard: December 2014
Schedule B KPI Targets and Thresholds 2014-15 for Performance Management Report
Schedule of Performance Indicator Targets, Target Source, and Thresholds for the 2014-15 Health Service Performance Report (HSPR)

Listed below are the documents issued from DoH that have been used in the review.

Independent review of the HSPR and PMR

Page 228: PwC Review

PwC

Source data used (1 of 2)

Page 228

April 2015

Listed below are the additional data sources that have been used in the review.

Document Title

A Guide to the Queensland Government Performance Management Framework August 2012
A Performance Measurement Framework for the Canadian Health System
Activity and performance in NSW public hospitals: Hospital Quarterly October to December 2014
Australian National Audit Office: Criteria for the Audit of Key Performance Indicators
Better Practice in Annual Performance Reporting
CAHS ABM/ABF and Reconfiguration Reform Program
Can we improve the health system with performance reporting?
Care at a Glance Month 9 2014/15
Cool Practice Guide to Performance Management for Nurses and Midwives in Victorian Health Services
Corporate Performance Reporting Revisited: The Balanced Scorecard
Department of Health Annual Report 2013-14
Developing Performance Indicators: Information Package
Financial versus non-financial information: The impact of information organisation and presentation in a Balanced Scorecard
Independent Hospital Pricing Authority
Independent Inquiry into care provided by Mid Staffordshire NHS Foundation Trust January 2005 - March 2009
Intellectual Disability Network June 2014
Investigation into Mid Staffordshire NHS Foundation Trust
Mid Staffordshire NHS Trust Report: Executive Summary
Mid Staffordshire NHS Trust Report: Volume 1
Mid Staffordshire NHS Trust Report: Volume 2
Mid Staffordshire NHS Trust Report: Volume 3
Model Framework for Assessment of State, Performance, and Management of Canada's Core Public Infrastructure: May 2009
National Health Reform: Performance and Accountability Framework
NHS Outcomes Framework Publication Schedule 2015
NHS Standard Contract 2014/15 Particulars
NHS Trust: Performance Management Strategy and Framework
NHS Trust: Trust Performance Management Framework 2013/14
NSW Health 2014/15 Schedule E: Performance Measures
NSW Health 2014/15 Service Agreement Key Performance Indicators and Service Measures Data Dictionary
NSW Health Annual Report 2011-12

Independent review of the HSPR and PMR

Page 229: PwC Review

PwC

Source data used (2 of 2)

Page 229

April 2015

Listed below are the additional data sources that have been used in the review.

Document Title

NSW Health Surgical Dashboard May 2014

NSW Performance Framework 2013-14

Organisational Performance Management in a Government Context: Literature Review (Scottish Government)

Performance Audit Report: KPIs

Performance indicators used internationally to report publicly on healthcare organisations and local health systems

Performance measurement for health system improvement: experiences, challenges and prospects

Performance Monitoring and Evaluation: Performance Assessment Tool (PAT) Framework Document

Performance Reporting: Value Driver Analysis

Quarterly report on the performance of the NHS foundation trust sector: 3 months ended 30 Jun 2014

Queensland Health Hospital and Health Service Performance Management Framework

Queensland Health: Guide to Optimising Own Source Revenue

scorecards, dashboards, and KPIs: keys to integrated performance measurement

System Performance Management Reform Project Initiation

TEC Guidance Document: Performance measurement

TG0023 Performance Measurements: Concepts and Techniques

The 2014/15 Accountability Framework for NHS Trust Boards

The NHS Outcomes Framework 2014/15

The NHS Outcomes Framework 2015/16

University of Gavle: Managing Performance Measurement

Victorian Health Service performance monitoring framework 2014-15

Victorian Health Service performance monitoring framework: 2012-13 business rules

Victorian Health Service performance monitoring framework: 2014-15

WA Health KPI Presentation

WA Health Reform Program: Project to Build Capability

WA Health Reform: Integrated program roadmap

Independent review of the HSPR and PMR

Page 230: PwC Review

Commentary on recent DoH Safety and Quality Indicator review

Appendix K

Page 231: PwC Review

PwC

Commentary on the HSPR S&Q indicator review The Patient Safety and Clinical Quality Division (PSCQ) has recently reviewed the suite of S&Q indicators that should be included in the HSPR.

Page 231

April 2015

Background

• Since the introduction of the HSPR, there have been some significant changes in relation to S&Q indicators, including the removal of the quality composite score from the HSPR.

• In addition, the FPPG has commissioned the PSCQ division to review the S&Q indicators in the HSPR.

S&Q indicator workshop

• The PSCQ held a workshop with key DoH and Health Services stakeholders to review the S&Q indicators included in the HSPR, and recommend changes as required.

• PwC attended in an observing role.

• The aim of this workshop was to recommend to the FPPG “a set of S&Q indicators in the HSPR that provide a balanced, realistic view of the current situation within each Health Service in regard to the safety and quality of the care that Health Service is providing”.

Workshop outputs

• Attendees selected (through a voting process) four indicators for proposed inclusion in the HSPR, which represented a change in one indicator from the current suite (see right).

• Attendees also noted that patient and staff experience measures should eventually be introduced, but only once data quality for these measures has improved.

Current list of S&Q indicators included in the HSPR:
• Healthcare associated Staphylococcus aureus bacteraemia infections per 10,000 occupied bed days
• Death in low-mortality DRGs
• Hand hygiene compliance
• Rate of total hospital readmissions within 28 days to an acute designated mental health inpatient unit

Workshop proposed set of indicators for inclusion in the HSPR (participant votes in brackets):
• Healthcare associated Staphylococcus aureus bacteraemia infections per 10,000 occupied bed days (12)
• Death in low-mortality DRGs (8)
• Hand hygiene compliance (7)
• Rate of community follow up of mental health patients in 7 days from discharge (5)

Independent review of the HSPR and PMR

Page 232: PwC Review

PwC

Commentary on the S&Q indicator review PwC have been asked – as part of the independent review - to comment on the S&Q indicators proposed by the workshop.

Page 232

April 2015

Independent review comments on S&Q indicators proposed by the workshop

• A clear and consultative process was followed including stakeholders from throughout the DoH and Health Services.

• All four PIs are key measures of S&Q performance, and are consistent with prominent PIs in other jurisdictions.

• In particular the proposed alternative mental health metric (community follow-up) is preferable to the existing metric (hospital readmission within 28 days).

− This is partly because the latter can be an indication of poor performance (high readmissions), but also of good performance (quick readmissions).

• However, as identified in Performance Indicators: Key Issue 3 on pages 45 and 46, some key S&Q metrics which are given prominence in other jurisdictions are not included in this list. This is also a consistent theme in other domains.

− There appears to have been a general perception at the workshop that increasing the number of HSPR S&Q indicators (currently four) may not be accepted, so the focus was placed on picking the top four.

− If the total list of PIs is maintained at four, having two infection-related PIs creates some imbalance.

The table below shows the results of the voting and ranking process for S&Q indicators.

The Y/N in the “HSPR suite” column indicates whether each indicator is proposed to be included in the HSPR.

(source: PSCQ report - Review of safety and quality indicators for inclusion in the Health Service Performance Report)

Independent review of the HSPR and PMR