Designing Influential Evaluations, Session 6: Analysis & presentation


Transcript of Designing Influential Evaluations, Session 6: Analysis & presentation

Page 1: Designing Influential Evaluations Session 6 Analysis &  presentation

Designing Influential Evaluations
Session 6
Analysis & presentation

Uganda Evaluation Week - Pre-Conference Workshop, 19th and 20th May 2014

Page 2: Designing Influential Evaluations Session 6 Analysis &  presentation

What does the client want? Why is the work being commissioned?
◦ Lesson-learning
◦ Accountability
◦ Scaling up
◦ Transferring a model to other situations/contexts
◦ To support a spending decision

How might this affect your approach to the report? (Discussion)

Page 3: Designing Influential Evaluations Session 6 Analysis &  presentation

Understanding the intervention & context

Simple contexts (known knowns)
• Stable, clear, linear cause-and-effect relations – assessing impact is straightforward

Complicated contexts (known unknowns)
• Cause and effect known but not well understood; need expert management; evaluate by deconstructing into simple problems

Complex contexts (unknown unknowns)
• Impossible to predict in advance; only understood after the event

Kurtz, C. & Snowden, D. (2003). The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal 42(3): 462-483.

Page 4: Designing Influential Evaluations Session 6 Analysis &  presentation

How to pitch the analysis: guidance on the quality of a study (DFID)

Principles of quality and their associated principles (answer Yes/No for each):

Conceptual framing
• Does the study acknowledge the existing body of research?
• Does the study construct a conceptual framework?
• Does the study pose a research question?
• Does the study outline a hypothesis?

Openness and transparency
• Does the study present the raw data it analyses?
• Does the author recognise limitations/weaknesses in their work?

Appropriateness and rigour
• Does the study identify a research design?
• Does the study identify a research method?
• Does the study demonstrate why the chosen design and method are good ways to explore the research question?

Validity
• Has the study demonstrated measurement validity? (Are the indicators it uses good representations of the things it seeks to measure?)
• Is the study internally valid? (Does it demonstrate how causality is established through the selected technique?)
• Is the study externally valid? (Can it be generalised to other contexts and populations?)

Reliability
• Has the study demonstrated measurement reliability?
• Has the study demonstrated that its selected analytical technique is reliable?

Cogency
• Does the study present a clear and logical argument?
• Are the conclusions clearly based on the study’s results?

Page 5: Designing Influential Evaluations Session 6 Analysis &  presentation

Discussion: What tools or presentations can you use to convey how strong or robust your findings are? Discuss in your group and prepare to exchange ideas in plenary.

Page 6: Designing Influential Evaluations Session 6 Analysis &  presentation

Validity

Internal validity
• The extent to which a causal conclusion based on a study is warranted
• Reflects the extent to which bias is minimised
• Essential for a satisfactory evaluation

External validity
• The validity of generalised (causal) inferences in evaluation studies
• The extent to which the results of a study can be generalised to other situations and to other people
• Depends on the scope to collect the necessary data and the effects of context

Page 7: Designing Influential Evaluations Session 6 Analysis &  presentation

How to demonstrate strength of findings

Quantitative
• Limitations in data collection
• Data quality
• Statistical analysis: significance tests, confidence intervals, probability (see the sketch below)
• Data available for re-analysis

Qualitative
• Limitations in data collection
• Accuracy of records & note-taking
• Consistency in documentation
• Formal textual analysis
• Comparisons and ranking
• Scaling and rating
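A minimal sketch of how a headline survey percentage can be reported with a confidence interval and a significance test (Python; the figures and the 50% benchmark are illustrative assumptions, not results from any evaluation cited here):

    # Minimal sketch: confidence interval and z-test for a survey proportion.
    # Hypothetical data: 83 of 100 respondents agreed with a statement.
    import math

    agreed, n = 83, 100
    p_hat = agreed / n                      # observed proportion

    # 95% confidence interval (normal approximation to the binomial)
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    ci_low, ci_high = p_hat - 1.96 * se, p_hat + 1.96 * se

    # z-test against an assumed 50% benchmark
    p0 = 0.5
    z = (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)

    print(f"Agreement: {p_hat:.0%} (95% CI {ci_low:.0%} to {ci_high:.0%}); "
          f"z = {z:.2f} against a 50% benchmark")

Reporting the interval alongside the point estimate, and making the underlying data available for re-analysis, lets readers judge how robust the finding is.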

Page 8: Designing Influential Evaluations Session 6 Analysis &  presentation

Developing the argument

• Start with the question to be answered
• What were the findings?
• Strength of the findings
  ◦ Statistical analysis – level of confidence
  ◦ Extent of interview response – consistency and a high percentage of responses, or the lack of any trend or pattern
• How well are they corroborated?
  ◦ Triangulation of method
  ◦ Triangulation of data sources
• Need to present data in a table or graph?
  ◦ Comparisons across locations, stakeholder groups or time
  ◦ Tables should have % in the body and totals for rows and columns (see the sketch after this list)
• Remember to explain the limitations
• Consider other plausible explanations (contribution analysis)
• Draw conclusions/implications
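A minimal sketch of the table layout suggested above – percentages in the body, with a row total and an n= column – using pandas; the course labels and ratings are made-up illustrative data, not figures from the workshop examples:

    # Minimal sketch: percentages in the table body, row totals and n= column.
    # The ratings below are made up for illustration.
    import pandas as pd

    df = pd.DataFrame({
        "course": ["A", "A", "A", "A", "B", "B", "B", "B"],
        "rating": ["Very effective", "Effective", "Effective", "Ineffective",
                   "Very effective", "Effective", "Slightly effective", "Effective"],
    })

    counts = pd.crosstab(df["course"], df["rating"])
    table = counts.div(counts.sum(axis=1), axis=0).mul(100).round(0)  # row percentages
    table["Total %"] = table.sum(axis=1)                              # should be ~100
    table["n="] = counts.sum(axis=1)                                  # respondents per row

    print(table)

The same layout extends to the comparisons across locations, stakeholder groups or time mentioned above.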

Page 9: Designing Influential Evaluations Session 6 Analysis &  presentation

Tips for effective reading
• Signpost well
• Break up text with boxes, tables and figures
• Lots of white space
• Structure paragraphs logically
• Mix sentence length and complexity. Short is good.
• Cross-reference and remind the reader
• Summarise frequently
• Carry findings through logically to conclusions
• Draw lessons
• Direct clear recommendations to specific people or organisations

Page 10: Designing Influential Evaluations Session 6 Analysis &  presentation

Introduce and signpost: example

“The following chapter presents findings on whether staff have access to appropriate training and technical advice to effectively ensure evaluability and results documentation as part of the grant management process (Hypothesis 2). Under technical advice, we consider the systems that are in place for quality assurance and expert review.

 

The evidence for this chapter draws on a number of sources: 1) A review of training material and post course evaluations; 2) Attendance records of training; 3) the results of the online staff survey; 4) findings from the focus group discussions; and 5) the comparative analysis of grant management processes of other development agencies.

 

The chapter is divided in two parts: first the findings from our review of training are discussed; this is then followed by the findings related to technical support.”

Page 11: Designing Influential Evaluations Session 6 Analysis &  presentation

Paragraph structure: good guidance from Norad

“Chapters presenting findings: A body paragraph shall be allocated for each finding. Findings presented shall be evidence-based, triangulated and clearly referenced. The finding shall be presented as a clear topic sentence. This shall be followed by presentation of the relevant data, quotations, references, and analysis that shows how and why the evidence presented supports the position taken in the topic sentence. Included herein is also the presentation of the comparisons with other studies, significant trends if any, uncertainties, and limitations relevant for the analysis presented.”

Page 12: Designing Influential Evaluations Session 6 Analysis &  presentation

Paragraph example

“Minimum requirements on results measurement are not consistently understood by staff. While the Grant Management Manual outlines a number of requirements on results measurement (see Table 3 above), the understanding of these among staff is mixed. On one hand, the staff survey indicated that 83 percent felt they had a clear understanding of the minimum requirements that a partner’s results framework should have. On the other, the interviews and focus groups revealed that staff felt very unsure of what the Norad/MFA approach to results measurement was and felt that individual staff members were given too much individual discretion to decide what is good enough when appraising results frameworks. Box 5 provides an illustration of some of the views we heard.”

Page 13: Designing Influential Evaluations Session 6 Analysis &  presentation

Timelines are effective

Year / Strategy or process / Objectives and highlights:

1994-1998 – Re-engineering Programme: Strategic management process, corporate scorecard, Country Strategic Opportunities Papers (COSOPs), Project Portfolio Management System (PPMS).

1995 – IFAD Vision: To provide leadership; work through partnerships; be a catalyst to mobilise resources; and develop as a knowledge organisation.

1998-2000 – Meeting Challenges in a Changing World: IFAD’s Strategic Framework 1998-2000: Five thrusts identified: (a) support projects and programmes driven by beneficiary participation; (b) create an effective portfolio management system that ensures desired field-level results; (c) ensure an effective IFAD presence and impact at the field level; (d) establish and use knowledge networks on rural poverty in order to create a recognised knowledge organisation; (e) develop human resource and management systems that support accountability, teamwork, decentralised decision making and other goals.

2000-2005 – Process Re-engineering Programme (Strategic Change Programme): To intensify IFAD’s impact and its role as a knowledge organisation. IFAD business process architecture; new IT and information systems; human resources. Introduction of the logframe and key file in project formulation; strengthening of the annual portfolio review.

Page 14: Designing Influential Evaluations Session 6 Analysis &  presentation

Table layout

Title of training course | Very effective | Effective | Slightly effective | Ineffective | Total | n=
Objectives, results and risk management in grant management | 33% | 51% | 13% | 2% | 100% | 45
Objectives, results and risk management in grant management specialisation | 50% | 36% | 11% | 3% | 100% | 36
Review and evaluation in grant management (n) | 36% (8) | 45% (10) | 14% (3) | 5% (1) | 100% | 22

Page 15: Designing Influential Evaluations Session 6 Analysis &  presentation

Summarise frequently: major findings about the policies, systems and procedures for grant management (summary box at the end of a section or chapter)

The number of Grant Scheme Rules and the variation in requirements around goal achievement, quality assurance and evaluation present a confusing and inconsistent set of procedures for staff to follow.

The minimum requirements on results measurement articulated in the Grant Management Manual are not adequate to ensure evaluability. In particular, the level of information required of partners during the planning of grants on how results will be measured is not sufficient. In addition, the content of the current minimum standards is not consistently understood by staff.

While there is reference to results measurement across a number of policies and guidelines, together the documents fail to provide a coherent body of guidance material that supports staff in the practical task of appraising results frameworks, supporting partners in measuring results and ensuring the evaluability of grants.

Page 16: Designing Influential Evaluations Session 6 Analysis &  presentation


Carry through to conclusions

Implementation of a results focus fails to ensure evaluability, partly because there is little clarity about minimum standards, but also because of pressures of time on staff and a lack of incentives to prioritise results.

Page 17: Designing Influential Evaluations Session 6 Analysis &  presentation

Conclusions and recommendations

Conclusions & lessons – what makes a good conclusion?
◦ Responds to the questions
◦ Concludes against the evaluation criteria

Recommendations – what makes a good recommendation?
◦ Concise
◦ Addressed to a specific person or organisation
◦ Indication of urgency or priority

Note: some evaluators argue that evaluations should not make recommendations.

Page 18: Designing Influential Evaluations Session 6 Analysis &  presentation

Clear, simple lessons

• Independence is critical to evaluation credibility: in addition to being directed from IFAD’s own Independent Office of Evaluation, substantial efforts were made to demonstrate full independence through the use of a steering committee and of high-level independent advisors.

• Evaluation quality matters: the analysis in the IEE final report was widely judged to have been sound and was accepted by the Executive Board, President and Senior Management Team. The imprimatur of the independent advisors helped achieve that.

• Useability of recommendations is facilitated by reliance on reform initiatives already underway: the recommendations were far-reaching, but a core of them built on reforms that had already been piloted. In this way the IEE endorsed the ideas of reformers in IFAD and gained their support.

Page 19: Designing Influential Evaluations Session 6 Analysis &  presentation

Recommendations for Norad’s Evaluation Department – designing an evaluation:

• Tighten the design specifications for evaluations. Draft terms of reference with tighter specifications for the purpose, objective and scope of evaluations so it is clear when outcome or impact is to be evaluated in addition to outputs.

• Keep evaluation questions focused. Reduce the number of evaluation questions that are to be covered by an evaluation so that resources are clearly prioritised to key results.

• Require evaluators to clearly describe the programme logic of the intervention being evaluated. All evaluations should be required to specify the programme logic, or reconstruct it if necessary, as a basis for the design.

• Be more specific in the terms of reference about the required consultants’ skills. More consideration should be given to the specific skills and expertise required of either the team leader or core team members. This would require EVAL to do more preparation up front around which evaluation designs and methods are best suited to answer the evaluation questions.

Page 20: Designing Influential Evaluations Session 6 Analysis &  presentation

Elements of communication (diagram)

The evaluation – with the analysis as the core source of information, the commissioning process and stakeholder consultation – is linked to a range of communication products:
• Full report
• Short summaries
• Topic briefs
• Statistical synopsis
• Press release
• Video
• On-line access
• Social media
• Meetings & workshops

Page 21: Designing Influential Evaluations Session 6 Analysis &  presentation

Communication options

Working in small groups, brainstorm channels of communication and the audiences they are most likely to reach. Create a table like the one below to illustrate your answer and for discussion in plenary.

Channel / Audience | Government Minister | Department civil servants | Project staff | Citizens
Formal report | | | |
Etc. | | | |
Etc. | | | |

Page 22: Designing Influential Evaluations Session 6 Analysis &  presentation

A communication strategy

Purpose: to ensure the results of any evaluation are communicated clearly, accurately and in a way that the audience can use the information.

Consider before you start…
◦ Use and useability
◦ Communication
◦ Dissemination

Page 23: Designing Influential Evaluations Session 6 Analysis &  presentation

Use and useability

Useability
• An evaluation design shapes how its outputs can be used
• Potential users know why an evaluation is taking place
• Time studies to match decision-making
• Render evidence and data for the ‘non-technical’
• Tailor material according to audience

Use – dependent on the useability of the design
• Evaluative culture and organisational context
  ◦ Findings attached to further funding
  ◦ Value and priority given to evaluation
  ◦ Organisational ‘insertion’ or position
  ◦ External influence (pressure/independence)
• User characteristics
  ◦ Forward-looking to improvement
  ◦ Involvement in design
  ◦ Capacity to respond
  ◦ Internalised purposes for evaluation

Page 24: Designing Influential Evaluations Session 6 Analysis &  presentation

Communicating effectively

• An evaluation is not complete until its communication is complete.
• Depends on where the evaluation lies on the continuum from modifying an intervention to modifying policy, re-shaping thinking about a problem and lesson-learning.
• Availability of tracking and follow-up systems.
• Whether and when to put it in the public domain, or not:
  ◦ if there are serious criticisms
  ◦ if confidentiality might be broken
  ◦ if it tackles a sensitive issue

Page 25: Designing Influential Evaluations Session 6 Analysis &  presentation

END