EQAS for operations evaluations


measuring results, sharing lessons

EVALUATION QUALITY ASSURANCE SYSTEM (EQAS)

Guidelines For Operation Evaluations

Prepared by:

WFP Office of Evaluation

Last updated: October 2015


Foreword

The Evaluation Quality Assurance System (EQAS) is one of the building blocks for the implementation of WFP’s Evaluation Policy (2008). As such, it is the primary means by which WFP’s Office of Evaluation safeguards the international evaluation principles of:

• Independence: by setting standards that increase impartiality in the evaluation process and in reporting on findings;

• Credibility: by setting standards that ensure evaluations are evidence-based and follow transparent and systematic processes; and

• Utility: by building milestones into evaluation processes for timeliness, and reporting standards to ensure accessibility.

EQAS builds on the norms and standards of the UN Evaluation Group, the OECD-DAC Evaluation Network, related tools from the Active Learning Network for Accountability and Performance, and on the wider evaluation literature and community of practice.

EQAS is a comprehensive system covering all types of evaluations: strategic, policy, country portfolio, impact, operations and synthesis evaluations. It guides all evaluations undertaken by WFP’s Office of Evaluation and its consultants and also applies to decentralised evaluations managed by Country Offices and Regional Bureaux.

EQAS is a working tool for WFP’s evaluation staff and consultants covering all stages of the evaluation cycle. It is not a comprehensive handbook on evaluation and does not replace the rich range of evaluation literature.

Initiated in 2007, EQAS is subject to periodic and systematic review and update in line with the Office of Evaluation’s evolving needs and international best practice. Updates and new materials will be incorporated as they become available.

Helen Wedgwood
Director, Office of Evaluation


TABLE OF CONTENTS

1. DEFINITION AND CRITERIA .......... 3
1.1. WFP Operations .......... 3
1.2. Operation Evaluation - Definition .......... 4
1.3. Operation Evaluation - Selection .......... 6
1.4. Operation Evaluation - Framework .......... 6
1.4.1. Evaluation criteria .......... 6
1.4.2. Evaluation questions .......... 8

2. PRINCIPLES AND ETHICAL STANDARDS .......... 10
2.1. Evaluation Principles .......... 10
2.2. Code of Conduct and Ethical Standards .......... 11

3. ROLES AND RESPONSIBILITIES .......... 12
3.1. New Operation Evaluations model .......... 12
3.2. Office of Evaluation .......... 13
3.3. Evaluation Company .......... 14
3.3.1. Evaluation Manager .......... 14
3.3.2. Evaluation Team .......... 15
3.4. Country Office (CO) .......... 15
3.5. Regional Bureau (RB) .......... 16
3.6. Other internal stakeholders .......... 16
3.7. Recourse mechanisms .......... 16

4. STEP BY STEP PROCESS GUIDE .......... 16
4.1. Preparation .......... 18
4.2. Inception .......... 26
4.3. Evaluation .......... 33
4.4. Reporting .......... 38
4.5. Dissemination and Follow-up .......... 46
4.6. Summary of process steps by evaluation phase .......... 48

5. CONTENT GUIDES AND TEMPLATES .......... 50
5.1. Content guides .......... 51
5.1.1. Content guide for the Inception Package .......... 51
5.1.2. Content guide for the Evaluation Report .......... 51
5.2. Quality checklists .......... 51
5.2.1. Inception Package Quality checklist .......... 51
5.2.2. Evaluation Report Quality checklist .......... 51
5.3. Evaluation Products Templates .......... 52
5.3.1. Inception Package Template .......... 52
5.3.2. Evaluation Report Template .......... 53
5.3.3. Evaluation Report Comments Matrix Template .......... 56
5.4. Evaluation process templates .......... 57
5.4.1. Evaluation Proposal Template .......... 57
5.4.2. Inception Package Submission Template .......... 63
5.4.3. Evaluation Report Submission Template .......... 64

Acronyms .......... 65


INTRODUCTION

1. In the context of renewed corporate emphasis on providing evidence and accountability for results under the World Food Programme’s (WFP) Strategic Plan for 2014-2017 and organisational strengthening endeavours, WFP has committed to increasing the evaluation coverage of single operations, complementing the Office of Evaluation’s (OEV) more complex evaluations of policies, strategies, country portfolios and the impact of core activities.

2. Consequently, OEV has designed a new approach for Operation Evaluations (OpEv), which foresees outsourcing the management and conduct of these evaluations. Long-term agreements have been established with reputable companies for this purpose and these guidelines, which have been designed to initiate and test this approach, are primarily intended for them. Yet, in publishing them, the intention is also to provide transparency to third parties: WFP colleagues, organisations and institutions affected by WFP evaluations and users of the evaluation results.

3. These guidelines aim to clarify the quality standards for Operation Evaluations in terms of both the evaluation process and products, and to provide guidance accordingly. As such, they deal primarily with conceptual and organisational matters rather than with the technicalities of the evaluation research process. The guidelines are structured as follows:

Definition and criteria (chapter 1) locates operations as WFP’s standard unit of intervention; defines evaluations of single operations including their intended purpose and describes the guiding framework.

Principles and ethical standards (chapter 2) presents the evaluation principles guiding WFP’s evaluation function, processes and products as well as the Code of Conduct and other ethical standards that WFP evaluation staff and associated companies and consultants should respect.

Roles and responsibilities for Operation Evaluations (chapter 3) describes the new approach and model for Operation Evaluations and the ensuing roles and responsibilities of the Evaluation manager (EM) and Evaluation team as well as other stakeholders.

Step by step process guide (chapter 4) covers the main evaluation process steps, with an emphasis on those that the evaluation company is responsible for. It is divided into five sections corresponding to the five evaluation phases. Each section begins with a description of the purpose of the phase and of the resulting products, and presents an overview of the main tasks. The tasks are then further described in a series of sub-sections. A summary table synthesises the main steps, identifying respective roles and responsibilities.

Content guides and templates (chapter 5) aim to assist the evaluation teams in preparing for, and drafting, the evaluation outputs with a clear sense of what is expected. These documents also support the company’s quality assurance of the evaluation outputs by providing a transparent reference for what is expected.

4. These guidelines draw on similar evaluation management handbooks and tools, adapting the information to the specific requirements of evaluating WFP operations. They do not aim to replace text books and other literature on evaluation.

5. First issued in October 2013, the guidelines were revised in December 2014 after the ‘proof of concept’ review of the new approach and model for Operation Evaluations as well as a workshop with the external Evaluation Managers. The latest updated version of this document is available at http://www.wfp.org/evaluation.


1. DEFINITION AND CRITERIA

1.1. WFP Operations

6. The World Food Programme (WFP) is the food assistance branch of the United Nations and the world’s largest humanitarian agency. The organisation’s mandate is to design and implement strategies, policies and activities related to food assistance with the aim of saving lives in emergency situations; improving the food security and nutrition of those unable to access sufficient food for active and healthy lives; and promoting the self-reliance of poor people and communities1.

7. The “operation” is WFP’s standard unit of intervention and its main instrument to reduce food insecurity. At any given time, WFP implements about 160-180 operations of different types and durations in about 75-80 countries where the organisation is present. There are five different types of operations:

Emergency Operations (EMOP) provide the framework for WFP to respond to natural and man-made disasters that threaten people’s lives and livelihoods. Their main objectives are to save lives, reduce malnutrition and protect livelihoods. EMOPs last on average 16 months.

Protracted Relief and Recovery Operations (PRRO) provide the framework for WFP to respond to protracted relief and recovery needs during and in the aftermath of emergencies and to support long-term refugees and internally displaced persons (IDPs). Their main objectives are to re-establish and stabilize livelihoods and food security. PRROs last on average 3 years.

Country Programmes (CP) and Development Projects (DEV) provide development assistance to assist marginalized groups affected by chronic food insecurity and under-nutrition to meet their short-term food needs in ways that build longer-term human and physical assets. These operations last on average five years.

Special Operations (SO) are implemented in support of other operations to enhance the timely and efficient delivery of food assistance through, for example, logistics augmentation or the provision of common services to the humanitarian community (common logistics or telecommunication services, or coordination of food security interventions). These operations last on average 24 months.

Figure 1: Share of operation types by overall requirements and number of operations (2013-2015)

By requirements (US$): PRRO 42%; EMOP 41%; CP/DEV 11%; SO 6%.
By number of operations: PRRO 33%; CP/DEV 28%; SO 25%; EMOP 14%.

Source: OEV based on WFP Programme of Work as of January 2015

1 See Orientation Guide on WFP for further information.


1.2. Operation Evaluation - Definition

8. An operation evaluation is an assessment of the merit and worth of an operation in relation to a set of evaluation criteria and standards of performance. In line with the OECD/DAC definition which is broadly supported in the international evaluation community, WFP defines an Operation Evaluation as:

‘An assessment, as systematic and objective as possible, of an ongoing or completed WFP operation, its design, implementation and results. The aim is to determine the relevance and fulfilment of objectives, efficiency, effectiveness, impact and sustainability. An evaluation should provide information that is credible and useful, enabling the incorporation of lessons learned into the decision-making process of both recipients and donors.’

9. Operation Evaluations can be conducted as:

Mid-term evaluations. These are undertaken only for the operations with the longest timeframe, namely country programmes and development projects.

End-of-term evaluations. These are undertaken towards the end of an operation and are timed to ensure that findings can be fed into the design of subsequent operations.

10. Operation Evaluations can be commissioned by different WFP business units and be centralised or decentralised. Regardless of the commissioning unit, Operation Evaluations follow the standards established by the Office of Evaluation.

Centralised: commissioned by the WFP Office of Evaluation, or

Decentralised: commissioned by WFP Regional Bureaux (RBs) or Country Offices (COs).

11. Like all WFP evaluations, Operation Evaluations serve the dual and mutually reinforcing purpose of:

Accountability - The obligation to account for the work carried out and the results achieved to internal and external stakeholders, including those who are affected by the operation, those who finance it and other humanitarian and/or development actors.


Figure 2: Operation Evaluations contribution to accountability

Learning - Learning means that lessons are drawn from experience, accepted and internalised in new practices to build on successes and avoid past mistakes. As such, Operation Evaluations should be envisaged as an aid to decision-making, with a view to improving the use of funds and other resources to enhance performance and results.

Figure 3: Operation Evaluations contribution to decision-making

12. Operation Evaluations are different from and complement other corporate processes contributing to operational accountability and learning, including:

Reviews. A review is an assessment of the performance of an intervention, conducted periodically or on an ad hoc basis. The main difference from an evaluation is that a review is a management instrument for operational monitoring, focussing on operational aspects and recording results, while an evaluation is a more in-depth, objective assessment of the aims, implementation and results of an intervention, with the question of attribution as its main focus.

Monitoring. Monitoring is a continuous process coinciding with the implementation of the programme, which aims, on the basis of a fixed set of indicators, to provide a regular insight into implementation progress against the plan. Monitoring is of great importance for evaluations as it provides a significant part of the data on which evaluation is based.

Audits. Audits are intended primarily to check that programme resources have been used according to the rules and usually involve examination of accounts.


1.3. Operation Evaluation - Selection

13. Operations to be evaluated are selected by OEV based on utility and risk criteria.

14. Utility criteria are used to conduct the first level of screening and ensure that evaluations are planned and conducted with the clear intent to use their results. The operations retained are those:

for which an evaluation is timely to inform future decisions on programme implementation and/or design; and

which have not been recently evaluated.

15. Risk criteria are applied to prioritise the remaining operations using the WFP risk scoring, based on a combination of operational factors (e.g. annual budget, procurement volume, etc.); external factors (e.g. security level, corruption index); and COs’ internal control self-assessment.2

16. These criteria are applied to the WFP portfolio of operations to create a shortlist of evaluable operations proportionate in size to the number of operations in the region. Regional Bureaux (RB) are then asked to prioritize these operations and make the final selection in consultation with Country Offices (COs).
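Purely as an illustration of this two-stage selection logic, the screening described above could be sketched as follows. All field names, the three-year threshold and the single composite risk score are hypothetical simplifications for illustration, not WFP’s actual risk model:

```python
from dataclasses import dataclass

@dataclass
class Operation:
    """Minimal stand-in for an operation record (illustrative fields only)."""
    name: str
    region: str
    informs_future_decisions: bool   # utility: an evaluation would be timely
    years_since_last_evaluation: float
    risk_score: float                # hypothetical composite risk score

def shortlist(operations, recently_evaluated_years=3.0):
    """Stage 1 (utility screen), then stage 2 (risk ranking), per region."""
    # Utility criteria: retain operations where evaluation is timely
    # AND which have not been recently evaluated.
    retained = [
        op for op in operations
        if op.informs_future_decisions
        and op.years_since_last_evaluation >= recently_evaluated_years
    ]
    # Risk criteria: prioritise the remaining operations by risk score,
    # grouped by region so each shortlist is proportionate to the region.
    by_region = {}
    for op in sorted(retained, key=lambda o: o.risk_score, reverse=True):
        by_region.setdefault(op.region, []).append(op.name)
    return by_region
```

In practice the composite risk score combines the many operational, external and self-assessment factors described in paragraph 15 and its footnote, and the final prioritisation is made by the Regional Bureaux in consultation with Country Offices rather than by the ranking alone.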

1.4. Operation Evaluation - Framework

17. The Operation Evaluation definition and evaluation questions refer to the five OECD/DAC evaluation criteria of relevance, efficiency, effectiveness, impact and sustainability. These criteria together with the evaluation questions (see section 1.4.2) provide a framework to assess and report findings and their transparent application to all WFP Operation Evaluations makes for consistency in approach and enables results to be compared.

1.4.1. Evaluation criteria

18. The following provides a description of the OECD/DAC evaluation criteria as they relate to WFP Operation Evaluations3. The extent to which equal attention can or will be paid to all criteria will vary per evaluation4.

19. Relevance is the extent to which the objectives of an operation are consistent with beneficiaries’ requirements, country needs, global priorities and partners’ and donors’ policies. The assessment of relevance can be complicated by the fact that an organisation’s policy, the partner’s policy, including national policies and the needs of the target group do not necessarily all perfectly coincide (particularly in complex emergency contexts).

20. More broadly, relevance also refers to the extent to which the effects of operations make a sustainable contribution to achieving the ultimate objective. An operation has been valuable, or relevant, to the extent that it has generated effects that bring about the achievement of the ultimate development/humanitarian objective.

21. Efficiency is the extent to which the cost of an operation can be justified by its results, taking alternatives into account. Efficiency illustrates the relationship between input and output and measures how economically resources (inputs) and the way they are applied are converted to results.

2 The factors include: i) Operational factors: total annual budget, procurement volume, change in procurement volume, percentage level of programme resourcing, total beneficiary numbers, complexity, number of offices, number of staff, tonnage handled, number of operations handled; ii) External factors: security level, duty station classification and corruption perception index; and iii) Internal control self-assessment: assurance statements completed by all organizational entities.

3 This description of the OECD/DAC criteria has been drawn and adapted from Evaluation Policy and Guidelines for evaluations, Dutch Ministry of Foreign Affairs, IOB Department, October 2009.

4 Other criteria should also be considered when and as relevant. These include, for example, coverage and connectedness.


22. In evaluating efficiency, both the quantity and quality of inputs and outputs are assessed. The most economical or cost-effective input is not always the most appropriate and there can be significant trade-offs between the quantity and quality of outputs. Assessing efficiency also calls for comparison with alternative approaches with which the same outputs can be achieved, which can be arduous as benchmarks are often lacking.

23. The following elements should be taken into account when evaluating efficiency:

• Have appropriate inputs been deployed at the lowest possible cost?

• Have activities been conducted in an uncomplicated manner?

• Has duplication been avoided?

• Have outputs been achieved within the planned period and budget?

24. Effectiveness refers to the degree to which the objectives of the operation are fulfilled, taking their relative importance into account. Effectiveness relates to the extent to which the direct results of operations (outputs) contribute to the sustainable achievement of the objectives (outcomes) that the operations are intended to achieve.

25. In evaluating effectiveness, the aim is to establish causality between outputs generated by the operation and the observed effects and thus the extent to which they can be attributed to the operation. Yet, the effects of an operation are not in WFP’s direct control and other factors influence the observed changes, which complicates their attribution to the operation. As counterfactuals are not available for Operation Evaluations, evaluators should make use of transparent and credible plausibility argumentation to determine attribution.

26. Three steps can be distinguished in the evaluation of effectiveness:

• The measurement of changes in the effect variables in comparison to the situation at the start (baseline);

• Attribution of the observed changes to the operation;

• Assessment of changes observed and attributed to the operation in terms of the objectives.
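Purely as a numerical illustration of how these three steps relate (the figures and the idea of a fixed attribution share are hypothetical; as noted above, attribution in Operation Evaluations rests on transparent plausibility argumentation rather than calculation):

```python
def effectiveness_assessment(baseline: float, endline: float,
                             attribution_share: float, target: float) -> float:
    """Illustrative three-step effectiveness calculation.

    In real Operation Evaluations attribution is argued qualitatively,
    not computed as a fixed share; this sketch only shows how the three
    steps relate arithmetically.
    """
    # Step 1: measure the change in the effect variable against the baseline.
    observed_change = endline - baseline
    # Step 2: attribute a (hypothetical) share of that change to the operation.
    attributed_change = observed_change * attribution_share
    # Step 3: assess the attributed change against the operation's objective.
    return attributed_change / (target - baseline)

# e.g. baseline 40, endline 70, half the change plausibly attributable,
# target 100: the operation achieved 25% of its intended effect.
```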

27. An operation is considered effective if its outputs have made a demonstrable contribution to achievement of the operation’s intended objectives. It should be noted though that an effective operation can be inefficiently implemented while an operation can be efficient without producing the intended effect.

28. Impact. While in the context of the logical framework the concept of impact usually refers to the highest level of results (the extent to which the outcomes achieved have contributed to broader, more far-reaching objectives at a higher level), the OECD/DAC defines impact as all significant effects produced by an operation, directly or indirectly, intended or unintended, on the ultimate stakeholders and third parties.

29. In the international literature on impact evaluations, the concept of impact is used for the net effects attributed to the operation, i.e. at the level of effectiveness.

30. Sustainability is the degree to which the desired effects of an operation last beyond its end. Since an activity can hardly be considered effective if the effect it has achieved is not lasting, sustainability is in fact an aspect of effectiveness.

31. The concept of sustainability comprises a variety of dimensions:

• The involvement of the recipient (e.g. through agreement on objectives) is usually recognised as a factor that affects the success of an operation;

• Institutional and capacity development plays an important role in determining sustainability;

• The degree to which measures have been taken to guarantee that activities can continue and completed works can be maintained in the future;

• When deploying technology it is important to take account of the financial and institutional capacities of the recipient and the degree to which the conditions for maintenance and renewal are guaranteed;

• Operations can have an effect on the environment in the short or long term. While short-term effects are usually envisaged and are prevented or mitigated, long-term environmental effects can neutralise positive changes or have other harmful consequences.

32. Other criteria that may be used are derived directly from the gender and equity principles and include participation, social transformation, inclusiveness, empowerment, etc. The use of these is strongly encouraged.

1.4.2. Evaluation questions

33. Evaluation questions have been standardised for Operation Evaluations (see Table 1) to ensure that operations are assessed using a similar framework, thus enhancing comparability of results. However, specific issues of local concern raised by the intended users may be highlighted. These questions should be considered giving special attention to gender and equity issues5.

5 Examples of questions used to assess gender equality can be found in table 2.4, page 30 of UNEG Integrating Human Rights and Gender Equality in Evaluation - Towards UNEG Guidance.


Table 1: Evaluation questions

(Criteria: relevance, efficiency, effectiveness, impact, sustainability.)

DESIGN

Question 1: How appropriate was the operation at design stage, and did it remain so over time?

Areas for analysis include the extent to which the objectives, targeting, choice of activities and of transfer modalities:

• are appropriate to the needs of the food-insecure population from different social groups;

• are coherent with relevant stated national policies, including sector policies and strategies (in complex emergency contexts the alignment should refer to civil society’s interests rather than government objectives);

• are coherent with WFP strategies, policies and normative guidance;

• seek complementarity with the interventions of relevant humanitarian and development partners as well as with other WFP interventions in the country, if relevant.

RESULTS

Question 2: What are the results of the operation?

While ensuring that differences in benefits between women, men, boys and girls from different social groups are considered, the evaluation analyses:

• the level of attainment of the planned outputs;

• the extent to which the outputs led to the realisation of the operation objectives as well as to unintended effects;

• how different activities of the operation dovetail and are synergetic with other WFP operations and with what other actors are doing to contribute to the overriding WFP objective in the country;

• the efficiency of the operation and the likelihood that the benefits will continue after the end of the operation.

IMPLEMENTATION

Question 3: Why and how has the operation produced the observed results?

The evaluation should generate insights into the main internal and external factors that caused the observed changes and affected how results were achieved, including, amongst others:

• Internally (factors within WFP’s control): the processes, systems and tools in place to support the operation design, implementation, monitoring/evaluation and reporting; the governance structure and institutional arrangements (including issues related to staffing, capacity and technical backstopping from RB/HQ); the partnership and coordination arrangements; etc.

• Externally (factors outside WFP’s control): the external operating environment (political and security context, evolving needs, government policies, etc.); the funding climate; donors’ perceptions and support; geographical factors; external incentives and pressures (including costs); etc.


2. PRINCIPLES AND ETHICAL STANDARDS

2.1. Evaluation Principles

34. The 2008 WFP Evaluation Policy commits the organisation to the systematic application of evaluation principles in its evaluation function, processes and products.

35. These principles of independence, credibility, utility and quality are inter-related and underpin the evaluation objectives of:

Accountability in that they provide the framework to ensure independent, credible, high-quality and useful evaluation of results, whether they are successes or shortfalls;

Learning, in so far as they require independent, credible, high-quality and useful evaluations to generate essential lessons that will help improve programme performance and outcomes.

Figure 4: Interrelated evaluation principles6

36. Independence means that the evaluation is free from influences that would bias its conduct, findings, conclusions or recommendations. WFP is committed to safeguarding the independence of evaluation and to reducing biases to the extent feasible. Independence is fundamental to ensuring the impartiality of evaluation throughout the selection, conduct and reporting of evaluations, and therefore contributes to credibility, quality and utility.

37. Credibility is the extent to which evaluation findings and conclusions are believable and trustworthy. Credibility depends on the impartiality of evaluators, evaluation processes and products, the transparency of evaluation processes and the quality of evaluation products, including the soundness of evaluation methods, the quality of data and the clear presentation of findings and conclusions. WFP is committed to ensuring the credibility of evaluation to ensure that evaluations give as accurate an assessment as possible.

38. Utility is the degree to which evaluations are useful to decision-makers within WFP and outside. Utility is achieved by: planning and conducting evaluations with the clear intent to use their results; undertaking evaluations in a timely way to inform decision-making processes; and ensuring the accessibility of evaluation insights in various parts of the evaluation process and its products. WFP is committed to ensuring the utility of evaluation so that its insights are used and recommendations accepted and implemented.

6 Source: WFP Evaluation Policy, WFP, 2008. Following the United Nations Evaluation Group peer review of WFP’s evaluation function, a revised evaluation policy will be presented to the Board in end-2015.

Read more: Evaluation Policy


39. Quality is central to ensure the credibility and utility of evaluations. It is manifest in the accurate and appropriate use of evaluation criteria, the presentation of evidence and professional analysis, the coherence of conclusions with evaluation findings and how realistic the evaluation recommendations are. It is dependent on the independence, impartiality and transparency of the evaluation process and its products. Good quality evaluations also present findings, insights and recommendations in an understandable way so that they are accessible to readers of evaluation reports. These quality standards are part of the EQAS and systematically applied to WFP evaluations.

2.2. Code of Conduct and Ethical Standards

40. As a member of the United Nations Evaluation Group (UNEG), WFP is committed to ensuring that its evaluation staff, as well as companies and individual consultants engaged in evaluation work, respect the UNEG Code of Conduct7. Its provisions, listed in box 2 below, apply to all stages of the evaluation process, from the conception to the completion of an evaluation and the release and use of the evaluation results.

Box 2: The UNEG Code of Conduct

41. In addition to the UNEG Code of Conduct provisions, the following ethical standards should also be respected when evaluating WFP operations:

42. Respect and understanding for other cultures. Evaluations are often conducted in a cultural environment different from that of the evaluators, who can therefore be expected to comply with local cultural codes during interviews and other forms of data collection.

43. Universal values. Some cultural customs and practices may be incompatible with fundamental rights and freedoms. Although evaluators should in principle show respect for other cultures, they must not ignore the effect of certain cultural values on gender relations or minorities and other specific groups. In such cases they should act in accordance with the Universal Declaration of Human Rights and the associated international agreements.

44. Gender. Evaluations should be carried out taking account of the different roles performed by men and women and the aim of changing power relations to achieve gender equality. Gender differences should be taken into account in the design and implementation of the evaluation.

45. Responsibility. In many countries a critical stance regarding the government can have severe consequences for evaluators from the country itself. In evaluations involving local evaluators it is important that such risks be recognised in time and steps taken to avoid them, for example, by making clear agreements about who is responsible for the content of the report.

7 The Code of Conduct forms part of the LTA between WFP and evaluation companies. It should also be signed and adhered to by individual consultants hired by companies to provide evaluation services to WFP.

UNEG Code of Conduct for evaluation in the UN System:

Independence

Impartiality

Conflict of interest

Honesty and integrity

Competence

Accountability

Obligation to participants

Confidentiality

Avoidance of harm

Transparency

Accuracy, completeness and reliability

Omissions and wrongdoing

46. Assessment of individuals. Evaluations are not assessments of individuals. Yet evaluation reports can include findings on leadership and the management or other qualities of implementers that come close to personal assessments. Evaluators should then make every effort to present the facts of easily traceable cases in an abstract form.

3. ROLES AND RESPONSIBILITIES

3.1. New Operation Evaluations model

47. In the context of renewed corporate emphasis on providing evidence and accountability for results under the Strategic Plan 2014-2017 and the Framework for Action8, WFP has committed to increase evaluation coverage of single operations. This complements OEV’s core work programme of complex evaluations of policies, strategies, country portfolios and of the impact of core WFP activities, which cover multiple operations per evaluation.

48. In OEV’s longer-term vision, evaluations of single operations will be decentralized. However, it will take time to put in place the organizational infrastructure and capacities needed to achieve decentralized evaluations of the necessary quantity and quality to provide robust evidence on the performance of WFP operations, alongside other initiatives. In the meantime, for the period 2013 to 2016, OEV has been mandated to commission up to 15 Operation Evaluations annually (see figure 5).

49. OEV has opted for a ‘highly outsourced model’ to achieve three goals over time:

Short-Term: The evaluation model focuses input from Country Offices and Regional Bureaux on evaluation content and learning in the process, not on evaluation management.

Medium-Term: The learning and accountability gap on evaluations of single operations is filled.

Long-Term: Establishment of a sustainable decentralized Operation Evaluations system.

8 See ‘Orientation Guide on WFP, Section 1.3.


Figure 5: The Operations Evaluation approach

50. Given the evaluation learning objective, the evaluation manager and team will promote stakeholders’ participation throughout the evaluation process, within resource and timing limits. Involving stakeholders at all stages of the evaluation, including focussing the evaluation, shaping the questions to be addressed, identifying credible sources of evidence, reviewing findings and assisting in their interpretation, increases the credibility and potential usefulness of evaluation results. Yet, to safeguard the independence of the evaluation, WFP staff will not be part of the evaluation team or participate in meetings with external stakeholders if the evaluation team deems that their presence could bias the responses.

51. The roles and responsibilities of the various parties listed below are in line with the design principles of the Framework for Action, the intent of the 2012 Monitoring and Evaluation Strategy and associated strengthening of WFP’s business processes.

3.2. Office of Evaluation

52. The Office of Evaluation is responsible for setting the standards for Operation Evaluations, developing guidance accordingly, selecting the operations to be evaluated, commissioning the evaluations, ensuring that the model is regularly reviewed and the approach adapted (notably through a proof of concept after year one) and that an annual synthesis report of all Operation Evaluations is presented to WFP’s Executive Board for consideration.

53. With respect to each individual evaluation, the designated OEV focal point will:

Contract management. Select and contract the external evaluation company on the basis of TORs developed in consultation with stakeholders9; manage the overall contractual relationship with each company, including the release of payments to the company in line with the contract between it and WFP.

9 While OEV has been responsible in year one for setting up the evaluation and drafting the evaluation TOR, this responsibility may shift to the company in subsequent years.

Share standards and guidance. Enable the company to deliver a quality process and report by providing them with the Evaluation Quality Assurance System (EQAS) for Operation Evaluations (including process guidance and templates) and an orientation guide on WFP policies, strategies, processes and systems as they relate to operations; and by facilitating initial communications between the company and the WFP stakeholders.

Approve the evaluation products, in particular the inception package (checking for completeness) and the evaluation report; submit the final report to an external post-hoc quality assessment process to independently report on the quality, credibility and utility of the evaluation; and provide feedback to the evaluation company accordingly.

Dissemination and follow-up. Publish the final evaluation report (together with its overall quality assessment rating and management response) on the WFP public website and incorporate findings in other lessons-learning platforms, as relevant. Conduct a feedback survey among WFP internal stakeholders and the evaluation manager and team to document what went well or could have been improved and to gather perceptions about the evaluation process, the engagement and performance of the various stakeholders and the quality and utility of the evaluation. Feedback will be used to improve the approach, as required, and to enhance the quality of future evaluations.

3.3. Evaluation Company

54. Under the new Operation Evaluations model, long-term agreements (LTA) have been established with reputable companies to manage evaluations (team selection, contracting, process management and quality assurance).

55. The company will appoint an evaluation manager in line with the LTA. Any nomination of a new Evaluation Manager should be discussed and agreed upon with OEV. To ensure a rigorous review of evaluation deliverables, the evaluation manager should not be part of the evaluation team.

56. The Evaluation Manager and the evaluation team members provided by the company will not have been involved in the design, implementation or M&E of the operation, nor have other conflicts of interest or bias on the subject. They will act impartially and respect the UNEG Code of Conduct.

3.3.1. Evaluation Manager

57. The Evaluation Manager is responsible for managing the evaluation process within the given budget and meeting the expectations spelt out in the TOR, delivering timely evaluation products that meet the OEV EQAS standards. In particular, the Evaluation Manager will be responsible for:

58. Team selection and support. Select an evaluation team that meets the requirements of the evaluation TOR. Ensure the team is gender-balanced and has the appropriate level of gender expertise for the evaluation subject. Mobilise and hire the evaluation team and provide administrative backstopping (contracts, visas, travel arrangements, consultants’ payments, invoices to WFP, etc.). Orient team members on EQAS and the evaluation requirements, provide them with relevant documentation and generally advise on all aspects of the evaluation to ensure that the evaluation team is able to conduct its work.

59. Communication. Act as the main interlocutor between WFP stakeholders and the evaluation team throughout the evaluation and generally facilitate communication and promote stakeholders’ participation throughout the evaluation process.


60. Quality assurance. This refers to the reliability, traceability and efficacy of the evaluation management process and also the professional rigour and standard of the resulting outputs from these processes. The Evaluation Manager will ensure that the evaluation proceeds in line with EQAS, the norms and standards and code of conduct of the profession and that deadlines are met. The Evaluation Manager will also ensure that a rigorous and objective quality check of all evaluation products is conducted ahead of submission to WFP. As part of the quality assurance, the Evaluation Manager should ensure that all evaluations meet the UNEG gender and equity-related norms and standards in order to integrate gender and equity dimensions.10 This is assessed and reported on an annual basis through the United Nations System-Wide Action Plan Gender Equality and the Empowerment of Women (UN-SWAP).11

3.3.2. Evaluation Team

61. In OpEvs the responsibilities of the Evaluation Team encompass the following:

62. Process and product delivery. Implementing an evaluation process that honours the UNEG Code of Conduct and the norms of the profession and that engages evaluation stakeholders during all evaluation phases to cultivate ownership and evaluation utility; delivering on time evaluation products that align with WFP standards and requirements.

63. Communication. Thoroughly preparing all interactions with key evaluation stakeholders (teleconferences, email correspondence and face-to-face meetings) and coordinating the planning of these with the Evaluation Manager.

64. Quality assurance of evaluation deliverables. Aligning all deliverables with the WFP EQAS standards by conducting quality assurance on the Inception Package and the draft and final evaluation reports, using guidance from the WFP evaluation technical notes (e.g. gender, evaluation matrix, recommendations, formatting), and ensuring that evaluation products meet UNEG norms and standards for evaluation. The evaluation team will also ensure that deviations from the evaluation process plan established in the Inception Package and TOR are brought to the immediate attention of the Evaluation Manager and that alternative courses of action are identified.

3.4. Country Office (CO)

65. Facilitate the evaluation process. In particular the CO will:

Assign an evaluation focal point to liaise with the OEV focal point during the preparation phase and with the Evaluation Manager and team thereafter;

Gather key operational documentation and data necessary to the evaluation during the preparation phase and provide this information to the evaluation manager and team before the start of the inception phase.

Support the preparation of the evaluation mission, including: issuing invitation letters for visas; arranging for interpretation if required; preparing a mission agenda in line with the inception package requirements; facilitating the team’s contacts with local stakeholders; organizing internal and external meetings (including a briefing and two exit debriefing sessions); organizing site visits; and providing logistics support during the field work (including transportation).

10 The aim of equity is not to eliminate all differences so that everyone has the same level of income, health, and education. Rather, the goal is to eliminate the unfair and avoidable circumstances that deprive people of their rights. Inequity can result from a wide range of factors, including political, social, economic, gender, religious, cultural, ethnic, linguistic and geographical factors. In evaluating equity, the aim is to assess what worked and what did not work to reduce inequity, and to highlight intended and unintended results for worst-off groups as well as the gaps between best-off, average and worst-off groups. Source: UNICEF, “How to design and manage Equity-focused evaluations”.

11 For more information on the UN-SWAP, please consult: http://www.uneval.org/document/detail/1452

66. Engage as a stakeholder. Participate in a number of discussions with the evaluation team on the evaluation design and on the operation, its performance and results. Individual CO staff members will participate in interviews in their area of responsibility. In addition, CO staff should participate in the evaluation team briefing and debriefing (possibly done in the form of a workshop).

67. Comment on the TORs, inception package and draft evaluation report. Participate in teleconferences with the evaluation manager and team on the evaluation products, as necessary.

68. Prepare a management response to the evaluation, implement it, and use the evaluation findings to inform subsequent project design.

3.5. Regional Bureau (RB)

69. The RB will assign a focal point for the evaluation to liaise with the OEV focal point during the preparation phase and with the company Evaluation Manager thereafter, as required. RB staff should be available to the evaluation team to discuss the operation, its performance and results, to comment on the TORs, inception package and draft evaluation report and to participate in briefing and debriefings.

70. The RB will also oversee the evaluation management response and track the implementation of the recommendations.

3.6. Other internal stakeholders

71. While less directly concerned, some HQ divisions might, as relevant, be asked to discuss WFP strategies, policies or systems in their area of responsibility and to comment on the evaluation TOR and report. These include amongst others the Policy and Programme Division, the Nutrition Division, the Emergency Preparedness and Support Response Division, the Gender Office, the Supply Chain Division and the Government Partnerships Division.

3.7. Recourse mechanisms

72. In cases where either the Evaluation Manager or the CO/RB encounters challenges with the management, conduct and/or support received for the evaluation that are likely to compromise its timely, high-quality completion, the issues should be raised with the OEV focal point.

4. STEP BY STEP PROCESS GUIDE

73. This section provides a step-by-step guide for each individual evaluation, which is expected to last five to seven months. It presents the purpose, tasks and outputs of the five evaluation phases: 1) Preparation; 2) Inception; 3) Evaluation; 4) Reporting; and 5) Dissemination and Follow-up.



74. The step-by-step process guide covers the main evaluation process steps, with an emphasis on those for which the evaluation company is responsible, in particular in relation to evaluation management.

75. It is divided into five sections corresponding to the five evaluation phases. Each section begins with a description of the purpose of the phase and of the resulting products, and presents an overview of the main tasks. The tasks are then further described in a series of sub-sections. A summary table synthesises the main steps, identifying respective roles and responsibilities.

76. The last section (section 4.6) consolidates the summary tables to present an overview of all the evaluation process steps.


4.1. Preparation

77. The purpose of the Preparation Phase is to define and design the evaluation and recruit the Evaluation Manager and Team. The main output of the preparation phase is the Terms of Reference (TOR), which provide the first substantive overview of the planned evaluation.

78. Preparation tasks take about two months but should ideally be conducted well in advance of the planned start of the evaluation, notably to ensure sufficient notice so that suitable evaluators can be recruited.

79. In this phase, some tasks are the responsibility of the OEV focal point and others are the responsibility of the Evaluation Manager or the CO. The main steps are as follows:

Table 2: Summary of Key steps – Preparation Phase

80. The earlier gender and equity approaches are incorporated into the evaluation thinking, the higher the chances that they will be thoroughly analysed during its implementation. The OEV focal point and the Evaluation Manager should use this preparation phase to incorporate gender and equity in the evaluation during its planning and preparation stages.



Holding discussions with the CO and RB

81. The OEV focal point is responsible for initiating a discussion with the concerned CO and RB in order to present the OpEv model and seek CO/RB’s feedback on: 1) the preferred timing for the evaluation (with respect to when the findings would be required to feed into project design processes and in relation to any possible restrictions around the timing of the evaluation mission); 2) the scope and key issues for the evaluation; 3) the required skills given the operation’s portfolio to inform team composition; 4) potential access constraints; and 5) availability of M&E data. During this call, the OEV focal point also shares the list of key operational documentation and data to be gathered by the CO during the preparation phase. The outcome of this initial discussion is documented in a Note-for-the-Record.

Preparing the Terms of Reference

82. The designated OEV focal point is responsible for drafting the evaluation TOR following a review of key documents and preliminary discussions with stakeholders (CO and RB)12. The TOR will follow the OEV proforma for Operation Evaluations, adapting it to the specific characteristics of the operation and to the concerns of the intended users (CO and RB). Figure 6 reproduces the TOR table of contents.

83. The TOR is the master reference document for the evaluation throughout the process. It constitutes the main instrument for informing stakeholders of how the evaluation will unfold and for instructing the company on what is expected of it in this assignment. The evaluation is to be conducted in conformity with the TOR.

84. The TOR provides information on the reasons for conducting the evaluation; identifies the competencies and technical skills needed; clarifies the roles and responsibilities of different parties and core stakeholders; and charts out time frames. While the TOR sets broad parameters for the evaluation approach and methodology, these elements will be elaborated further by the team in the inception phase. Most importantly, the TOR also includes:

The evaluation key questions. While these are standard across Operation Evaluations (see section 1.4.2) to ensure that operations are assessed using a similar framework, issues of local concern raised by the intended users may be highlighted.

A timetable for the operation evaluation process. This covers all essential steps, together with proposed dates, including those of the evaluation mission, the submission of the evaluation products and commenting periods. This provides a means of ensuring that a clear roadmap is presented and agreed upon with the main stakeholders when the TOR is finalised.

12 In subsequent years, the responsibility for drafting TORs may shift to the evaluation company.

Figure 6: TOR table of contents


Examples of documents to be gathered (click here for the full list):

Operational documents, including country strategies, assessment reports, programme documents, M&E reports, and studies by the CO and its partners.

Project plans: beneficiary needs (ben/mt/cash) and other outputs as in the project document.

Distribution reports.

Maps (operational, food security).

Information on the CO structure, such as location and number of sub-offices, organigrams, etc.

Procurement and logistics reports (e.g. pipeline, projected needs).

Notes for the Record of coordination meetings (cluster, donor groups, etc.).

Information related to partners (field-level agreements; memoranda of understanding; lists of partners by activity and location).

85. The TORs are considered final once stakeholders’ comments have been sought and integrated as deemed appropriate by the OEV focal point. The TOR comments and revision process is recorded by the OEV focal point in a “comments matrix” (see figure 7) with a view to enhancing the transparency and credibility of the evaluation process.

86. The TOR may be shared in draft form with the company(ies) selected for the evaluation to allow early selection of the evaluation team. OEV will attempt to do so no later than two months before the planned start of the evaluation.

Gathering key operational documentation

87. Following the initial discussion with OEV, the CO focal point is responsible for gathering all relevant operational documents covering the full spectrum of the project cycle (see box above). Any other available CO-specific document or dataset should be shared. For this purpose, OEV will share with the CO focal point guidance on how to upload operational documents to Box, a WFP file-sharing service similar to Dropbox. This minimizes the use of emails and separate document sharing software (see para 107 for further details).

Figure 7: TOR Matrix of comments (example: Matrix of comments, OpEv Ethiopia PRRO Terms of Reference)

Reviewers: John Smith, ETH CO HoP; Janine Bianca, OMN Prog Advisor

1. JS, para 5, section 2.1. Added: “The current PRRO is due to end in June 2015, in tandem with the current phase of the Productive Safety Net Programme (PSNP); therefore, the Country Office will be re-designing a new PRRO phase in 2014, encompassing a new PSNP which will in turn impact humanitarian relief assistance since there are initial discussions to include all chronically food insecure caseloads (i.e. those that have received relief for the previous three or five years) in a new PSNP. These developments will then impact the parameters of a new PRRO post-June 2015.” OEV focal point: Accepted.

2. JB, para 7, section 2.3. Added: “follow-on projects in Ethiopia as well as”. OEV focal point: Accepted.

3. JS. Deleted: “see table 1 for preliminary description of each stakeholder group”. OEV focal point: Accepted.

4. JS, para 8, section 2.3. Added: “the Government (and”. OEV focal point: Accepted.

5. JS, para 11, section 3. Deleted: “builds on its predecessor (PRRO 106650) and”. Comment: “I don’t see the value of mentioning the old PRRO that ended Dec 2011 and has already been evaluated; docs are available for review.” OEV focal point: Rejected. The PRRO builds on a previous operation, i.e. was not designed from scratch. In fact, there was an evaluation of this operation and the evaluation team will need to review the extent to which the recommendations were taken into consideration in the design of the follow-up PRRO.


Questions to consider when selecting an Evaluation Team

Is the Team Leader sufficiently experienced, and has he/she demonstrated his/her ability to manage a team?

Are the evaluators credible in terms of competence and reputation?

Does the evaluation team have the necessary mix of skills, experience and country knowledge for this evaluation?

Do the evaluators have appropriate interpersonal skills?

Do the evaluators have the ability to remain balanced, impartial, objective and constructive?

Will the evaluators be able to function effectively as a team?

Will international evaluators have unrestricted access to all operational areas?

Selecting the evaluation team

88. The Evaluation Manager is responsible for selecting the Evaluation Team. The selection of evaluators is an important step and one of the main determinants of the quality and credibility of the evaluation.

Evaluators should be selected well before the evaluation takes place as recruiting evaluators at the last minute may reduce prospects for obtaining a good evaluation.

Participation of national experts and women in evaluation teams is critical.

89. In order to provide a multi-disciplinary perspective on operation results, evaluation teams will usually consist of three or four members, including the team leader. The skills and qualifications needed by the evaluators will vary from case to case, but the following are important considerations:

Evaluation expertise. This is a package of skills including the conceptual and methodological skills required for successful evaluation research. It also includes the communicative skills necessary for creating rapport with stakeholders, facilitating stakeholder participation, and effectively presenting evaluation results to diverse audiences. Organisational skills are also necessary for planning and managing an evaluation research process involving many people.

Technical expertise. The required technical expertise will vary from operation to operation (with the exception of gender, which is a requirement across all OpEvs) and will be detailed in the TOR for each evaluation. The Evaluation Manager should consider the level of gender expertise necessary for the evaluation subject.

Local knowledge. A good understanding of local conditions is often critical to the evaluation success and the evaluation team will benefit from evaluators with local knowledge and language skills to successfully interact with stakeholders.

Organisational knowledge. A consultant familiar with WFP will spend less time learning about the organisation but may be less inclined to challenge its prevalent beliefs. Hence, a mix of evaluators with and without WFP experience is considered to be optimal.

Soft skills. The ability of the team leader and team members to engage with multiple stakeholders, communicate clearly, foster participation, and encourage dialogue is also key to the success of an evaluation.

Independence and absence of bias. Independence from the object of evaluation and freedom from bias are important requirements and key determinants of the credibility of the evaluation.


90. When selecting the evaluation team members, the Evaluation Manager may select consultants using the following considerations:

Demonstrated track record. However, it is important not to rely entirely on a potential consultant’s authorship of a previous evaluation. Contacting previous clients/employers allows the consultant’s track record and actual authorship to be verified.

Interviews (by telephone or face-to-face) with the consultants concerned (especially Team Leader) and/or references. Do not rely solely on written references.

Whether the consultants selected have worked together in the past and can function effectively as a team.

Larger teams may bring problems and risks outweighing the benefits of their wider range of expertise. Larger teams are more work for the Evaluation Manager and team leader and also pose a problem in insecure environments.

91. Conflict of interest. A conflict of interest occurs when, because of a person’s work history or possibilities for future contracts, the consultant’s ability to provide an impartial analysis is compromised.

92. For Operation Evaluations, conflicts of interest are those in which consultants could: i) influence the analysis or recommendations so that they are consistent with findings previously stated by themselves (upstream conflict of interest); or ii) artificially create favourable conditions for consideration in a downstream assignment (downstream conflict of interest). The following rules apply to the selection of consultants:

Consultants should have had no prior involvement or close family involvement in the design, implementation, decision-making or financing stages of the operation being evaluated.

Consultants should agree not to work with the concerned country office for a period of six months after the end of the evaluation.

Preparing and agreeing the evaluation proposal

93. The Evaluation Manager is responsible for submitting a proposal for the evaluation to the OEV Focal Point. This proposal should follow the template (reproduced in section 5.4.1). It will focus on:

Team composition. Present the proposed evaluation team arrangement; provide brief profiles of the evaluation team members and their CVs; describe their respective roles in the evaluation; and explain how the team’s competencies meet the TOR requirements. The composition of evaluation teams should be gender-balanced. The TOR defines the level of gender expertise needed within the evaluation team and its responsibilities in this regard.

Mechanisms in place to ensure quality. Present the quality assurance process that will be applied to the evaluation; include information on the evaluation manager (which should be consistent with the LTA proposal13); as well as profile information on possible additional reviewers mobilised to support the quality assurance process.

Budget. The budget presented should reflect the rates agreed between WFP and the company in their LTA for operation evaluation services. It will detail: 1) the management fees applicable to this evaluation14; 2) remuneration and number of days per team member and evaluation phase; 3) translation costs, as necessary; and 4) travel costs, based on economy travel and not exceeding WFP per diem (DSA) rates. The need to budget for local travel costs will be specified in each evaluation TOR. Should the estimated travel costs specified in the evaluation TOR appear to be significantly under- or over-estimated, the evaluation budget can be revised downwards or upwards following the finalization of the inception package.

13 In cases where this is not possible, the EM should be selected in coordination/agreement with the OEV/OpEv team.

94. The evaluation proposal falls under the terms of the existing Long-Term Agreement between the company and WFP. As such, it is not a new tender and is not subject to WFP’s procedures for new procurement. It can be discussed with OEV before or after submission, as required. Revisions may be required until agreement is reached on the team composition, quality assurance mechanisms and budget for the evaluation. If no agreement is reached within the deadline set for each evaluation, OEV may allocate the evaluation to another company. Before approving the final proposal, OEV will consult the CO/RB with regard to the suggested team composition.

Contracting

95. Contracting the company. Once agreement is reached on the proposal, OEV will liaise with the WFP Procurement Division to issue a Purchase Order (PO) for the evaluation. The PO acts as the contract for the services expected to be performed.

96. The final TOR and evaluation proposal agreed between the company and OEV will be annexed to the PO and considered binding documents. The PO will also specify the payment conditions and schedule as follows:

20% upon signing of the contract
20% upon receipt of the complete inception package
60% upon acceptance of the final evaluation report

97. In exceptional cases with unusually high up-front costs (e.g. for travel or security), OEV may agree to a different schedule of payments, such as 20/30/50. Such exceptions must be justified.15
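The tranche arithmetic in paragraphs 96–97 can be sketched as follows. This is a minimal illustration, not part of the EQAS procedures; the function name and the rounding to cents are assumptions.

```python
def tranches(contract_total: float, schedule=(0.20, 0.20, 0.60)) -> list:
    """Split a contract total into payment tranches.
    Default is the standard 20/20/60 schedule; an exceptional
    schedule such as 20/30/50 can be passed instead.
    (Function name and rounding behaviour are assumptions.)
    """
    if abs(sum(schedule) - 1.0) > 1e-9:
        raise ValueError("schedule shares must sum to 100%")
    return [round(contract_total * share, 2) for share in schedule]
```

For a US$100,000 contract, the default schedule yields tranches of US$20,000, US$20,000 and US$60,000; passing `(0.20, 0.30, 0.50)` models the exceptional schedule of paragraph 97.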

98. Contracting the evaluation team. This should be done by the Evaluation Manager on the basis of clear job descriptions reflecting the requirements spelt out in these guidelines, in the evaluation TOR and in the Long-Term Agreement. The UNEG Code of Conduct for the evaluation profession should be attached to the contractual documents, as evaluators are required to abide by it (see section 2.2), as should the international civil servants' ethics code16 and the Confidentiality Agreement (Annex III of the Long-Term Agreement).

Requesting initial payment

99. Upon receipt of the PO, the Evaluation Manager will send an invoice corresponding to 20% of the total contract value to [email protected], copied to the OEV focal point and the OEV Senior Administrative Assistant (Kathryn Bell-Greco). Payment will be processed in line with the terms and conditions specified in the Long-Term Agreement.

14 In the LTAs for Operation Evaluations, three rates were established for the management component, depending on the categorisation of the budget of the operation, as follows: below US$88 million = small; between US$88 million and US$275 million = medium; above = large. Regional Operations covering more than one country will be considered large.
15 WFP Procurement Division's default for contracts for the provision of services is that 100% of payment is due on satisfactory completion of the services – i.e. 0/0/100. An adjustment has already been made to meet the exigencies and market practice for evaluation services.
16 Available at http://www.ilo.org/public/english/ethics/standards.pdf

Handing-over the evaluation management to the company

100. Upon contracting, the OEV focal point will hand over the management of the evaluation to the company's Evaluation Manager and then disengage. The hand-over takes the form of a teleconference chaired by the OEV focal point and including the Evaluation Manager as well as the CO and RB focal points. Before the teleconference, the OEV focal point will share the following documents:

The EQAS guidance (this guide);

CO/RB OpEv guidance;

The WFP Orientation Guide, including relevant corporate policies and strategies and related normative guidance;

Contact details of the CO and RB focal points;

Guidance on accessing the Operation Evaluations Extranet (see below).

101. The hand-over teleconference will provide an opportunity to introduce the Evaluation Manager to the evaluation focal points at CO and RB levels and to discuss and clarify the following:

The roles and responsibilities under the outsourced model of the evaluation manager, the evaluation team and stakeholders, as well as communication lines and conflict resolution modalities.

The quality standards and arrangements for quality assurance, including the expectations for the evaluation (evaluation process and steps, quality standards for the process and the evaluation products) as well as the processes in place for review and approval of outputs and related schedules.

The purpose and subject of the evaluation, referring to the TOR; the expectations from the intended users, including in relation to the evaluation process; and related areas that still need clarification.

The CO will provide an update on the status of the operational documentation consolidation.

Registering on the Operation Evaluation Extranet

102. An extranet has been set up to support information sharing related to Operation Evaluations. It provides guidance documents, including content guides and templates, and has a dedicated page for each evaluation. The EM should remind team members to log in to the extranet in order to access guidance documents as well as the key reference materials that are hyperlinked in the WFP Orientation Guide.


103. Access to the extranet is limited to registered users (Evaluation Managers, Evaluation Team Members and CO and RB focal points) and is valid only for the duration of the evaluation. The OEV focal point will provide registration information at the start of the evaluation. The extranet is available at: http://operation.evaluation.wfp.org


4.2. Inception

104. The purpose of the inception phase is to ensure that the evaluation team develops an in-depth understanding of the evaluation TOR and that stakeholders have a common understanding of what the evaluation is about, how the work is to be performed and organised, who is to do what, what is to be produced and when deliverables are expected.

105. The main output of the inception phase is the Inception Package (IP), which is the operational plan for the evaluation and provides those most closely involved in the evaluation with an overview of its planning.

106. The inception phase can take up to two months depending on the complexity of the operation. In this phase, the main steps are as follows:

Table 3: Summary of Key steps – Inception Phase

Gathering key documents and set up of document library

107. The CO/RB are encouraged to directly upload the background documents required for the evaluation on Box, a WFP-authorised web-based file-sharing platform. The documents will be shared with the evaluation manager and team leader through a URL sent by the OEV focal point.

108. The Evaluation Manager will:

Gather from external sources documents related to the country context such as policies, strategies and programmes of the Government and other actors as well as relevant reports and studies from think tanks and research institutions.

Review the list of operational documents previously gathered by the CO (see para 88 and related box) and liaise with the CO focal point(s) to ensure that any other available CO-specific documents or datasets are shared. The Regional M&E Officer can also provide guidance on potential additional resource information.

Consolidate both sources of information and organize a document library for the evaluation team, prioritizing the most critical documents.

Team orientation

109. The Evaluation Manager should organize an orientation session with the Evaluation Team to discuss:

The purpose and subject of the evaluation referring to:

o The TOR and related additional information obtained by the Evaluation Manager from interaction with key stakeholders (in particular the OEV and CO focal points).

o Expected outputs.
o Scheduling and deadlines.

The expectations from the intended users, specifying:

o Participation.
o Areas that still need further clarification and the steps to be taken in response.

The roles and responsibilities under the outsourced model of:

o the evaluation manager
o the team leader
o the team members
o the stakeholders

The quality standards and arrangements for quality assurance.

o The EQAS guidance (this guide).
o Process steps as well as the templates for evaluation products and related quality standards.
o The expectations for the evaluation (evaluation process and steps, quality standards for the process and the evaluation products).
o Review and approval processes.

The corporate and operational documents available:

o The WFP orientation guide highlighting that it provides a quick orientation to WFP, its Mission Statement, Strategic Objectives, programme categories and activities.

o Relevant corporate policies and strategies and related normative guidance.
o Documents gathered on the operation and its context.
o Missing documents and data gaps.

The communication lines and conflict resolution modalities.

Next steps including purpose and timing of the introductory teleconference, drafting of the Inception Package and inception teleconference.


Establishing a working relationship with stakeholders – Introductory teleconference

110. The Evaluation Manager and Team Leader should seek to cultivate the CO and RB's participation throughout the entire evaluation process, as this contributes to the realisation of the learning objective of the evaluation and is essential to secure the CO and RB's cooperation and support.

111. Early in the inception phase, the Evaluation Manager will hold an introductory teleconference with the evaluation focal points at CO and RB levels and the team members, seeking to create momentum around the evaluation and initiate a direct relationship between the evaluation team and the core stakeholders. Initial interactions should aim to:

Present the evaluation team (share their CVs/bios).

Understand the CO and RB's perspectives and concerns related to the operation under evaluation and its implementation.

Check for any issues arising from the TOR regarding TOR content, team composition or timeline and agree how they will be resolved.

Review respective roles and responsibilities as outlined in Section 7 of the Evaluation TOR, confirm mutual expectations and add any particular items or challenges anticipated regarding this specific evaluation.

Confirm logistical requirements, notably how the CO will help with letters of invitation, field appointments, briefings by UN field security officers, transport in the field, and assistance with booking accommodation in the field during the mission.

Clarify communication arrangements (preferred communication channels, regularity of interactions, who else should be involved/copied, through what means and how often stakeholders want to be informed of progress, etc.).

Confirm next steps, timing and agree on forthcoming inputs required from the CO and other stakeholders, including a status update on the consolidation of key operational documents (if not completed at preparation phase).

A 2-page Summary TOR may be prepared as a tool for communicating with in-country stakeholders, particularly when the language of the TOR differs from national language(s) of the country.

Advance planning of the logistics arrangements (e.g. invitation letters for visa applications) is recommended.

Desk review of key documents

112. The evaluation team members are expected to conduct a thorough desk review and analysis of existing documentation concerning the operation and associated relevant literature and data. Each evaluation team member should develop a complete understanding of the documented evidence/information concerning his/her part in the Operation Evaluation. This level of preparation is essential to ensure best use of the time in the field which should focus on gathering additional information and data to complement and validate the findings from the desk review.

Technical Briefing

113. Once the evaluation team is familiar with the content of the WFP Orientation Guide and key operational documents, the Evaluation Manager should approach WFP Regional Programme Advisers, requesting them to brief the evaluation team members on specific subject matters as required (M&E, nutrition, emergency response, cash and vouchers, school feeding, social protection, capacity development, etc.).


Evaluation approach and methods

Chapter 5 of the 2013 ALNAP pilot guide on Evaluation of Humanitarian Action provides a thorough overview of evaluation design and methods and can help evaluators and evaluation managers assess the pros and cons of different evaluation approaches and methods.

The Inception Package provides the following opportunities:

For the Evaluation Manager to assess how the team understands and plans to approach the evaluation.

For the evaluation team to turn the TOR into an achievable plan that is agreed with the evaluation manager.

For the evaluation team to seek clarification of the TOR and highlight tensions that need to be resolved (e.g. conflicting expectations within the organisation concerning the evaluation).

For the evaluation team to plan its work in a coherent way and clarify what they can cover and what they cannot.

For other stakeholders to receive a clear statement of intent by the evaluation team so that they can quickly flag any issues with the proposed approach.

From: Evaluation of Humanitarian Action, ALNAP, 2013.

Preparing the Inception Package

114. The Inception Package is produced by the evaluation team under the responsibility of the team leader. It assures the Evaluation Manager and stakeholders in the evaluation that the team has a good grasp of what is expected. It ensures ownership by the team of the evaluation process and a shared understanding between the team and WFP stakeholders (OEV/RB/CO) about the expectations of the evaluation and quality standards. See section 5.1.1 for the inception package content guide, section 5.2.1 for the quality checklist and section 5.3.1 for the template.

115. The Inception Package forms the agreement between the Evaluation Manager, the Evaluation Team and WFP stakeholders on the operational plan for the evaluation. Fundamental issues are those that affect the evaluation methodology and fieldwork. Any disagreement between the Evaluation Manager and the Evaluation Team (leader) on such issues will have to be resolved before the Inception Package is considered complete and shared with WFP stakeholders.

116. The Inception Package is made up of a set of tools and includes: a country context and operational factsheets, an operational map, an analysis of stakeholders as well as information on the methodology and organisation for the evaluation.

117. Methodology. WFP requires that a sound methodology combining mixed methods be applied. This combination relates to the design of the evaluation and to the qualitative and quantitative methods and techniques of data collection and analysis. The evaluation matrix and data collection tools should be presented in the Inception Package.
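As a purely illustrative sketch of how a team might hold evaluation-matrix entries while drafting the Inception Package (the official template is the one in the guide; all field names and values below are hypothetical):

```python
# Hypothetical structure for one evaluation-matrix entry: for each
# evaluation question, the sub-questions, the data needed to answer
# them, and the collection methods. Field names are illustrative only.
matrix_entry = {
    "question": "To what extent did the operation achieve its planned outputs?",
    "sub_questions": [
        "Were food distributions delivered on schedule?",
        "Did coverage match the targeting plan?",
    ],
    "data_required": ["distribution reports", "M&E monitoring data"],
    "collection_methods": ["document review", "key informant interviews"],
}

# A simple completeness check before the inception teleconference:
required_fields = {"question", "sub_questions", "data_required", "collection_methods"}
assert required_fields <= matrix_entry.keys()
```

Keeping each question paired with its data sources and methods in this way makes it easy to verify that every evaluation question has an evidence plan behind it.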

118. The methodological choices will be determined by the extent to which the methods lead to the collection of reliable data and provide a basis for reaching valid and reliable judgments.


The methods and techniques used can vary per evaluation. Ensure the methods employed are appropriate for analysing gender and equity issues identified in the evaluation scope and that gender and equity aspects are integrated into the evaluation criteria. (See Technical Note on gender)

119. The validity of an evaluation is determined by the extent to which the design of the evaluation, the method of data collection and the analysis of the data lead to replicable answers to the questions posed in the TOR. The evaluation design and methods should be based on an assessment of the pros and cons of different evaluation approaches and methods.

Organisation of the evaluation. The following elements will be provided: team composition, the consultants' work plan and the fieldwork schedule. A list of the support required during the evaluation and its source (e.g. office space, transportation, etc.) should also be added.

Evaluation Team Inception teleconference(s) with stakeholders

120. Key elements of the IP should be discussed between the evaluation team and the CO and RB during the inception phase to ensure that the IP reflects the agreement on how the evaluation will be implemented. In particular, those elements include the draft site mapping, sampling criteria for site selection, stakeholder analysis, field schedule and evaluation matrix. The Evaluation Manager should share those in a draft form together with a list of issues to be clarified with stakeholders and an overview of the missing data.

121. Once the CO and RB have had the chance to review those documents, the EM should facilitate an inception teleconference between the team leader and the core stakeholders (CO and RB). The inception teleconference is led by the Evaluation Team Leader. Its main objective is to ensure a shared understanding between the team and WFP stakeholders (OEV/RB/CO) about the expectations of the evaluation, and that practical/feasibility considerations (e.g. distances, security, etc.) have been taken into account.

122. Following the Inception Teleconference, the evaluation team will complete the draft IP.

Conducting quality assurance of the Inception Package

123. The Evaluation Manager is responsible for ensuring that an objective and rigorous quality assurance of the draft Inception Package is undertaken by the company before it is shared with stakeholders.

124. Quality assurance is entirely delegated to the company, so the company is fully responsible for the quality of the evaluation products delivered.

125. The quality assurance process may be conducted by the Evaluation Manager and/or complemented by technical experts to be mobilised independently and at no additional cost to WFP. It should ensure that the Inception Package:

Meets the OEV expectations and quality standards in terms of both content and form.

Attests that sufficient research, stakeholder consultations and analysis have been undertaken to decide on the methodology of the evaluation and to guide its conduct.


Box 2: Quality criteria for inception package

A quality inception package should:

Present an evaluation approach relying on a mix of methods allowing for triangulation of evidence from various sources.

Demonstrate that the evaluation methods have been selected for their rigour in producing evidence to address the evaluation questions.

Clearly state the limitations of the chosen approach and methods.

Incorporate gender dimensions in the context analysis and methodology design.

Include an evaluation matrix displaying, for each evaluation question, the sub-questions that will be answered, the data that will be collected to inform that question and the methods that will be used to collect the data.

Present the data collection tools to be used (e.g. interview guides, focus group topic lists). These should be aligned to the evaluation matrix.

Present a clear and realistic plan of work and timetable that takes existing constraints into account.

Include a work plan with allocation of roles and responsibilities within the team, and any deadlines for intra-team reporting and for the company's internal quality assurance.

126. The IP quality checklist (see section 5.2.1) can be used as a guide to help assess the quality of the IP, structure the comments and provide systematic and constructive feedback to the evaluation team. Box 2 above outlines the key quality criteria for the IP.

Circulating the draft Inception Package to stakeholders for comments

127. The Evaluation Manager is responsible for circulating the draft Inception Package to WFP stakeholders (OEV, RB, CO) for comments, together with an introductory note clarifying the purpose of the Inception Package and its implications for the preparation of the evaluation mission. It should be clearly underlined that the Inception Package is a working document and that revisions should be kept to fundamental issues affecting the evaluation methodology and fieldwork. To ensure that the CO and RB's comments are focussed on the most relevant elements of the Inception Package, the Evaluation Manager should also share the CO/RB OpEv guidance at the same time [Under preparation - to be shared by OEV subsequently]. Stakeholders should be given one week to provide comments.

Finalizing and submitting the Inception Package

128. Stakeholders' comments are important to correct factual errors and provide additional data and information, as required. The Evaluation Manager is responsible for consolidating all comments received on the inception package in a matrix of comments (provided in section 5.2.3). This document consolidates all stakeholders' comments, listing for each the name of the commentator as well as the related section and paragraph, and serves to guide the finalisation of the inception package.
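To illustrate the kind of record the matrix of comments holds (one row per comment, with the commentator, section and paragraph), here is a hypothetical sketch; the field names and values are assumptions, and the binding format is the template provided in section 5.2.3:

```python
# Hypothetical row of the matrix of comments (illustrative only;
# the official format is the section 5.2.3 template).
comment_row = {
    "commentator": "CO focal point",   # who made the comment
    "section": "2.1",                  # related section of the draft IP
    "paragraph": 14,                   # related paragraph
    "comment": "Clarify the sampling criteria for site selection.",
    "team_response": "Sampling criteria expanded in the revised draft.",
}
```

Recording a response against each comment supports paragraph 129: the evaluation team exercises judgment on which comments merit follow-up, and the matrix documents how each was handled.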

129. Given the independent nature of the evaluation, the evaluation team should exercise judgment as to which comments merit follow-up and revise the inception package accordingly.

130. The Evaluation Manager is responsible for ensuring that the evaluation team leader has addressed the comments made by WFP stakeholders. Any disagreement between the Evaluation Manager and the evaluation team leader has to be resolved before the submission of the final inception package to WFP. Once he/she considers that the inception package meets the OEV expectations and quality standards in terms of both content and form, the Evaluation Manager submits the inception package to the OEV focal point, using the standard submission form.

Requesting interim payment and sharing final Inception Package with stakeholders

131. Upon OEV’s acceptance that the Inception Package is “complete”, the Evaluation Manager shares the final version with all WFP Stakeholders for information and submits the second invoice corresponding to 20% of the total contract value to [email protected] with a copy to the OEV focal point and OEV Senior Administrative Assistant (Kathryn Bell-Greco). Payment will be processed in line with the terms and conditions specified in the Long-Term Agreement. The evaluation team can now move on to the Evaluation Phase.


4.3. Evaluation

132. The purpose of the evaluation phase is to collect additional data and information from interactions with local stakeholders, including beneficiaries and partners in order to build up the evidence trail and supplement the desk review findings with first-hand insights into the operation, its activities and results on the ground.

133. The evaluation phase starts with the evaluation mission preparation and is followed by the country mission, which lasts about three weeks.

134. In this phase, the main steps are as follows:

Table 4: Summary of Key steps – Evaluation Phase

Preparing the Evaluation mission

135. Sufficient time (i.e. at least three weeks) should be allowed between the finalisation of the IP and the evaluation mission for the Evaluation Manager and team to organise the country mission, and for the Country Office and its partners to make the necessary administrative and logistics arrangements. Some of the tasks below should be initiated during the early stage of the inception phase.

136. The Evaluation Manager and team leader are responsible for ensuring, before the team sets out to the country, that:

Plane tickets, visas, hotel reservations and insurances have been arranged by the company;

The Country Office has prepared a mission agenda in line with the IP requirements; is organizing meetings (including a briefing and debriefing) and site visits accordingly; is organising in-country road transportation; and is providing a list of translators, if required;


The evaluation team has finalised the data generation tools (and has a plan for testing and adjusting them, if required); and established reporting templates and agreed reporting requirements for each team member;

Communication and coordination mechanisms during the field mission, including with the Evaluation Manager, are clear.

137. Security considerations. As an ‘independent supplier’ of evaluation services to WFP, the evaluation company is responsible for ensuring the security of all persons contracted, including adequate arrangements for evacuation for medical or situational reasons. The consultants contracted by the evaluation company do not fall under the UN Department of Safety & Security (UNDSS) system for UN personnel.

138. However, to avoid any security incidents, the Evaluation Manager is requested to ensure that:

Travelling team members complete the UN system's online Basic Security in the Field (BSITF) training, as well as, for some countries, the Advanced Security in the Field (ASITF) training, in advance; print out their certificates; and take them with them. (These take a couple of hours to complete.)

The WFP CO registers the team members with the Security Officer on arrival in

country and arranges a security briefing for them to gain an understanding of the

security situation on the ground.

The team members observe applicable UN security rules and regulations – e.g.

curfews etc.

The evaluation mission

139. The mission usually starts with a meeting of the evaluation team to discuss work norms, work plan, etc. This is particularly important when team members have not worked together previously or even met each other before the start of the field work.

140. The first interaction of the team will be with the Country Office; an entry briefing allows the team to reiterate mutual expectations and to discuss the planned agenda and access to data and key informants. The date of, and expectations from, the internal and external exit debriefing sessions should be discussed and confirmed at the outset of the mission. The Country Office may also offer a presentation of the operation being evaluated as well as the findings of a self-assessment, if one has taken place before the team's arrival.

141. The details of the field mission are determined by the methodology chosen for a given evaluation and will therefore differ for each evaluation. If required, evaluators are encouraged to consult specialised handbooks describing data collection methods.

142. Interactions in the capital city are usually required to meet with key institutional partners with which WFP interacts at strategic, coordination or implementation levels. The purpose of interviews is to obtain information and feedback that will ultimately contribute to the analysis of the operation performance.

143. These partners include government bodies, UN agencies, donors, national or international NGOs, and civil society. An attempt should also be made to arrange meetings with organisations that play an important role in food security but with whom WFP does not work directly.

144. Thereafter, the mission will travel to the provinces and districts for about two weeks. In order to maximise the time spent collecting field insights and cover more ground, the mission may decide to split into smaller teams.



145. At field level, the evaluation team will consult with WFP staff at area/sub-office or field-office level, officials involved in project selection or implementation, staff from cooperating partners, and beneficiaries and local communities. The mission will also visit different project sites and consult beneficiaries both individually and in focus groups.

146. WFP staff responsible for the organisation of field visits are likely to accompany the mission, which offers a key opportunity for WFP staff to learn from the team. To the extent possible, the evaluation team should use this period to promote reflection on operational planning and implementation. If evaluation stakeholders experience the evaluation process as useful and engaging, they are more likely to own the evaluation findings and recommendations.

147. Yet, to safeguard the independence of the evaluation, WFP staff will not be part of the evaluation team or participate in meetings with external stakeholders if the evaluation team deems that their presence could bias the responses.

148. The ALNAP pilot guide on Evaluation of Humanitarian Action offers the following tips for the management of the mission and team by the team leader:

Spending the first field work day together at the start of the mission allows the team leader to get a sense of team members' strengths and weaknesses, including any potential bias.

Verify from the start the support that the office will provide to the team (e.g. office space, accommodation booking, local transportation, etc.).

The team leader should share with the team templates for documentation such as persons-met lists, bibliographies and itineraries, formatted in the way the information is required. Getting inputs in a standard format minimises the work of converting and collating them.

Pair a stronger team member with a weaker one to help minimise quality problems.

149. The Evaluation Manager should contact separately the CO focal point and the evaluation team leader to take the pulse of how the mission is proceeding and check if any major issue has arisen.

Preparing the Exit Debriefing Sessions

150. Each evaluation team must prepare a powerpoint presentation to support the exit debriefings, providing a summary of the team's initial findings based on desk work and field observations. It will share early impressions, clarify any information gaps and highlight next steps. The preparation of this presentation should not be left until the end of the mission. Particular attention should be paid to the language used, avoiding definite statements and final conclusions, as findings at this stage are preliminary.

151. There is no template for the exit debriefing powerpoint presentation but an indicative table of contents is shown in the box below. The powerpoint presentation will serve as a reference document to stakeholders but will not be reviewed, commented on or revised.


Writing of the presentation should start early, and the document should be updated and refined as the mission unfolds.

The presentation should not exceed 20 slides.

The presentation should be quality assured by the Evaluation Manager before he/she sends it to the stakeholders.

The presentation should be shared with participants at least 24 hours in advance.

Holding Exit Debriefing Sessions

152. Debriefing sessions provide an opportunity to engage stakeholders, encourage reflection and are an important step for data validation. There will usually be three distinct debriefings to present the initial evaluation findings to the CO and its partners:

Bilateral meeting with CO management. A bilateral meeting between the evaluation team and the CO management should take place before the internal debriefing to provide an opportunity to discuss the preliminary findings and get the CO’s management insights.

Internal debriefing. This debriefing will take place in the CO and will be attended by CO staff (including sub-offices if possible). In addition, the Evaluation Manager, Regional Bureau colleagues and the OEV focal point will participate via teleconference. The Evaluation Manager will facilitate the debriefing by sending the invitations, sharing the powerpoint presentation, introducing the meeting (clarifying its purpose) and concluding on the next steps. The evaluation team leader and team members will be responsible for delivering the presentation of their initial findings. The debriefing should take about an hour and a half overall, with a presentation by the team of maximum 30 minutes, in order to leave sufficient time for questions and answers. It is expected that the dialogue between the evaluation team and the CO will continue beyond this teleconference.

External debriefing with key local partners. This debriefing should be held after the internal one and take comments raised in the internal debriefing into consideration before presenting to an external audience.

153. The date of and invitation to the exit debriefing should be fixed at the outset of the evaluation mission, with the understanding that the PowerPoint presentation will follow in due course. The Evaluation Manager should send the invitation for the internal debriefing to CO, RB and OEV staff, and ensure that the CO sends the invitation for the external debriefing to external stakeholders at least two weeks in advance. When sending the meeting invitation, it is good practice for the Evaluation Manager to remind participants of the purpose of the exit debriefing and to set expectations accordingly.

154. To enhance learning, the debriefing may also take the form of a workshop led by the evaluation team leader and following the PowerPoint presentation. Related costs should be covered by the CO.

When possible, the evaluation team may consider organizing a meeting with CO staff mid-way through the mission to share early impressions, seek clarifications and if needed adjust the data collection methods or the list of stakeholders to be interviewed.

The evaluation team may hold a one-day meeting at the end of the field work to consolidate their findings and decide on the next steps.


4.4. Reporting

155. The purpose of the reporting phase is to finalise the analysis of the data gathered since inception and bring it together in a concise analytical report, which meets the users’ needs and OEV quality standards.

156. The main output of the reporting phase is the Evaluation Report (ER), which presents the main findings (evidence-based), conclusions and recommendations of the evaluation in an accessible manner. The ER matrix of comments is another output of this phase.

157. The reporting phase lasts two to three months, depending on the complexity of the operation and the quality of the draft evaluation report. In this phase, the main steps are as follows:

Table 6: Summary of Key steps – Reporting Phase

Drafting the evaluation report

158. The evaluation team is responsible for preparing the draft evaluation report in line with the corresponding content guide (provided in section 5.1.2) and template (provided in section 5.3.2), and for submitting it to the Evaluation Manager as per the agreed timeline. It is recommended that the team hold meetings after field work and before completion of the first draft of the report to discuss as a group the evidence, findings, conclusions and recommendations.


159. During this period, it is important that the evaluation team maintain dialogue with the CO and RB to enhance mutual understanding of related issues. While specific requests for clarification can be handled via email, the evaluation team should hold a teleconference with the CO and RB to share and validate the main areas of recommendations. The Evaluation Manager may decide when best to hold this call, whether before or after the draft report is quality assured.

160. Box 3 below presents OEV's expectations of evaluation reports and outlines key quality criteria. The evaluation team should take these into consideration when drafting the evaluation report.

Box 3 – Quality criteria for evaluation reports

A quality evaluation report should:

Represent a thoughtful, well-researched and well-organised effort to objectively evaluate what worked in the operation, what did not, and why.

Document the methods used in the evaluation to allow readers to judge its reliability.

Address all evaluation questions included in the TOR.

Convey the results of the evaluation in a way that corresponds to the information needs of the intended users (i.e. not dwell on issues of limited value to the intended users).

Present findings supported by credible evidence (rather than opinion) in response to the evaluation questions, and be transparent about facts borne out by evidence versus assumptions or plausible associations drawn from evidence.

Present output and outcome data that have been cross-checked and validated by the evaluation team using the OpEv sample models (available here).

Take different stakeholder groups into consideration in the findings, including gender representation, and examine the findings with the appropriate level of gender analysis.

Be balanced and ensure that shortcomings are presented in a constructive manner to enhance stakeholders' acceptance and the overall utility of the report.

Be coherent and free from internal contradictions.

Draw conclusions flowing logically from the analysis of data and the presentation of evidence-based findings.

Propose a limited number of concrete, realistic (implementable), targeted and prioritised recommendations derived from the conclusions.

Include recommendation(s) on strengthening gender responsiveness and/or address gender dimensions within recommendations (as appropriate).

Contain a concise executive summary setting out the key points, drafted in such a way that it can be understood without having to refer to the rest of the report.

Be well structured and written in a clear, concise and accessible manner; avoid jargon and difficult technical terms; and use tables and diagrams for the presentation of key data.

Meet layout and formatting requirements.

Page 42: EQAS for operations evaluations

40

Note

As the evaluation report will be circulated for comment only once, it should be:

Complete, including the executive summary and annexes.

Edited as evaluation users may have little patience with difficult and time-consuming language.

Fully formatted, including numbering for pages, paragraphs, figures and tables, etc.

Quality assurance of the draft evaluation report

161. The Evaluation Manager is responsible for ensuring that an objective and rigorous quality assurance of the draft evaluation report is undertaken by the company before the report is circulated to stakeholders for comments.

162. This is particularly important given that in the outsourced model for Operation Evaluations, quality assurance is entirely delegated to the company. The company is thus fully responsible for the quality of the evaluation products delivered.

163. The quality assurance process may be conducted by the Evaluation Manager and/or complemented by technical experts, if required. These should be mobilised independently and at no additional cost to WFP. The quality assurance process should ensure that the evaluation report meets OEV's expectations and quality standards in terms of both content and form, i.e. that it meets the requirements and standards set out in the content guide and follows the template, providing high-quality information in each section.

164. The ER quality checklist (see section 5.2.2) can be used as a guide to help assess the quality of the report, structure the comments and provide systematic and constructive feedback to the evaluation team. The quality assurance process should also be informed by Box 3 above and by the criteria against which the evaluation report will be assessed and rated as part of the post-hoc quality assessment of evaluation reports (see Table 5 below).

165. If shortcomings exist, the Evaluation Manager should request the team to improve the draft until it meets the OEV quality standards.

Circulating the draft report to stakeholders

166. The Evaluation Manager is responsible for circulating the evaluation report to stakeholders for comments (including the OEV focal point). This should only occur once the company certifies that the draft report fully meets the OEV quality standards and users’ needs.

167. To ease the review process, it is recommended to:

Establish in advance a circulation list in collaboration with the evaluation team, the CO and RB focal points and discuss with the Country Office the modalities and timing to elicit external stakeholders’ comments.

Allow sufficient time (two weeks) for stakeholders to read and provide written comments on the draft report. Sending a reminder a few days before the deadline may be necessary.

Let people know that their comments will be consolidated in a matrix of comments, responded to and circulated back to all reviewers so as to encourage considered feedback.

Provide guidance to the reviewers so that their comments focus on the accuracy of information presented.

Circulate the draft report as a PDF file rather than a Word file, and ask people to use the ER comments matrix to record their comments. This avoids people embedding comments as tracked changes, which can alter layout and pagination and create confusion, and it eases the collation of comments.


Preparing the ER comments matrix

168. The Evaluation Manager is responsible for preparing the evaluation report comments matrix (provided in section 5.2.3). This document consolidates all stakeholders' comments, listing for each comment the name of the commentator as well as the related section and paragraph of the report, and serves to guide the finalisation of the report.

169. To ease the revision process, it is recommended for the Evaluation Manager to:

Go back to the commentator for clarification if comments are ambiguous.

Group all comments on a particular section so that commonalities or dissonances are apparent. (Using the Excel template allows comments to be filtered accordingly.)

Discuss the main comments with the evaluation team to deepen mutual understanding of the issues raised.
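The grouping-and-filtering step recommended above can be sketched in code. The column names and sample rows below are hypothetical; they only mirror the fields the guidance says the matrix records (commentator, report section, paragraph, and the comment itself).

```python
import csv
import io
from collections import defaultdict

# Hypothetical excerpt of an ER comments matrix; in practice this lives in
# the Excel template provided on the Operation Evaluations Extranet.
SAMPLE_MATRIX = """commentator,section,paragraph,comment
CO focal point,Findings,45,Please double-check the beneficiary figures
RB adviser,Findings,47,The source of the outcome data is unclear
CO focal point,Recommendations,60,The proposed deadline seems unrealistic
"""

def group_by_section(rows):
    """Group comment rows by report section so that commonalities or
    dissonances among commentators become apparent."""
    grouped = defaultdict(list)
    for row in rows:
        grouped[row["section"]].append(row)
    return grouped

rows = list(csv.DictReader(io.StringIO(SAMPLE_MATRIX)))
comments_by_section = group_by_section(rows)
```

Filtering on the section column in Excel achieves the same effect; the sketch merely makes the grouping explicit.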

Finalising the Evaluation Report

170. Stakeholders’ comments are important to correct factual errors, avoid misrepresentation and provide additional information, as required. Given the independent nature of the evaluation, the evaluation team should exercise judgment as to which comments merit follow-up.

As stakeholders are keen to ensure that they have not been misinterpreted, comments on Operation Evaluations reports tend to be numerous.

The evaluation team should be provided sufficient time to work through the comments systematically and finalise the evaluation report accordingly.

171. The team leader is also responsible for completing the ER comments matrix (evaluation team response column), specifying how comments were addressed in the revised draft. The team responses should indicate the specific revisions (including references to corresponding paragraphs) made to the evaluation report in response to the stakeholders' written comments.

172. The Evaluation Manager should ensure that the stakeholders' comments were given due consideration by comparing the first and revised drafts and reviewing the team's responses to each comment. Particular attention should be paid to comments rejected by the team, to ensure that a solid rationale has been provided for retaining the initial findings or formulation.

173. If the revised evaluation report does not sufficiently address relevant stakeholders’ concerns, the Evaluation Manager should ask the evaluators to improve the report until it does and can be considered final, as certified by the company’s Director.

When circulating the report, it is recommended to advise the stakeholders on what to keep in mind while reviewing the report:

Is the data accurate? If not, please provide additional materials for consideration.

Are the findings consistent with the data presented?

Do the conclusions logically follow the findings?

Are the recommendations feasible, and do they flow logically from the findings and conclusions?

Are there significant gaps in the data presented?

Is the data presented in a clear, unbiased and objective manner?


Note

Upon contracting, the evaluation company consents to producing outputs that comply with OEV quality standards. Hence the number of contractual days agreed upfront for producing the report will not be increased if additional days are needed to attain the expected quality.

Submitting the final report for approval

174. Once the evaluation report has been finalised, the Evaluation Manager should submit it for approval to the OEV focal point alongside the ER comments matrix.

175. The report submission should include a cover note from the company’s director certifying that the final report meets OEV’s quality standards and the needs of the intended users (template provided in section 5.4.3). Upon OEV’s approval, the final report will be shared by the OEV focal point with the main users (CO and RB).

Requesting final payment

176. Upon OEV’s confirmation that the evaluation report has been approved, the Evaluation Manager will send the third and final invoice corresponding to 60% of the total contract value to [email protected] with a copy to the OEV focal point and OEV Senior Administrative Assistant (Kathryn Bell-Greco). Payment will be processed in line with the terms and conditions specified in the Long-Term Agreement.

Post-hoc quality assessment of the evaluation report

177. The OEV focal point is responsible for ensuring that the quality of the final evaluation report is externally assessed. This post-hoc quality assessment will be applied to all operation evaluation reports with a view to objectively reviewing and rating them against international evaluation standards.

178. A set of quality criteria by which the evaluation reports will be assessed is listed in Table 5 below.17 Each criterion is broken down into a set of sub-criteria and weighted. Weights reflect the relative importance of each criterion on a scale from 1 to 10 (1 being the least important and 10 the most important). These weightings are intentionally biased towards utilisation (i.e. the criteria with the most utility, such as recommendations, are weighted higher). The overall rating of each evaluation will be presented as a percentage.
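The weighting arithmetic described above can be illustrated with a short sketch. The criterion labels and weights follow Table 5; the function name and the assumption that sub-criteria are first combined into a single 0-1 score per criterion are illustrative, not part of the EQAS guidance.

```python
# Illustrative sketch of the post-hoc quality rating computation.
# Weights are those shown in Table 5; the per-criterion 0-1 scores and the
# simple weighted average are assumptions for the sake of the example.

CRITERIA_WEIGHTS = {
    "General information": 1,
    "Accessibility / clarity": 3,
    "Executive summary": 5,
    "Evaluation context": 3,
    "Evaluation purpose and scope": 3,
    "Evaluation methodology": 5,
    "Findings and conclusion": 10,
    "Analysis": 10,
    "Recommendations": 10,
}

def overall_rating(scores):
    """Weighted average of per-criterion scores (each on a 0-1 scale),
    expressed as a percentage."""
    total_weight = sum(CRITERIA_WEIGHTS.values())
    weighted_sum = sum(CRITERIA_WEIGHTS[name] * score
                       for name, score in scores.items())
    return 100 * weighted_sum / total_weight
```

Because findings, analysis and recommendations each carry a weight of 10, a weak recommendations section pulls the overall percentage down far more than, say, incomplete general information.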

179. The post-hoc quality assessment report will be shared with the company contracted for the evaluation and discussed with the Evaluation Manager during the mutual feedback session. The Evaluation Manager will in turn share the post-hoc quality assessment report with the signatory of the evaluation report submission form, as well as with the evaluation team members in order to contribute to further quality enhancements of future evaluations. OEV will also share the results from the post-hoc quality assessments with COs and RBs as well as publish them on the OpEv extranet and WFP Go.

17 The evaluation criteria have been defined based on the methodology developed by the 2013 OECD/DAC Peer review of the WFP evaluation function and were subsequently adapted. These were drawn from the quality criteria most often cited from various agencies and networks including: ALNAP: Quality Proforma; UNFPA: Evaluation Quality Assessment Grid; UNIFEM: Quality Criteria for Evaluation Reports; OECD DAC Guidelines: Quality Standards for Development Evaluation; UNICEF: Adapted UNEG Evaluation Report Standards; WFP: Evaluation Quality Assurance System Guidance; and UNEG: Quality Checklist for Evaluation Reports.


Table 5 – Quality criteria and weighting

RATING

The following rating scale is used for the quality assessment of evaluation reports:

I. GENERAL INFORMATION – Weighting: 1

The following information is easily accessible in the first few pages of the report:
1. Heading, including name of organization
2. Evaluation title
3. Thematic area
4. Geographic area (region, country, or global)
5. Type of evaluation (Operation Evaluation; final/midterm)
6. Period being evaluated and list of operations covered, with budget
7. Name(s), organization(s) and contact information of evaluator(s)
8. Name and address of the organization commissioning the evaluation
9. Date of evaluation submission
10. Table of contents listing tables, graphs, figures and annexes
11. List of acronyms

II. ACCESSIBILITY / CLARITY – Weighting: 3

1. The report is logically structured with clarity and coherence (e.g. background and objectives are presented before findings, and findings are presented before conclusions and recommendations).

Presentation
2. The table of contents includes a logical outline of the sections presented.
3. Material referenced in the table of contents is easy to locate and page numbers are accurate.
4. Visual aids, such as maps and graphs, convey key information.
5. The language is adequate in quality and tone for an official document.
6. The report is free from grammar, spelling or punctuation errors.

Clarity of reporting
7. Clear, precise and professional language is used.
8. The report reflects correct use of terminology.
9. Correct grammar is used.
10. The report is highly reader-friendly.
11. Useful graphs and tables are included.


III. EXECUTIVE SUMMARY – Weighting: 5

1. The executive summary is coherent and concise.

The executive summary includes:
2. An overview of the evaluation, including a brief description, context and present situation
3. Evaluation purpose
4. Objectives
5. Intended audience
6. Evaluation methodology (including rationale, sources, data collection and analysis methods used, and major limitations)
7. The most important findings
8. Main conclusions and recommendations
9. Clear linkages between the recommendations and findings

IV. EVALUATION CONTEXT – Weighting: 3

1. The evaluation provides an overview of the country in which the operations are taking place, including the gender context.

2. Where relevant, the evaluation includes data on poverty and food security to provide an understanding of the hunger situation.

3. Other subject/sector relevant information is included which could have enhanced or inhibited WFP’s work.

V. EVALUATION PURPOSE AND SCOPE – Weighting: 3

The purpose of the evaluation includes the following:
1. Why the evaluation was needed at that point in time
2. Who needed the information
3. What information is needed
4. How the information will be used
5. Evaluation objectives are clearly stated
6. The balance between accountability and learning is clearly described
7. Evaluation objectives and the balance between accountability and learning (as in the executive summary above)
8. The report describes and explains the chosen evaluation criteria, performance standards or other criteria used by the evaluators (such as OECD/DAC)
9. The report includes the terms of reference
10. The report describes and justifies what the evaluation does and does not cover

VI. EVALUATION METHODOLOGY – Weighting: 5

1. The report lists the evaluation questions.
2. The report describes the data collection methods.
3. The rationale for selecting particular data collection and analysis methods is explained.
4. The report describes the sampling frame (area and population represented, rationale for selection, numbers selected out of potential subjects, and limitations of the sample).
5. The methodology presented allows for triangulation.
6. The methods employed are appropriate for the evaluation to answer its questions.
7. The limitations of the methodology are explained, including how they were addressed and their potential impact on evaluation findings.
8. The report describes the evaluation's approach to addressing gender issues.
9. The evaluation design includes ethical safeguards (i.e. protection of confidentiality, rights and welfare of human subjects, and respect for the values of the beneficiary community).

VII. FINDINGS AND CONCLUSION – Weighting: 10

1. The findings have been formulated clearly and are based on the evidence collected.
2. The findings take different stakeholder groups into consideration, including gender representation, and are examined with the appropriate level of gender analysis.
3. The findings are triangulated.
4. The findings address any limitations or gaps in the evidence and discuss any impact on responding to the evaluation questions.
5. The findings address the evaluation criteria chosen.
6. The conclusions provide answers to the evaluation questions.

VIII. ANALYSIS – Weighting: 10

1. Reasons for the outcomes of the subject being evaluated, especially enabling and constraining factors, are identified to the extent possible.
2. The analysis responds to all evaluation questions.
3. The analysis examines cause-and-effect links between the intervention and its outcomes.
4. External and contextual factors are identified, including the social, political or environmental situation.
5. The analysis includes unintended impacts or consequences of activities.
6. The report assesses whether the design of the object was based on a sound gender analysis.

IX. RECOMMENDATIONS – Weighting: 10

1. The recommendations are relevant to the object and purpose of the evaluation.
2. The recommendations are based on the analysis/conclusions and substantiated by the evidence collected.
3. The report includes recommendation(s) on strengthening gender responsiveness and/or addresses gender dimensions within recommendations (as appropriate).
4. The recommendations are specific, realistic and actionable.
5. The recommendations are clustered and prioritized.
6. The target group for each recommendation is clearly identified.
7. The recommendations reflect an understanding of the commissioning organization and potential constraints to follow-up.


4.5. Dissemination and Follow-up

180. The purpose of this phase is to ensure that the evaluation report is made available to the concerned audience, with a view to enhancing its utility, and that concerned stakeholders take stock of the evaluation process to improve the Operation Evaluation model and future Operation Evaluations.

181. In this phase, the main steps are as follows:

Table 7: Summary of Key steps – Dissemination and Follow-up Phase

Management Response

182. The Evaluation Policy specifies that a management response to each evaluation will be prepared. A management response matrix specifies the actions that the CO/RB management have agreed to take to address each evaluation recommendation along with the corresponding implementation deadlines.

183. The Management Response will be prepared within two weeks of the finalisation of the Evaluation Report. Its preparation will be completed largely by the Country Office management and overseen by the Regional Bureau management.

184. In order to enhance accountability and learning, the evaluation recommendations and corresponding management responses are recorded by OEV in WFP's corporate management response database (the ACE database). Given the RB's programme support and oversight role towards the CO, the RB is expected to track the implementation of the follow-up actions to the evaluation recommendations. This will allow the RB to analyse the nature and trends of recommendations on operations in its region and use this analysis to inform programme support.

PROCESS STEPS - DISSEMINATION & FOLLOW-UP PHASE

Actors (table columns): Evaluation Manager; Evaluation Team; Country Office; Regional Bureau; other stakeholders; OEV.

1. Request the CO for a Management Response (MR)
2. Prepare the MR (estimated duration: 2 weeks)
3. Review and endorse the MR
4. Enter the MR in the ACE database
5. Publish the final ER, MR and post-hoc quality assessment overall rating on the WFP website
6. Post the final ER on the UNEG and ALNAP databases
7. Disseminate the evaluation findings at CO/RB levels
8. Conduct an evaluation feedback e-survey
9. Hold a mutual feedback session
10. Prepare an annual synthesis of operation evaluations
11. Track the implementation of the recommendations

(In the original table, each step is assigned by tick mark to one or more of the actors above; a sidebar also shows the five-phase sequence: Preparation, Inception, Evaluation, Reporting, Dissemination & Follow-up.)


Report publication

185. The WFP Evaluation Policy specifies that evaluation reports are public documents. The OEV focal point will post the evaluation report, the management response and the overall results from the post-hoc quality assessment on the WFP evaluation website. Where relevant, the report may also be published on the websites of other divisions/units and of the country office, if it has one.

Dissemination of findings

186. In line with the evaluation policy and to enhance the utility of the evaluation, the evaluation findings will be actively disseminated. The OEV focal point will:

Send the evaluation report (and/or the executive summary, as appropriate) to the relevant internal stakeholders including directors and chiefs of units, programme advisors, regional bureau staff and OEV colleagues.

Draw on the stakeholder analysis to determine whom to disseminate to and involve, identifying the users of the evaluation, duty bearers, implementers and beneficiaries, and taking gender perspectives into account.

Integrate any lessons into OEV's "top ten lessons" papers, as applicable.

Use creative dissemination methods to further disseminate the evaluation and stimulate discussion, as applicable.

Make the evaluation report more accessible to the broader evaluation and humanitarian community by uploading it onto the UNEG and ALNAP evaluation reports databases.

187. The CO and RB will share the ER executive summary with in-country and regional external stakeholders (including from Government, partner organisations, etc.) and disseminate key lessons from the operation evaluation to local stakeholders, ideally down to the beneficiary level, if relevant.

Feedback survey

188. Each evaluation concludes with an online feedback survey sent out by the OEV focal point. The survey seeks to document systematically what went well or could have been improved, and to gather feedback on the engagement and performance of the various stakeholders as well as on the quality and utility of the evaluation as perceived by its intended users.

189. Results of the survey will be kept internal to OEV. The results will help OEV to improve the quality of the Operation Evaluations as well as inform necessary adjustments to the new Operation Evaluations approach and related processes and tools.

190. The survey will take about 20-30 minutes to complete and includes both quantitative ratings of different aspects of the evaluation process and options to provide qualitative responses to open questions. The survey will be undertaken with external support and will be completed by the evaluation manager and team members, CO, RB and OEV focal points.

191. The data will be compiled into an individual OpEv report, which will be kept strictly internal to OEV and confidential, but will inform a mutual feedback session with the Evaluation Manager at the end of the evaluation process. The individual survey reports will also feed into a summary report across all OpEvs commissioned in the last 12 months. No reference to specific individuals or evaluation firms will be made in the summary report.


Mutual feedback session

192. As part of the evaluation close-out, the survey will also inform a mutual feedback session between the OEV focal point and the Evaluation Manager to discuss what went well or could have been improved, the performance of the Evaluation Manager and the team members, and related lessons learned.

Annual synthesis of Operation Evaluations

193. The evaluation findings will feed into an annual synthesis of Operation Evaluations, prepared by OEV, reporting on key findings and common themes emerging across the series of Operation Evaluations conducted that year and on the quality of the Operation Evaluations as externally assessed through the post-hoc quality assessment mechanism.

194. This annual synthesis report will be presented to the WFP Executive Board by the OEV Director. A specific management response to the synthesis report will be prepared under the coordination of the Performance Management and Monitoring Division, and the implementation of recommendations will be tracked over time by the same division.

4.6. Summary of process steps by evaluation phase

Table 8: Summary of Key steps – All evaluation phases


PROCESS STEPS - EVALUATION PHASE

Actors (table columns): Evaluation Manager; Evaluation Team; Country Office; Regional Bureau; other stakeholders; OEV.

1. Prepare the evaluation mission
2. Hold an evaluation team meeting
3. Hold a briefing with the CO
4. Conduct field work
5. Maintain regular contact with the CO and ET
6. Prepare the exit debriefing presentation
7. Conduct QA of the exit debriefing presentation
8. Hold a bilateral meeting with CO management
9. Hold the internal exit debriefing
10. Hold the external exit debriefing

PROCESS STEPS - REPORTING PHASE

1. Draft the evaluation report (ER)
2. Hold a teleconference to discuss areas of recommendations
3. Revise the draft ER
4. Conduct QA of the draft ER
5. If necessary, repeat steps 3 and 4 until satisfactory quality is reached
6. If ER quality is satisfactory, send the draft ER to the CO, RB and OEV for comments
7. Comment on the draft ER (estimated duration: 2 weeks)
8. Prepare the comments matrix, then share and discuss it with the team
9. Revise the ER and complete the comments matrix
10. Check that the revised ER adequately addresses stakeholders' comments
11. If necessary, repeat steps 3 and 4 until satisfactory quality is reached
12. Submit the final ER and comments matrix for approval
13. Approve the final ER (estimated duration: 1 week)
14. Submit the invoice for the third and final payment
15. Submit the final ER for post-hoc quality assessment
16. Share the post-hoc quality assessment report with the EM
17. Review the post-hoc quality assessment report with team members

PROCESS STEPS - DISSEMINATION & FOLLOW-UP PHASE

1. Request the CO for a Management Response (MR)
2. Prepare the MR (estimated duration: 2 weeks)
3. Review and endorse the MR
4. Enter the MR in the ACE database
5. Publish the final ER, MR and post-hoc quality assessment overall rating on the WFP website
6. Post the final ER on the UNEG and ALNAP databases
7. Disseminate the evaluation findings at CO/RB levels
8. Conduct an evaluation feedback e-survey
9. Hold a mutual feedback session
10. Prepare an annual synthesis of operation evaluations
11. Track the implementation of the recommendations

(In the original table, each step is assigned by tick mark to one or more of the actors above; further estimated durations of 6 weeks, 3 weeks and 2 weeks appear but cannot be reliably matched to individual steps.)


5. CONTENT GUIDES AND TEMPLATES

195. The purpose of the content guides and templates is to assist evaluation teams in preparing and drafting the evaluation outputs with a clear sense of what is expected. These documents also support the company's quality assurance of the evaluation outputs by providing a transparent reference for what is expected.

196. The content guides provide section by section guidance for the inception package and evaluation reports and define the structure that these reports should follow. Sections in the IP were designed to ensure that the evaluation method is well grounded and that the operational plan for the evaluation is appropriate. Sections in the ER were designed to ensure that the evaluation responds to the questions it set out to answer and draws clear conclusions.

197. The quality checklists should be used by the Evaluation Manager as a guide for assessing and ensuring the quality of all evaluation products ahead of their submission to WFP. They can also be used as a tool to structure the comments and provide systematic and constructive feedback to the evaluation team.

198. The report templates seek to ensure that all key sections are addressed in line with the content guide. They also aim to enhance the consistency in the operation evaluations commissioned by OEV. Evaluation teams are expected to follow the templates and provide information for each of the foreseen sections.

199. The process templates include templates for the evaluation proposal; the evaluation report comments matrix; as well as templates for the official submission to OEV of the evaluation products. The templates are reproduced here for the sake of completeness of information. They are also provided in electronic version on the Operation Evaluations Extranet.


5.1. Content guides

5.1.1. Content guide for the Inception Package

The Content guide for the Inception Package is available at this link.

5.1.2. Content guide for the Evaluation Report

The Content guide for the Evaluation Report is available at this link.

5.2. Quality checklists

5.2.1. Inception Package Quality checklist

The Inception Package Quality checklist is available at this link.

5.2.2. Evaluation Report Quality checklist

The Evaluation Report Quality checklist is available at this link.


5.3. Evaluation Products Templates

5.3.1. Inception Package Template


5.3.2. Evaluation Report Template


Acknowledgements

[include acknowledgement here]

Disclaimer

The opinions expressed are those of the Evaluation Team, and do not necessarily reflect those of the World Food Programme. Responsibility for the opinions expressed in this report rests solely with the authors. Publication of this document does not imply endorsement by WFP of the opinions expressed.

The designations employed and the presentation of material in the maps do not imply the expression of any opinion whatsoever on the part of WFP concerning the legal or constitutional status of any country, territory or sea area, or concerning the delimitation of frontiers.

Evaluation Commissioning

Evaluation Manager: [name, evaluation firm]

Evaluation focal point: [name], WFP

Operations Evaluations Project Manager: [name], WFP


5.3.3. Evaluation Report Comments Matrix Template

Matrix of comments OpEv. ETHIOPIA PRRO - Terms of Reference

Reviewers: John Smith (ETH CO HoP); Janine Bianca (OMN Prog Advisor)

# | Reviewer | Para | Section | Stakeholder comment | OEV focal point's comments and amendments

1 | JS | 5 | 2.1 | Added: "The current PRRO is due to end in June 2015, in tandem with the current phase of the Productive Safety Net Programme (PSNP); therefore, the Country Office will be re-designing a new PRRO phase in 2014, encompassing a new PSNP, which will in turn impact humanitarian relief assistance since there are initial discussions to include all chronically food insecure caseloads (i.e. those that have received relief for the previous three or five years) in a new PSNP. These developments will then impact the parameters of a new PRRO post-June 2015." | Accepted

2 | JB | 7 | 2.3 | Added: "follow-on projects in Ethiopia as well as" | Accepted

3 | JS | | | Deleted: "see table 1 for preliminary description of each stakeholder group" | Accepted

4 | JS | 8 | 2.3 | Added: "the Government (and" | Accepted

5 | JS | 11 | 3 | Deleted: "builds on its predecessor (PRRO 106650) and." Comment: "I don't see the value of mentioning the old PRRO that ended Dec 2011 and has already been evaluated; docs are available for review." | Rejected. The PRRO builds on a previous operation, i.e. was not designed from scratch. In fact, there was an evaluation of this operation and the evaluation team will need to review the extent to which the recommendations were taken into consideration in the design of the follow-up PRRO.


5.4. Evaluation process templates

5.4.1. Evaluation Proposal Template


5.4.2. Inception Package Submission Template


5.4.3. Evaluation Report Submission Template


Acronyms

ALNAP  Active Learning Network for Accountability and Performance
CD  Country Director
CO  Country Office
DCD  Deputy Country Director
DRD  Deputy Regional Director
DSA  Daily Subsistence Allowance
EB  Executive Board
EM  Evaluation Manager
EQAS  Evaluation Quality Assurance System
ER  Evaluation Report
HQ  Headquarters
HR  Human Resources
IP  Inception Package
LTA  Long-Term Agreement
NGO  Non-Governmental Organization
OE  Office of Evaluation
OECD/DAC  Organisation for Economic Co-operation and Development, Development Assistance Committee
RB  Regional Bureau
RD  Regional Director
RMP  Division for Performance Management
RPA  Regional Programme Advisor
SER  Summary Evaluation Report
TOR  Terms of Reference
UNDSS  United Nations Department of Safety and Security
UNEG  United Nations Evaluation Group
WFP  World Food Programme


Office of Evaluation

www.wfp.org/evaluation

operation.evaluation.wfp.org

Rome, October 2015