Workshop: Monitoring, evaluation and impact assessment

Transcript of Workshop: Monitoring, evaluation and impact assessment

Page 1: Workshop: Monitoring, evaluation and impact assessment

Monitoring, evaluation and impact assessment

C. Crissman

Science Week Workshop

22 July 2011, Penang

Photo: A. Gordon

Page 2: Workshop: Monitoring, evaluation and impact assessment

Workshop objectives

• Introduce Monitoring and Evaluation in Results Based Management
• Introduce the identification of outcomes in projects

Page 3: Workshop: Monitoring, evaluation and impact assessment

What is Results Based Management?

The goal of RBM is “…a management strategy aimed at achieving important changes in the way organizations operate, with improving performance in terms of results…”

Its purpose is “…to improve efficiency and effectiveness through organizational learning, and secondly to fulfill accountability obligations through performance reporting.”

The key success factors in RBM are “the involvement of stakeholders throughout the management lifecycle in:
• defining realistic expected results,
• assessing risk,
• monitoring progress,
• reporting on performance and
• integrating lessons learned into management decisions.”

Source: Meyer, 2003

Page 4: Workshop: Monitoring, evaluation and impact assessment

RBM rests on four pillars

1. Planning: defining clear and measurable results and indicators, based on a logic model or framework.

2. Monitoring: measuring and describing progress towards results, and the resources consumed, using appropriate indicators.

3. Reporting, internally and externally, on progress towards results.

4. Managing: using results information (and evaluation) for lesson-learning and management decision making.

Source: Meyer, 2003

Page 5: Workshop: Monitoring, evaluation and impact assessment

The CG Consortium Strategy and Results Framework

“…is deliberately results driven, meaning that the drivers for planning will be real-world impacts—the outcomes that make a difference to global development goals…”

“…is a results-oriented research system—in contrast with, for instance, a results-oriented development program…”

Page 6: Workshop: Monitoring, evaluation and impact assessment

CG use of RBM tools in Planning, Monitoring and Evaluation

– Strategic Plan

– Medium-term Plan
  • Review of MTP before implementation
  • Performance measurement after implementation

– Performance Measurement System: research indicators
  • Outputs (output targets, publications)
  • Outcomes
  • Impact

– Center Commissioned External Review (a program-level evaluation)

– EPMR (a center-level evaluation)

Page 7: Workshop: Monitoring, evaluation and impact assessment

The results chain

Source: ADB, 2006

Page 8: Workshop: Monitoring, evaluation and impact assessment

WorldFish Strategic Plan: Research in Development

Our Strategic Results:

• Reduce poverty and vulnerability through fisheries and aquaculture.

• Increase food and nutrition security through fisheries and aquaculture.

Page 9: Workshop: Monitoring, evaluation and impact assessment

Multiple dimensions of poverty

• Vulnerability: people’s exposure to risks; sensitivity of livelihoods to risks; capacity to use assets and capabilities to cope and adapt.

• Income and asset poverty: little access to means to make a decent standard of living.

• Marginalization: certain groups are systematically disadvantaged due to discrimination based on gender, ethnicity, race, religion, caste, age, HIV status, migrant status.

A complex problem…

Page 10: Workshop: Monitoring, evaluation and impact assessment

Interventions to reduce poverty

• Vulnerability: improve access to health services, secure land rights, aquatic property rights.

• Income and asset poverty: diversification, microfinance, education & skills.

• Marginalization: organisational development, labour rights, migrants’ rights, gender equity.

…requiring complex solutions

Page 11: Workshop: Monitoring, evaluation and impact assessment

Selecting paradigms for implementation

AR4D (Agricultural Research for Development)

Page 12: Workshop: Monitoring, evaluation and impact assessment

Strategic Plan – Implementation strategy

Page 13: Workshop: Monitoring, evaluation and impact assessment

Outputs
• change in knowledge
• change in capacity
• change in technology
• change in materials
• change in policy options
• change in awareness/understanding

Research Outcomes
• recognition/appreciation of research knowledge
• use of knowledge by partners
• mobilisation of new capacity
• extension of technology/materials
• change in policy environment

Development Outcomes
• change in actions/behaviour of stakeholders
• change in productivity
• change in equity/empowerment
• change in market conditions
• change in investments
• change in security of assets/habitats

Impacts
• change in problem
• change in opportunities

Page 14: Workshop: Monitoring, evaluation and impact assessment

Ex-post Impact Assessment: measuring ‘the change in the problem’

• Reduce poverty and vulnerability through fisheries and aquaculture.

• Increase food and nutrition security through fisheries and aquaculture.

Page 15: Workshop: Monitoring, evaluation and impact assessment

Did we make it happen?

Page 16: Workshop: Monitoring, evaluation and impact assessment

Simple: Following a Recipe
• The recipe is essential
• Recipes are tested to assure replicability of later efforts
• No particular expertise required; knowing how to cook increases success
• Recipes produce standard products
• Certainty of same results every time

Complicated: A Rocket to the Moon
• Formulae are critical and necessary
• Sending one rocket increases assurance that the next will be ok
• High level of expertise in many specialized fields + coordination
• Rockets similar in critical ways
• High degree of certainty of outcome

Complex: Raising a Child
• Formulae have only a limited application
• Raising one child gives no assurance of success with the next
• Expertise can help but is not sufficient; relationships are key
• Every child is unique
• Uncertainty of outcome remains

(Diagram from Zimmerman 2003)

Page 17: Workshop: Monitoring, evaluation and impact assessment

• SIMPLE: What works?
• COMPLICATED: What works in what contexts? (implementation environments and participant characteristics)
• COMPLEX: What works here and now?

What do we mean by ‘works’? What should be monitored?

Page 18: Workshop: Monitoring, evaluation and impact assessment

Types of interventions

Simple intervention
• Single causal strand
• Intervention is sufficient to produce the impacts
• Universal mechanism
• Linear causality, proportional impact
• Pre-identified outcomes

Complicated or complex intervention
• Multiple simultaneous causal strands required to produce the impacts
• Intervention is necessary to produce the impacts
• Different causal mechanisms operating in different contexts
• Recursive, with feedback loops, leading to disproportionate impact at critical levels
• Emergent outcomes

Page 19: Workshop: Monitoring, evaluation and impact assessment

Challenges for impact assessment

Deciding impacts
• Simple: likely to be agreed
• Complicated: likely to differ, reflecting different agendas
• Complex: may be emergent

Describing impacts
• Simple: more likely to have standardised measures developed
• Complicated: evidence needed about multiple components
• Complex: harder to plan for, given emergence

Analysing cause
• Simple: likely to be a clear counter-factual
• Complicated: causal packages and non-linearity
• Complex: unique, highly contingent causality

Reporting
• Simple: clear messages
• Complicated: complicated message
• Complex: uptake requires further adaptation

Page 20: Workshop: Monitoring, evaluation and impact assessment

M&E system interaction with project implementation

Monitoring captures what happened

Evaluation explains why

Page 21: Workshop: Monitoring, evaluation and impact assessment

M&E Implementation strategy

Page 22: Workshop: Monitoring, evaluation and impact assessment

Evaluations and the R4D Results Chain

[Diagram: evaluations mapped onto the R4D results chain. The horizontal axis is time, running Input → Output → Outcome → Impact, from objectives toward goals. The vertical axis shows the scale of the work (pilot/small up to global) and the unit of impact analysis (project, system, program). Outcome evaluation studies measure the effect size; outcome evaluations measure the scale of output adoption/uptake; program M&E, impact pathway analysis and adoption constraints analysis operate along the chain. Ex post impact assessment is a function of (effect size × scale).]

Source: taken from a presentation by M. Maredia
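
The closing formula invites a small worked example. Here is a minimal sketch in Python, with purely hypothetical names and numbers (effect_size_per_adopter and adopting_households are illustrative assumptions, not figures from the workshop):

    # Ex post impact assessment as a function of (effect size * scale).
    # Both factors are hypothetical; in practice the per-adopter effect comes
    # from outcome evaluation studies and the scale from adoption surveys.
    effect_size_per_adopter = 120.0   # extra kg of fish per household per year
    adopting_households = 50_000      # scale of adoption/uptake

    total_impact = effect_size_per_adopter * adopting_households
    print(f"Estimated aggregate impact: {total_impact:,.0f} kg per year")

An evaluation that measures only the effect size, or only the adoption scale, supplies just one factor of the product; ex post impact assessment needs both.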

Page 23: Workshop: Monitoring, evaluation and impact assessment

Evaluations

Process implementation evaluation
What did or did not get implemented as planned?

Rapid appraisals
Provide timely, relevant information to decision-makers on pressing issues they face in the project and program setting. The aim of this applied research is to facilitate a more rational decision-making process in real-life circumstances.

Impact evaluation
The classic evaluation that attempts to find out the changes that occurred, and to what they can be attributed.

Performance logic chain assessment
An evaluation strategy used to determine the strength and logic of the causal model behind the policy, program, or project.

Pre-implementation assessment
• Are the objectives well defined, so that outcomes can be stated in measurable terms?
• Is there a coherent and credible implementation plan that provides clear evidence of how implementation is to proceed, and how successful implementation can be distinguished from poor implementation?
• Is the rationale for the deployment of resources clear and commensurate with the requirements for achieving the stated outcomes?

Case studies
Use when a manager needs in-depth information to understand more clearly what happened with a policy, program, or project.

Page 24: Workshop: Monitoring, evaluation and impact assessment

Logic Models and Theory of Change

Photo: World Bank

Page 25: Workshop: Monitoring, evaluation and impact assessment

Conventional logic for achieving results

INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → IMPACT (left to right over time)

“Are we efficient?” asks about the inputs-to-outputs legs of the chain; “Are we effective?” asks about outcomes and impact.

Inspired by Jeff Conklin, cognexus.org
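
One way to read those two questions as quantities, purely as an illustration (the ratios below are a common M&E heuristic, and every value is hypothetical rather than taken from the slides):

    # Efficiency concerns the input-to-output legs; effectiveness the outcome leg.
    inputs_spent_usd = 250_000                 # budget consumed
    planned_outputs, actual_outputs = 40, 36   # e.g. training sessions delivered
    planned_outcomes, actual_outcomes = 10, 4  # e.g. villages adopting a practice

    cost_per_output = inputs_spent_usd / actual_outputs
    output_delivery = actual_outputs / planned_outputs        # "Are we efficient?"
    outcome_achievement = actual_outcomes / planned_outcomes  # "Are we effective?"

    print(f"cost per output: ${cost_per_output:,.0f}")
    print(f"outputs delivered: {output_delivery:.0%}, outcomes achieved: {outcome_achievement:.0%}")

A project can look good on the first two numbers and poor on the third, which is exactly why the linear logic works well for outputs but less well for outcomes and impact, as the next two slides argue.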

Page 26: Workshop: Monitoring, evaluation and impact assessment

Works well for outputs

INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → IMPACT (left to right over time)

Outputs include workshops, training manuals, research and assessment reports, guidelines and action plans, strategies, and technical assistance packages, amongst others.

Inspired by Jeff Conklin, cognexus.org

Page 27: Workshop: Monitoring, evaluation and impact assessment

But not so well for outcomes and impact

INPUTS → ACTIVITIES → OUTPUTS → OUTCOMES → IMPACT (left to right over time)

Outcomes and impact involve multiple pathways of change in the behavior of different, interacting actors; this is where we have the possibility of collecting and making sense of evidence that sustains the impact we are aiming to contribute to.

Social change is long-term, complex, and is the result of what many actors do (their actions and interactions).

Inspired by Jeff Conklin, cognexus.org

Page 28: Workshop: Monitoring, evaluation and impact assessment

Logic Models and Theory of Change

• Terms often used interchangeably
• Confusion by funders and grantees about expectations
• Limited knowledge on how to use
• TOC and LMs can “blend” into each other

Page 29: Workshop: Monitoring, evaluation and impact assessment

Logic Models

• 30-year history
• Clear identification of goals (outcomes)
• First widespread attempt to depict program components so that activities matched outcomes

Page 30: Workshop: Monitoring, evaluation and impact assessment

What is a logic model?

Inputs → Activities → Outputs → Intermediate Outcomes → Long-term Outcomes

Basic United Way format, 1996
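
To make the format concrete, here is a minimal sketch of the United Way chain as a data structure; the class and the example entries are hypothetical illustrations, not part of the slide:

    from dataclasses import dataclass, field

    @dataclass
    class LogicModel:
        # One field per column of the basic United Way format.
        inputs: list[str] = field(default_factory=list)
        activities: list[str] = field(default_factory=list)
        outputs: list[str] = field(default_factory=list)
        intermediate_outcomes: list[str] = field(default_factory=list)
        long_term_outcomes: list[str] = field(default_factory=list)

    # Hypothetical aquaculture training project.
    model = LogicModel(
        inputs=["donor funds", "extension staff"],
        activities=["train fish farmers"],
        outputs=["200 farmers trained"],
        intermediate_outcomes=["improved pond management"],
        long_term_outcomes=["higher household fish production"],
    )
    print(model)

Reading the fields left to right gives the program description; note that nothing in the structure says why one column should lead to the next, which is the gap a theory of change fills.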

Page 31: Workshop: Monitoring, evaluation and impact assessment

Theories of Change

• Popularized in the 1990s to capture complex initiatives
• Outcomes-based
• Causal model
• Articulate underlying assumptions

Page 32: Workshop: Monitoring, evaluation and impact assessment

What is a Theory of Change?

[Diagram: a long-term outcome at the top, with necessary preconditions branching beneath it: all outcomes that must be achieved BEFORE the long-term outcome. At each link the WHY is explained, and activities are shown as well.]
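
A minimal sketch of that structure as a small precondition graph, with hypothetical outcome names and rationales (none of them appear in the slides):

    # Theory of change as a graph: each outcome lists the necessary
    # preconditions beneath it, plus the WHY for the causal link.
    theory_of_change = {
        "higher household fish production": {   # long-term outcome
            "preconditions": ["farmers adopt improved pond management"],
            "why": "better pond management raises yields",
        },
        "farmers adopt improved pond management": {
            "preconditions": ["farmers trained", "inputs locally available"],
            "why": "adoption needs both know-how and access to inputs",
        },
        "farmers trained": {"preconditions": [], "why": None},
        "inputs locally available": {"preconditions": [], "why": None},
    }

    def preconditions_first(outcome, toc, done=None):
        # Order outcomes so every precondition comes before what it supports.
        done = [] if done is None else done
        for pre in toc[outcome]["preconditions"]:
            preconditions_first(pre, toc, done)
        if outcome not in done:
            done.append(outcome)
        return done

    print(preconditions_first("higher household fish production", theory_of_change))

The ordering function is just the slide’s rule made executable: all preconditions must be achieved BEFORE the long-term outcome they support.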

Page 33: Workshop: Monitoring, evaluation and impact assessment

How are they different?

Logic models graphically illustrate program components, and creating one helps stakeholders clearly identify outcomes, inputs and activities

Theories of Change link outcomes and activities to explain HOW and WHY the desired change is expected to come about

Page 34: Workshop: Monitoring, evaluation and impact assessment

How are they different? (1)

Logic Models usually start with a program and illustrate its components

Theories of Change may start with a program, but are best when starting with a goal, before deciding what programmatic approaches are needed

Page 35: Workshop: Monitoring, evaluation and impact assessment

How are they different? (2)

Logic Models require identifying program components, so you can see at a glance if outcomes are out of sync with inputs and activities, but they don’t show WHY activities are expected to produce outcomes

Theories of Change also require justifications at each step – you have to articulate the hypothesis about why something will cause something else

Page 36: Workshop: Monitoring, evaluation and impact assessment

How are they different? Summary

Logic Models
• Representation
• List of Components
• Descriptive

Theories of Change
• Critical Thinking
• Pathway of Change
• Explanatory

Page 37: Workshop: Monitoring, evaluation and impact assessment

When to Use?

Logic Models when you need to:
• Show someone something they can understand at a glance
• Demonstrate you have identified the basic inputs, outputs and outcomes for your work
• Summarize a complex theory into basic categories

Page 38: Workshop: Monitoring, evaluation and impact assessment

When to Use?

Theories of Change when you need to:
• Design a complex initiative and want to have a rigorous plan for success
• Evaluate appropriate outcomes at the right time and in the right sequence
• Explain why an initiative worked or did not work, and what exactly went wrong