Evaluation Café: A Study of Three Developmental Evaluations (2020-03-11)


1

Evaluation Café:

A Study of Three

Developmental Evaluations

This presentation is made possible by the support of the American People through the United States Agency for International Development (USAID). The contents of this presentation are the sole responsibility of the DEPA-MERL Consortium and do not necessarily reflect the views of USAID or the United States Government.

March 2020

2

Who we are

3

Before we begin:

• How familiar are you with Developmental Evaluation (DE)?

• How familiar are you with Outcome Harvesting?

• How many are interested in international evaluation work?

4

• Introduction to DEPA-MERL

• Overview of research questions & methods

• Key lessons learned

• Audience questions

Our agenda today

5

• WDI working with USAID since 2015

• MERLIN Program:

– Innovate on traditional approaches to monitoring, evaluation, research and learning (MERL)

• DEPA-MERL is a consortium under MERLIN that implements developmental evaluations in the USAID context

Introduction to DEPA-MERL

6

• Supports the continuous adaptation of development interventions in complex environments

• Developmental Evaluators are “embedded” within the program

– Use a variety of M&E approaches

What is developmental evaluation?

7

How we applied DE

• Three DE pilots were conducted:

– Family Care First in Cambodia

– Sustained Uptake DE, the US Global Development Lab (Washington, DC)

– Bureau for Food Security (Washington, DC)

• Each DE pilot included:

– 1 full-time, external Developmental Evaluator

– A “DE Administrator” from DEPA-MERL

8

Our research questions


Research question 1: How does DE capture, promote, and enable the utilization of emergent learnings in support of ongoing programming in a complex system, in the USAID context?

Methods: Outcome harvesting (qualitative)

Data sources:

● Monthly reflection interviews with two Developmental Evaluators (n=35)

● Substantiation interviews with key stakeholders at endline of two DEs (n=26)

● Document reviews, as required

Research question 2: What are the barriers and enablers to implementation of DE in the USAID context?

Methods: Semi-structured interviews (qualitative)

Data sources:

● Monthly reflection interviews with two Developmental Evaluators (n=35)

● Substantiation interviews with key stakeholders at endline of two DEs (n=26)

Research question 3: What is the perceived value of conducting a DE, especially versus a traditional evaluation approach?

Methods: Survey (quantitative and qualitative)

Data sources:

● Value of Developmental Evaluation Survey with DE stakeholders at endline (n=30)

9

Outcome harvesting approach

10

What was the most surprising finding from our research on DE?

Ben White

11

What is the biggest difference between DE in theory and DE in practice?

12

If you were approached by a new DE implementer, what advice would you give them about how to successfully launch and oversee a DE?

https://bit.ly/32kJGOA

13

What is the most controversial idea or topic in DE right now?

Diego PH

14

Resources!

https://bit.ly/2JXCUru https://bit.ly/2VkTbgO

15

Questions?

16

USAID-MERLIN (DEPA-MERL)

Shannon Griswold, sgriswold@usaid.gov

WDI

WDI-performancemeasurement@umich.edu

Contact us!

WDI Website

https://bit.ly/2NtuVU7

17

Annex

18

What is developmental evaluation?

• Developmental Evaluation (DE) is an approach to evaluation that supports the continuous adaptation of development interventions.

• DE provides evaluative thinking and timely feedback to inform ongoing adaptation as needs, findings, and insights emerge in complex, dynamic situations.

• The DE helps facilitate the process from findings to action in a collaborative process with the DE stakeholders.

[Figure: Intended, Implemented, Unrealized, Emergent, and Realized Outcomes]

Source: Henry Mintzberg, Sumantra Ghoshal, and James B. Quinn, The Strategy Process, Prentice Hall, 1998.

What outcomes can you expect from DE?

20

Purpose

– Traditional Evaluation: Supports improvement, summative tests, and accountability

– Developmental Evaluation: Supports development of innovation and adaptation in dynamic environments

Standards

– Traditional Evaluation: Methodological competence and commitment to rigor; independence; credibility with external authorities

– Developmental Evaluation: Methodological flexibility and adaptability; systems thinking; creative and critical thinking balanced; high tolerance for ambiguity; able to facilitate rigorous, evidence-based perspectives

Options

– Traditional Evaluation: Traditional research and disciplinary standards of quality dominate options

– Developmental Evaluation: Utilization focused; options are chosen in service to developmental use

Evaluation Results

– Traditional Evaluation: Detailed formal reports; validated best practices, generalizable across time and space

– Developmental Evaluation: Rapid, real-time feedback; diverse, user-friendly forms of feedback; evaluation aims to nurture learning

How is DE different?

• The DE evaluator works collaboratively with implementing teams to conceptualize, design, and test new approaches in a long-term, ongoing process of adaptation, intentional change, and development.

• The DE evaluator thinks and engages evaluatively; questions assumptions; applies evaluation logic; uses appropriate methods; and stays empirically grounded, that is, rigorously gathers, interprets, and reports data.

21

What is a developmental evaluator?

22

Analysis Categories for the Harvested Outcomes (RQ1)

Factor Percent of all enablers* Percent of all barriers*

Skills of the Developmental Evaluator 17% 8%

Data collection and sharing 16% 10%

Data utilization 12% 8%

Integration of the Developmental Evaluator 11% 10%

Leadership (of program being evaluated) 11% 15%

Stakeholder relationships 9% 11%

DE readiness 8% 8%

USAID dynamics 7% 14%

Funding dynamics 2% 6%

Local and international dynamics 2% 4%

* Percentages do not total 100% because only 10 of the 13 factors that were coded across the DE pilots are shown.

23

Barriers and Enablers (RQ2)

24

Value of Developmental Evaluation Survey (RQ3) (1/2)

Takeaway: In all areas except DE cost and time savings, survey respondents said the DEs were better than traditional evaluation (n=29).

25

Value of Developmental Evaluation Survey (RQ3) (2/2)

Takeaway: The overwhelming majority of respondents reported positive interactions with their Developmental Evaluators.

26

Description of codes (1/2)

27

Description of codes (2/2)

28

The DEPA-MERL DE pilots