Evaluators’ Roles and Role Expansion

Canadian Evaluation Society

June 2, 2003

Vancouver, British Columbia

Overview

Understanding the Expanding Role of Evaluators
– Touchstones
– Dilemmas

Complexity and Role Expansion from a Foundation Perspective
– Context/Philosophy
– Challenges

Evolution and Role Expansion from a Consultant Perspective
– Strategy
– Reciprocity

Evaluation Standards as Guideposts

Dialogue and Discussion

“Traditional” Role of Evaluators

Evaluation is defined as a process to determine an evaluand’s merit or worth

With this definition and its conceptual emphasis on evaluation’s objectivity, the role of the evaluator is portrayed as that of a judge

In practice, evaluators have tried to keep their distance: the non-intrusive observer, the quiet note-taker, the non-emotional (thus “objective”) presenter

This & That about Evaluation

Types of Evaluation
– Context Evaluation: What has been done before? What has happened? Who made it and who didn’t?
– Process Evaluation: What happened and how?
– Outcome and Impact Evaluation: What worked and didn’t work? Why and why not?
– Lessons Learned: unintended outcomes, social learning

Types of Data
– Qualitative Data: to answer the how, why, in what way, what, where, who type of evaluation questions
– Quantitative Data: to answer the how many, how much, to what extent type of questions
– Secondary “Census”/Sector Data

How the Evaluator’s Role Is Expanded

Context Evaluation → the evaluator becomes involved in program planning as an information feeder, and could also be viewed by some as an “expert” in the field, because information is POWER

Process Evaluation → the evaluator becomes involved in program implementation as a messenger, technical assistance provider, and in some cases a facilitator (of discussions)

How the Evaluator’s Role Is Expanded (continued)

Outcome Evaluation → the evaluator becomes involved in determining the program’s “fate” as a potential decision maker, as the program’s advocate, or, worse, as the program’s enemy

Lessons Learned → evaluators become involved in disseminating or sharing program products and engage in the learning process with program staff

More about Evaluators’ Role Expansion

Empowerment evaluation and participatory evaluation approaches are becoming more and more popular in practice

evaluation becomes part of the intervention

evaluators engage with program staff and participants in evaluation design, data collection, analysis, report writing, etc.

evaluators act as trainers, monitors, technical assistance providers, coaches, etc.

A Foundation Perspective

Funders seeking better outcome data
– Learning from / improving grant-making practices
– Accountability

“Strategic Philanthropy” movement
– More targeted outcomes
– Charity v. systems change
– At extreme: ROI

A Foundation Perspective (continued)

Tracking / following clients across service delivery systems (e.g., courts / foster care; preschool / school) as an element of the change strategy
– Technical assistance on data collection for individual projects
– The cluster / initiative evaluator is asked to fill this role

Strategic Philanthropy

Challenges for Evaluation
– Multiple stakeholders
– Evaluation as Intervention
– Evaluator as Interpreter
– Evaluating sustainability

Challenge: Multiple Stakeholders

Formative and summative initiative evaluation with different stakeholders

– Board: How do we know we invested wisely?
– Program staff: How can we do better?
– Grantees: How can we learn from each other? Are we doing OK?

Requires the initiative evaluator to address multiple needs – more sophisticated evaluation designs and personal relationships

Challenge: Evaluation as Intervention

Part of change strategy may be collecting data across systems (e.g., foster care / adoption; pre-school / public education)

Projects (sites) encouraged to use data for local decision making

Technical assistance on local evaluation may be key part of change strategy

Initiative evaluator in both evaluator and intervention roles

– Requires clarity on potential conflict of interest

Challenge: Evaluator as Interpreter

In TA role, may interpret funder’s intent to grantees
– What data are “good enough”?
– How can our community meet local needs and contribute data to the overall evaluation?

In evaluator role, need to interpret local outcomes in context of overall strategic intent

– Are local outcomes measuring the right things?

Requires different skills – maybe multiple evaluators?

Challenge: Evaluating Sustainability

Sustainability has many definitions
– Specific program
– The organization
– Networks / partnerships

WKKF emphasizes sustaining capability to address local concerns

– Creating “adaptive systems”

Need for better ways of assessing adaptability of community systems

Implications for Initiative Evaluations

Think of evaluation differently
– Evaluating the foundation, not (only) grantees: Was our systems change theory supported?
– More directive about project-level evaluation

Develop skills in systems analysis

Evolution and Role Expansion from a Consultant Perspective

Strategy/Design
Reciprocity
Epistemology
Connecting “what you do” with “what you get…” – seeing the “blind spots”

Blind Spot Demonstration

Logic Models Shape Strategy

Role as catalyst and learning coach
BUT Logic Models are very subjective
– Perception
– Persuasion
– Politics

Who develops them matters
Don’t assume they are “gospel” or the “truth”
Consider an external design review panel

Reciprocity Increases Risk

Organizational vs Foundation Effectiveness
Evaluation role influences both sides of philanthropy
The press for accountability may limit innovation and experimentation
Demand for outcomes without attention to quality and timing may lead to greater escalation and hyperbole

Ways of Knowing Shape Questions and Answers

Suggested Guidelines – Joint Committee Standards

Conflict of Interest

– Identify and clearly describe possible sources

– Agree in writing on procedures

– Seek advice

– Release evaluation procedures, data, reports publicly, when appropriate

– Obtain evaluation contract from funders, whenever possible

– Assess situations

– Make internal evaluators directly responsible to agency heads

– Metaevaluations

Suggested Guidelines – Joint Committee Standards

Metaevaluation
– Budget sufficient money and other resources
– Responsibility assignment
– Have chair nominated by a respected professional body
– Determine and record rules
– Discretion of the chairperson
– Final authority for editing the report
– Determine and record the report audience

Suggested Guidelines – Joint Committee Standards

Evaluator Credibility
– Stay abreast of social and political forces associated with the evaluation
– Ensure that both work plan and composition of evaluation team are responsive to key stakeholders’ concerns
– Consider having evaluation plan reviewed and evaluation work audited by another evaluator whose credentials are acceptable to the client
– Be clear in describing evaluation plan
– Determine key audience needs for information
– State evaluator’s qualifications relevant to program being evaluated

Suggested Guidelines – Joint Committee Standards

Political Viability
– Evaluation should be planned and conducted with anticipation of the different positions of various interest groups, so that their cooperation may be obtained, and so that possible attempts by any of these groups to curtail evaluation operations or to bias or misapply the results can be averted or counteracted

Impartial Reporting
– Reporting procedures should guard against distortion caused by personal feelings and biases of any party to the evaluation, so that evaluation reports fairly reflect the evaluation findings