Transcript of "Doing Program Evaluation"
Sue Dyson, The Australian Research Centre in Sex, Health & Society, La Trobe University

Page 1

Doing Program Evaluation
Sue Dyson
The Australian Research Centre in Sex, Health & Society, La Trobe University

Page 2

Aims

This workshop will:
• Provide information about evaluation approaches.
• Provide opportunities to think about approaches to evaluating PVAW (prevention of violence against women) programs.

Page 3

What is Evaluation?

• Evaluation is the process by which we judge the worth or value of something (Suchman, 1967).
• ‘…a critical component of every effective program…one step of an ongoing process of planning, implementation and review’ (PADV, 2000).
• ‘…a continuous process of asking questions, reflecting on the answers and reviewing your ongoing strategy and action’ (National Mental Health Promotion and Prevention Working Party, 2001).
• Evaluation ideally starts at the beginning!

Page 4

Why evaluate?

• Increased external demands for accountability

• To understand what works and what does not work in a program, and why.

• To continuously improve.
• To learn from mistakes and build on strengths.
• To inform future planning.

Page 5

Evaluation and social programs

• Takes place within a political and organizational context.

• Requires sensitivity to multiple stakeholders and awareness of political context.

• Influenced by external factors which cannot always be accounted for or measured ‘objectively’.

Page 6

Ethical Evaluation

• Trust, mutual responsibility and equality.

• Those being researched are ‘participants’ not ‘subjects’.

• Should not impose a burden on participants and ideally offer some benefits.

• Based on informed consent not coercion or manipulation.

• Should respect and protect privacy and anonymity.

Page 7

Principles of ethical research

• Research should have merit.
• Be fair and just.
• Must do no harm or cause discomfort to the participants or the wider community.

National Health and Medical Research Council guidelines: http://www.nhmrc.gov.au/publications/synopses/_files/e72.pdf

Page 8

Stages of Evaluation

• Formative: aims to understand the development and implementation of a project.
• Summative: aims to understand what is happening as the work proceeds. Focus on processes.
• Outcomes: what has been achieved at the end of the project?
• Impact: have outcomes been sustained over time?

Page 9

Evaluation starts at the beginning

Page 10

Goals

• A goal is a short, concise, general statement of what you are setting out to do.

• It is usually not phrased in quantified terms.
• It should point towards some long-term effect, change, or purpose.
• A good program goal has strategic force to it.

Page 11

Examples of goals

• To eliminate sexism and promote equal and respectful relationships in a community football club.
• To promote gender equity and respectful relationships in a school community using a whole-school approach.
• To eliminate gender inequity and drive cultural change across partner agencies to redress the determinants of violence against women.

Page 12

Objectives

• An objective is a specific, action-oriented, and quantifiable statement of what will be achieved.

• It is a statement of outcomes which can be used to determine progress towards the stated goal.

• There should be at least one objective for each component of your program.

Page 13

• The objectives should, taken together, have the effect of achieving the overall program goal.

• There should be cohesion among the objectives.
• An objective which is significantly different from the others needs to be evaluated to determine whether it belongs as a part of the proposed program.

Page 14

Examples of objectives

By the end of this project participants will:
• Understand sexism and its role in gender-based violence. (How will you know?)
• Demonstrate respectful attitudes towards each other. (How will you know?)
• Be more inclusive and respectful of diversity. (How will you know?)

Page 15

Tips for defining a set of objectives

• Make a list of things that must be done to achieve your program goals.

• Rewrite each item on the list in the form of a result that can be measured, described or understood.

• Review all of your proposed objectives and adjust them to achieve an appropriate balance between them.

Page 16

Indicators and Measures

• Indicators describe how you will know you have achieved an objective.

• Measures can specify numbers or percentages, or describe or demonstrate how you know/understand the change specified in your objective/s.
• To measure or describe, you collect data.

Page 17

Performance measures

• Data can be qualitative or quantitative (count or describe).
• They might measure project outputs, e.g. numbers/demographics/sessions; quantify products like publications (how much did we do?); or measure effect, i.e. changes in knowledge, attitudes or behaviours.
• Measures can also describe changes, draw on emergent themes and tell stories.

Page 18

Qualitative or Quantitative?

• Quantitative: should pass a test for reliability, feasibility and utility for decision making. Usually based on a large enough sample to be able to generalize.
• Qualitative (proxy indicators): a way of understanding in the absence of metrics. More subjective. Should be achievable and realistic, and enable assessment of change over time.

Page 19

Quantitative evaluation data

• Surveys, pre and post tests (with closed questions), feedback sheets (with closed questions).
• Observation checklists.
• Advantages: easy to administer, yields large numbers, easier to summarise and more widely accepted as ‘evidence’.
• Disadvantages: doesn’t tell the whole story, requires sophisticated analysis (a simple worked sketch follows below).
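As a purely illustrative sketch that is not part of the original slides, matched pre and post survey scores could be summarised with a few lines of standard-library Python; the file name and the pre_score/post_score column names are hypothetical.

```python
# Minimal sketch: summarise matched pre/post survey scores.
# The CSV file and the "pre_score"/"post_score" columns are hypothetical.
import csv
from statistics import mean

def summarise_pre_post(path):
    """Return sample size, pre/post means and the mean change."""
    pre, post = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            pre.append(float(row["pre_score"]))
            post.append(float(row["post_score"]))
    return {
        "n": len(pre),
        "pre_mean": round(mean(pre), 2),
        "post_mean": round(mean(post), 2),
        "mean_change": round(mean(post) - mean(pre), 2),
    }

if __name__ == "__main__":
    print(summarise_pre_post("workshop_survey.csv"))  # hypothetical file name
```

A real analysis would also need attention to matching, missing data and significance testing; this sketch only shows the mechanics of turning closed-question responses into a summary measure.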

Page 20

Qualitative Evaluation Data

• Descriptive, narrative, tells the story behind the numbers.
• Includes interviews, focus groups, questionnaires with open-ended questions, journals, observation field notes.
• Advantages: greater depth and insight into the why and how of attitudes and behaviours; provides context, nuance and detail.
• Disadvantages: more subjective, time-consuming, smaller samples.

Page 21

Mixed Methods

• Draws on both qualitative and quantitative methods.
• Advantages: provides metrics and measures as well as stories about what happened. Can be done collaboratively so that insider knowledge is combined with external expertise.
• Disadvantages: requires skill to collect and analyse data; possibility of over-reaching and ending up with too much data if you go it alone.

Page 22

It’s about more than metrics!

Effective program evaluation does more than focus on numbers. It makes it possible to:

• Keep stakeholders informed

• Learn from mistakes and continually improve

• Tell the story behind the project and what was learned along the way.

• Be accountable.

Page 23

SMART Indicators (quantitative)
• Specific to the change to be measured
• Measurable and unambiguous
• Attainable and sensitive
• Relevant and easy to collect
• Time bound, with term dates for measurement

SPICED Indicators (qualitative)
• Subjective
• Participatory
• Interpreted & communicable
• Cross checked
• Empowering
• Diverse and disaggregated

Page 24

Proxy Indicators

• Individual level: how do we know there have been changes?

• Community level: how do we know there have been changes?

• Societal level: how do we know there have been changes?

Page 25

Building a Logic Model

Page 26

Logic Models

• A conceptual map for a program, project or intervention.
• A framework for action.
• A program theory, a theory of change.
• A way of graphically depicting what you want to achieve and how you will do it.
• Should provide logic and clarity by presenting the big picture of planned changes along with the details of how you will get there.

Page 27

Theory of change

• Building blocks required to bring about a given long-term goal

• Specific and measurable description of a social change initiative that forms the basis for strategic planning

• Should be clear on long-term goals, formulate actions to achieve goals (objectives), and identify outcome indicators.

Page 28

Key points

• Clarify language and use it consistently.
• Start with goals.
• Link intended activities and outcomes with goals.
• Be prepared for unexpected effects.
• There is no ‘correct’ way of depicting a logic model.

Page 29

Goal:

Objective | Action/s | Indicators | Measures | Data
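As a purely illustrative aid that is not part of the original slides, one filled-in row of this planning template might be sketched as a simple data structure; all of the program content below is hypothetical and loosely based on the example goals and objectives earlier in the deck.

```python
# Hypothetical example of one row of the Goal / Objective / Action/s /
# Indicators / Measures / Data planning template. Content is illustrative only.
logic_model_row = {
    "goal": "Promote gender equity and respectful relationships in a school community",
    "objective": "By the end of the project, participants demonstrate respectful "
                 "attitudes towards each other",
    "actions": ["Classroom workshops", "Staff training", "Whole-school policy review"],
    "indicators": ["Reported attitudes towards respectful relationships"],
    "measures": ["Change in pre/post survey scores", "Themes from focus groups"],
    "data": ["Pre and post surveys", "Focus group transcripts", "Observation notes"],
}

# Simple completeness check: every column of the template should be filled in.
for column, value in logic_model_row.items():
    assert value, f"Missing content for '{column}'"
print("All template columns are filled in.")
```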

Page 30

Page 31

Program or intervention goal

Inputs/resources: staff, volunteers, time, money, evidence base, equipment, technology, partners.

Outputs/Activities:
• What we do: workshops, meetings, services, products, training, resources, facilitate, partner.
• Who we reach: clients, agencies, decision makers, service users, etc.

Outcomes/impact:
• Short term (learning): knowledge, attitudes, skills, awareness, opinions, aspirations.
• Medium term: behaviour, practices, decision-making, policies, social actions, sustained knowledge.
• Long term: sustained behaviour change, social/economic/civic conditions, cultural change.

Assumptions; context or conditions; external factors.

Page 32

When to use a logic model

• From the beginning: it provides you with a road map for your project.
• Must be a work in progress: a continuous process of intentional review and revision.
• Objectives can change; expect unintended outcomes.
• In reports it is acceptable to discuss what changes occurred and explain unexpected outcomes.

Page 33

Limitations of logic modelling

• Can be time-consuming and does not account for unintended consequences.
• Can become onerous – needs a balance between complexity and oversimplification.
• No guarantee of logic (or of success); must be plausible and feasible.

Page 34

Benefits of logic models

• They integrate planning, implementation and evaluation.
• Prevent mismatches between activities and effects.
• Leverage the power of partnerships.
• Enhance accountability.

Page 35

Evaluation Resources

Community Toolbox: http://ctb.ku.edu/en/table-of-contents/overview/model-for-community-change-and-improvement

Better Evaluation: http://betterevaluation.org/

Page 36

Task

• Working in a group, start to develop a logic model to plan for program implementation and evaluation.
• What do you want to do?
• How will you do it?
• How will you know you have been successful?
• How will you know?