
Victoria Law Foundation Better Information Workshop

How to evaluate community legal education

22 September 2016

Bridget McAloon Allyson Hose

Evaluation Coordinator Community Legal Education Coordinator

PRINCIPLES

Flexible, practical and fit-for-purpose

Useful and Used

Shared dialogue

Accountability

Openness and Transparency

Dr Hugh McDonald, Law and Justice Foundation

“Modest, meaningful and measurable evaluation”

Corrections Victoria Evaluation Toolkit

“There is no perfect evaluation design and no perfect evaluation. There are always limitations and constraints that may influence the comprehensiveness or accuracy of the findings. Trade-offs will often need to be made between relevance, timeliness and practicality, and the resulting quality and accuracy of the data.”

Doable, Understandable, Measurable, Beneficial

DUMB

PLANNING AN EVALUATION

1. What is the purpose of the evaluation?

2. Who will use the information from the evaluation?

3. How is the initiative supposed to address identified needs?

4. What resources do you have to carry out the evaluation?

5. What is the timing of the evaluation?

6. What aspects of the initiative will be evaluated?

7. What are the key evaluation questions to be addressed?

8. How will we address those questions?

PURPOSE OF OUR PROJECTS

• Who is the CLE intended for?

• What is their capability?

• What is the aim of the CLE? What do you expect the target group to do with the information they receive?

• to self-help
• to get help
• to give help
• to reinforce help

STEP 1- SCOPING THE INITIATIVE

• PROBLEM/ RATIONALE

what problem is the initiative trying to address?

• TIMEFRAME

• RESOURCES

• TARGET GROUP

• OTHER STAKEHOLDERS

PROGRAM LOGIC

A way of thinking about change and mapping actions towards a result

A map that shows relationships between:

• resources

• activities that make up project

• the sequence of events in which they will need to happen

• consequences of those activities

• changes that result

ONE WAY OF TELLING THE STORY OF A PROJECT

A LOGIC MODEL

A structured way of thinking

Based on cause-and-effect relationships

Based on assumptions about

• What could happen

• The extent of control you might have over your activities

Also need to understand the context – including the environment, the stakeholders, the conditions and the consequences of actions

Events may be sequential, or one or more events may occur at the same time

KEY THINGS TO REMEMBER

• SO WHAT??

• What CHANGE are we trying to create?

• WHY?

• What do you expect people to do with CLE?

PLANNING AN EVALUATION

1. What is the purpose of the evaluation?

2. Who will use the information from the evaluation?

3. How is the initiative supposed to address identified needs?

4. What resources do you have to carry out the evaluation?

5. What is the timing of the evaluation?

6. What aspects of the initiative will be evaluated?

7. What are the key evaluation questions to be addressed?

8. How will we address those questions?

1. PURPOSE

Examples:

• To adjust an initiative as it is being implemented;

• To expand or wrap up an initiative;

• To test a new idea or approach;

• To ascertain the best model out of a range of alternatives;

• To measure the extent to which the expected outcomes were achieved;

• To support continuous improvement, through capturing lessons learned and recommendations for any improvements.

To PROVE and IMPROVE

2. USERS AND AUDIENCE

Evaluation Audience

Role | Information needs | Method for sharing results

• Intended Users: Those who will make decisions using the evaluation results

• Stakeholders/ Other Audience: Those affected by, and interested in, the results of the evaluation

4 AND 5. TIMING AND BUDGET

• Do you have extra $$ or people to do the evaluation?

• Is there a particular time you need to do the evaluation – e.g. for a donor or funding cycle?

• How much time do you have to do it? Consider data collection, analysis and writing up.

• How much time has the project been running for?

6. WHAT WILL YOU EVALUATE – THE SCOPE

Depending on purpose, timing and resources, what will you prioritise to evaluate?

Example:

• Some components of the activities or services, but not all

• Some outcomes

• Specific locations

• Some of the target groups, but not all

6. WHAT WILL YOU EVALUATE – THE SCOPE

Some other key considerations:

• How much change over time is realistic?

• What changes are we actually able to influence – either directly or indirectly?

• There may be more than one outcome or result – which is/are the most realistic to measure?

7. KEY EVALUATION QUESTIONS - KEQS

I keep six honest serving men

(they taught me all I knew);

Their names are What and Why and When

And How and Where and Who

[Rudyard Kipling – The Elephant’s Child]

7. KEY EVALUATION QUESTIONS

• KEQs - What the key intended users want to know from the evaluation

• The W’s (and H) – What, Who, When, to What extent, Why, for Whom, How?

• Effectiveness, Efficiency, Appropriateness, Sustainability

• What are the questions we want to ask to tell us if something has been completed, or if a CHANGE has occurred?

EXAMPLE KEQS

• How many people are receiving services?

• Are those receiving services the intended targets?

• Is the service having a beneficial effect on those receiving it?

• Has there been an increase in knowledge as a result of the intervention?

• How are the intended beneficiaries using the information?

• Has the problem or situation the services were intended to address been improved?

• How much has it cost?

• Would another approach have yielded the same benefits at less cost?

EXAMPLE KEQS

• What factors – both enabling and challenging – affected the planning and delivery of the services?

• What worked? What didn’t?

• What unexpected outcomes or changes (positive and negative) were brought about as a result of activities?

• What is the capacity of the partners to take forward the work of the project? How has the project contributed to strengthening this capacity?

• What needs to be further put in place to sustain the benefits of the project?

8. TOOLS - HOW DO WE KNOW? HOW CAN WE KNOW?

The choice of method is dependent on:

• Existing capabilities

• Availability and accessibility of existing data

• Reasonable data collection you can build in from the start of a project

• Ethical considerations

• Time and resources – not just to collect data but also to analyse and write up

• The KEQ you are trying to answer.

• The access you have to the different clients/ users/ stakeholders

TIPS FOR TOOLS

TRIANGULATION – MIX OF QUANTITATIVE AND QUALITATIVE

RELEVANCE, USE, TIMELINESS, PRACTICALITY

VALUE THE VOICES OF PROJECT STAFF

EXAMPLE TOOLS

• Surveys

• Interviews

• Focus Groups

• Observation/ site visits

• Standardised tests

• Document review

• Case Studies
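Several of these tools produce data that needs simple tabulation before it can answer a KEQ. As a hedged illustration – the survey question and responses below are invented, not from the workshop – a few lines of Python can turn raw survey answers into counts and percentages:

```python
from collections import Counter

# Hypothetical example: tallying responses to one survey question.
# The response data below is invented for illustration only.
responses = [
    "agree", "strongly agree", "agree", "neutral",
    "agree", "disagree", "strongly agree", "agree",
]

counts = Counter(responses)   # frequency of each answer
total = len(responses)

# Print each answer with its count and share of all responses
for answer, n in counts.most_common():
    print(f"{answer}: {n} ({n / total:.0%})")
```

Even this small step (counts plus percentages) makes survey results easier to triangulate against qualitative sources such as interviews or focus groups.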

ANY LIMITATIONS?

Topic: three sites consulted

• Kino Mender Kebele (1st Kebele), Debark Woreda
• Dara Kebele (2nd), Dabat Woreda
• Work Demo Kebele (3rd), Wogera Woreda

Estimated number of people (m/f):

• approximately 50 women – including pregnant women, lactating mothers and other women with children
• approximately 80+ women – including pregnant women, lactating mothers and other women with children; 1 male who came in place of his wife; group also included female RLG members
• approximately 30 women

Role/activities involved in/training received etc.:

• Training included: identifying pregnant women; registering mothers and children under 2 with HEWs; insisting children get immunised; taking pregnant mothers to the facility for 4 ANC visits; 6 months' exclusive breastfeeding; how to facilitate a group and educate other members (for leaders)
• MSG leader role includes: educating the community on giving birth at a facility, post-pregnancy follow-up, vaccination and sanitation; meeting every 2 weeks; monitoring; promoting the quality services mothers will get at the facility
• HDA – 15 members; educate mothers every week; encourage vaccination, EBF and IYCF; encourage mothers to the HC so mothers do not die while giving birth; 4+ ANC visits; encourage mothers to the HF, with vaccination after the HC; follow the vaccine program and IYCF seriously; encourage delivery at a health facility

What is good and what has worked? (including relevance, usefulness):

• The best thing to get women to go to the facility is increasing awareness and knowledge
• Twice-monthly MSG meetings
• The face-to-face nature of the meetings is a huge advantage
• Activities are changing mothers' awareness; coming every 2 weeks for the activity
• Radio listening groups – after the radio show, visiting religious places and schools and providing information and education: stopping child marriage, keeping girls in school and the benefit of education, stopping harmful traditional practices
• Good to establish mother support group discussions in other areas
• Strengthen group discussion – how? Not by stopping discussion; better to continue

[Refer below everything people have learned]

Working well

Key things to focus on in the next 2 years.

EXAMPLE TOOLS

Ranking exercises

The 1 thing you are happiest with/ most proud of so far.

The 2 things you think the project should focus on for the last two years.

Tweeting exercise

Write in 140 characters the most important thing you learnt

OTHER TOOLS/ METHODS

• Reflection workshops

• ‘i’ statements

• Impact drawings

• Spider diagrams

• The ‘H’ Method

• Feedback walls

TOOLS

Key evaluation question | How can we answer? Tools/Method | Who and when
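A planning table like the one above (KEQ, method, who and when) can also be kept as structured data so it is easy to update and circulate. A minimal sketch – every KEQ, tool and owner below is a hypothetical example, not taken from the workshop:

```python
# Hypothetical sketch of an evaluation planning matrix as plain Python data.
# All KEQs, tools and owners here are invented examples.
planning_matrix = [
    {
        "keq": "Has there been an increase in knowledge as a result of the intervention?",
        "tools": ["pre/post survey", "focus group"],
        "who_and_when": "CLE coordinator, at the session and 3 months after",
    },
    {
        "keq": "Are those receiving services the intended targets?",
        "tools": ["registration data review"],
        "who_and_when": "Evaluation coordinator, quarterly",
    },
]

def matrix_as_text(matrix):
    """Render the planning matrix as a plain-text checklist."""
    lines = []
    for row in matrix:
        lines.append(f"KEQ: {row['keq']}")
        lines.append(f"  Tools/Method: {', '.join(row['tools'])}")
        lines.append(f"  Who and when: {row['who_and_when']}")
    return "\n".join(lines)

print(matrix_as_text(planning_matrix))
```

Keeping one row per KEQ also enforces the workshop's tip of choosing the method to fit the question, not the other way around.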

STAKEHOLDERS TO BE CONSULTED

Stakeholder group – Proposed Consultation Methods

• VLA Aboriginal and Torres Strait Islander staff: AD Aboriginal Services focus group; survey; focus group in final evaluation

• RAP steering committee: focus group in final evaluation; a selection of members attend the MTR workshop

• VLA staff responsible for delivering RAP activities: online survey

• SET members: a selection of members attend the MTR workshop; focus group/interviews in final evaluation; online survey

• Manager CLE: MTR workshop; targeted interview

• Manager Legal Help: interview in final evaluation

• Regional Managers (particularly Morwell/Bairnsdale, Shepparton, Bendigo): online survey; focus group discussions/interviews in final evaluation; a selection involved in the MTR workshop

• Program Managers and Policy staff: survey in final evaluation; a selection for focus group discussions in final evaluation

• VLA staff: online survey; a selection for focus group discussions in final evaluation

EVALUATION TIPS

1. Be clear about what the initiative is trying to achieve

2. Identify the intended use of the evaluation and its purpose

3. Limit the number of evaluation questions (KEQs)

4. Choose your method based on your KEQ – not the other way around

5. Keep it practical, and relevant, and based on your existing resources and capability

NEXT STEPS

• DOING

• ANALYSING RESULTS

• REPORTING RESULTS

• USING RESULTS

JUST GET STARTED – BUILDING A CULTURE

• Evaluative thinking is a way of doing business

• Most meaningful when it is embedded in an organisation’s culture

• Characterises a learning and innovating organisation

• Critical thinking and reflection are valued and reinforced

• Capacity to generate/ capture and use knowledge – in the form of formal/ informal evaluation results, lessons learned, data, case studies, practice experience etc.

EVALUATION AND EVALUATIVE THINKING TAKE PRACTICE

SOME IDEAS TOWARDS BUILDING THE CULTURE

• Acknowledge the informal evaluation that staff already do

• Involve staff to increase engagement and ownership

• Look for small successes

• Position evaluation as a way of giving staff a voice

• Be clear about who the evaluation is for

• Schedule time up front and build into planning (and resourcing) phase

• Decrease the use of jargon

MODEL EVALUATION AT EVERY OPPORTUNITY

An evaluation culture is about having good quality knowledge in the hands of our staff and our partners - that is used to develop and implement effective projects and legal services

Building a knowledge and evidence base that reflects your organisation's size, voice and strategic role in building access to justice

Evaluating our phone app Below the belt

About the app

Below the belt: sex, selfies, cyberbullying was a free Android phone app launched in November 2013 that had information about laws on sex and consent, sexting and cyberbullying.

The app contained fun, interactive tools like an age of consent calculator, and colourful e-postcards, as well as legal information specific to each state and territory.

Early focus testing of the concept showed young people liked it.

Initial enthusiasm for the project included free promotional support from entertainment and advertising companies. For a while it all seemed to be going well …

So what went wrong?

• Within six months, it was clear that the install rates were low and the uninstall rates were high

• At 12 months from launch, 1095 people had installed the app but 849 had uninstalled it

• The app was relatively cost-inefficient, with a cost per install of $42 – significantly higher than other educational tools such as publications or web pages
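The figures above can be combined into a couple of simple metrics. A minimal sketch in Python – note that the implied total spend is an inference from cost per install multiplied by installs, not a figure stated in the evaluation:

```python
# Figures reported at 12 months from launch
installs = 1095        # total installs
uninstalls = 849       # total uninstalls over the same period
cost_per_install = 42  # dollars per install, as reported

retained = installs - uninstalls
retention_rate = retained / installs

print(f"Retained users: {retained}")            # 246
print(f"Retention rate: {retention_rate:.1%}")  # 22.5%
# Implied spend is an assumption: cost per install x installs
print(f"Approx. spend implied: ${installs * cost_per_install:,}")  # $45,990
```

Putting the raw counts into a retention rate is what makes the "so what?" visible: fewer than a quarter of installers kept the app.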

Why we needed more evaluation

• We needed to know more than just the figures

• We needed to test our own assumptions about the project

• We wanted to understand more fully what went wrong and share our knowledge with other organisations considering apps and new technology projects

What we found

• The project concept was not adequately tested

• We did not consider the marketing model

• The app became unusable due to fragmentation

Things we’d do differently next time

• Determine the value proposition for the client

• Consider marketing strategies

• Technical experience

The value of evaluation

• Published on VLA website

• Presented at the National Association of Community Legal Centres (NACLC) conference

• Picked up in a report by Roger Smith, visiting professor at London South Bank University writing for the Law, Technology and Access to Justice website

“Those involved in access to justice have limited budgets. We tend to have one shot at success in a way that rarely limits commercial operations. So, the sharing of evaluations … is really valuable both for the organisations involved and for a wider audience.

And at the heart of the evaluation are the numbers. Every nerve of every experienced project pitcher will scream at the prospect of publicly proclaiming detailed targets and performance against them. But, it is really essential.

Under all the guff, how did you really do?”