Evelyn Gonzalez: Program Evaluation

Transcript

Page 1: Evelyn Gonzalez

Evelyn Gonzalez: Program Evaluation

Page 2: Evelyn Gonzalez

AR Cancer Coalition Summit XIV, March 12, 2013

MAKING A DIFFERENCE: Evaluating Programmatic Efforts

Page 3: Evelyn Gonzalez

Overview of evaluation
Defining SMART objectives for your goals
Know how to use different methods of evaluation
Be more willing to evaluate your efforts

OBJECTIVES

Page 4: Evelyn Gonzalez

…the systematic collection of information about the activities, characteristics, and outcomes of programs to make judgments about the program, improve program effectiveness, and/or inform decisions about future programming.

(Patton, Utilization-Focused Evaluation, 1997)

WHAT IS PROGRAM EVALUATION?

Page 5: Evelyn Gonzalez

Did the program/intervention work?

Was it worth it?

What worked; what didn't?

Whom did we reach?

Did we get our money's worth?

WHY EVALUATE

Page 6: Evelyn Gonzalez

WHEN SHOULD WE BEGIN EVALUATION?

Page 7: Evelyn Gonzalez

An evaluation plan is the “blueprint”

What will be evaluated
What information will be collected
When will it be collected
What will be done with the results

EVALUATION PLAN

Page 8: Evelyn Gonzalez

CDC FRAMEWORK FOR EVALUATION

Page 9: Evelyn Gonzalez

Planning Phase
Establish goals & objectives
Establish a baseline
Identify an evidence-based program (EBP)

Implementation Phase
Implement the program
Gather data as you go
Monitor

Evaluation Phase
Collect and analyze data, both as you implement and at the end of the program

Involve stakeholders (community/audience) throughout; share results with the community and stakeholders.

Page 10: Evelyn Gonzalez

The “grand reason” for engaging in your public health effort

Span 3 or more years

State the desired end result of the program.

GOALS: DEFINITION

Page 11: Evelyn Gonzalez

More specific than goals.

They state how the goals will be achieved in a certain timeframe.

Well written objectives are SMART:

Specific

Measurable

Achievable

Realistic and Relevant

Time-framed

OBJECTIVES: DEFINITION

Page 12: Evelyn Gonzalez

Specific

Who are you reaching (priority audience)?

What intervention will you use?

Where (the setting)?

S.M.A.R.T.

Page 13: Evelyn Gonzalez

Measurable

Dosing: how many times will you do the intervention?

What is the expected outcome?
Increase of X% following the intervention
Decrease of smoking by X%

S.M.A.R.T.

Page 14: Evelyn Gonzalez

Attainable: Is your intervention feasible?

Realistic and Relevant: Does the objective match the goal? Is it an evidence-based program (EBP)?

S.M.A.R.T.

Page 15: Evelyn Gonzalez

Time-framed: By when do you anticipate the change?
End of the session
3, 6, or 9 months
5 years

S.M.A.R.T.

Page 16: Evelyn Gonzalez

You are working on an intervention that will increase awareness about breast cancer risk

Objective 1: Participants will be aware of the major risk factors for developing breast cancer.

How can this be re-written to be SMART?

SMART OBJECTIVE EXERCISE

Page 17: Evelyn Gonzalez

Original: Participants will be aware of the major risk factors for developing breast cancer.

SMART objective: On a post-test following the intervention, participants will be able to identify 3 major risk factors for developing breast cancer.

SMART OBJECTIVE EXERCISE
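To make the measurement concrete, here is a minimal sketch (in Python, with hypothetical participant data and field names) of how post-test responses could be scored against this objective:

# Score post-test responses against the SMART objective:
# participants should identify 3 major risk factors.
# Hypothetical data: risk factors each participant named on the post-test.
posttest_responses = {
    "participant_01": ["age", "family history", "BRCA mutation"],
    "participant_02": ["age", "smoking"],
    "participant_03": ["family history", "age", "dense breast tissue"],
}

# Participants who met the objective (named 3 or more risk factors).
met = [pid for pid, factors in posttest_responses.items() if len(factors) >= 3]
pct = 100 * len(met) / len(posttest_responses)
print(f"{pct:.0f}% of participants identified 3 or more risk factors "
      f"({len(met)}/{len(posttest_responses)})")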

Page 18: Evelyn Gonzalez

Original: This program will increase screening for colorectal cancer in Arkansas.

SMART: Colorectal cancer screening will be increased by 5% over the prior year for age-appropriate males in Arkansas.

RE-WRITTEN:
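One detail worth settling when writing an objective like this: whether "5%" means a relative increase or 5 percentage points. A quick sketch with a hypothetical baseline shows the difference:

# Hypothetical baseline: 40% of age-appropriate males screened in the prior year.
baseline = 40.0

relative_target = baseline * 1.05   # 5% relative increase  -> 42.0% screened
absolute_target = baseline + 5.0    # 5 percentage points   -> 45.0% screened

print(f"5% relative increase:       {relative_target:.1f}% screened")
print(f"5 percentage-point increase: {absolute_target:.1f}% screened")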

Page 19: Evelyn Gonzalez

Objective 1: Public Education for Breast Cancer Screening –

Increase knowledge and improve attitudes of all women with regard to the importance of breast cancer screening

Strategy 1 – Promote campaigns to educate the public about the importance of mammography.

Action 1 – Increase awareness among all women 40 and older of the importance of regular breast cancer screening

GOAL: PROMOTE AND INCREASE THE APPROPRIATE UTILIZATION OF HIGH-QUALITY BREAST CANCER SCREENING

Page 20: Evelyn Gonzalez

Planning—Develop the questions, consult with the program stakeholders or resources, make a timeline

Data Collection—Pilot testing. How will the questions be asked? Who will ask them?

Data Analysis—Who will analyze the data and how?

Reporting—Who will report, and how? Who will receive the data, and when? How will it affect the program?

Application—How could your results be applied in other places?

THE EVALUATION PROCEDURE

Page 21: Evelyn Gonzalez

Look at the evaluation methods used in the original EBP.

When discussing evaluation, think about these questions:
What is important to know?
What do you need to know versus what is nice to know?
What will be measured, and how?
How will this information be used?

PLANNING FOR EVALUATION

Page 22: Evelyn Gonzalez
Page 23: Evelyn Gonzalez

Indicators or measures are the observable and measurable data that are used to track a program’s progress in achieving its goals.

Monitoring (program or outcome monitoring, for example) refers to ongoing measurement activity.

SOME DEFINITIONS…

Page 24: Evelyn Gonzalez

Process evaluation can find problems early on in the program. It includes an assessment of the staff, budget review, and how well the program is doing overall.

For this kind of evaluation, it may be useful to keep a log sheet to record each of your activities.

From Windsor et al., 1994

PROCESS EVALUATION
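As one way to keep such a log sheet, here is a minimal sketch (hypothetical columns and file name) that appends each program activity to a CSV file:

import csv
from datetime import date

# Hypothetical log-sheet columns for process evaluation.
FIELDS = ["date", "activity", "staff", "attendance", "notes"]

def log_activity(path, activity, staff, attendance, notes=""):
    """Append one program activity to the process-evaluation log."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # new/empty file: write the header row first
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "activity": activity,
            "staff": staff,
            "attendance": attendance,
            "notes": notes,
        })

log_activity("activity_log.csv", "Screening education session", "E. Gonzalez", 23)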

Page 25: Evelyn Gonzalez

Impact evaluation can tell if the program has a short-term effect on the behavior, knowledge, and attitudes of your population.

It also measures the extent to which you have met your objectives.

From Green and Kreuter, 1991

IMPACT EVALUATION

Page 26: Evelyn Gonzalez

Outcome evaluation looks to see if the long-term program goals were met.

These goals could be changes in rates of illness or death, as well as in the health status of your population.

From McKenzie & Smeltzer, 1997

OUTCOME EVALUATION

Page 27: Evelyn Gonzalez

Identify program goals
For each goal: identify process objectives and outcome objectives
For each objective: identify indicators and a data source
Plan data collection
Plan data analysis

APPLICATION TO YOUR PROGRAM:
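One way to keep these pieces linked is to record them together. A minimal sketch (hypothetical content throughout) of the goal-to-objective-to-indicator chain as plain Python data:

# Hypothetical evaluation-plan entry linking each level of the chain.
evaluation_plan = {
    "goal": "Increase appropriate utilization of breast cancer screening",
    "objectives": [
        {
            "type": "outcome",
            "statement": "Participants identify 3+ major risk factors on the post-test",
            "indicator": "% of participants scoring 3 or more",
            "data_source": "post-test forms",
            "collection": "end of each session",
            "analysis": "tally scores; compare against the target",
        },
        {
            "type": "process",
            "statement": "Deliver 12 education sessions in year one",
            "indicator": "number of sessions held",
            "data_source": "activity log sheet",
            "collection": "ongoing",
            "analysis": "count entries per quarter",
        },
    ],
}

for obj in evaluation_plan["objectives"]:
    print(f"{obj['type']:>7}: {obj['statement']} -> {obj['indicator']}")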

Page 28: Evelyn Gonzalez

DATA COLLECTION METHODS

Surveys
Interviews
Focus groups
Observation
Document review

Page 29: Evelyn Gonzalez

You may develop a way to compare the baseline data from the needs assessment with the final outcome of your program.

For example, a pre/post survey in an education session.

This will let you see if you have achieved your objectives.

PRE- AND POST-EVALUATION
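Here is a minimal sketch of comparing matched pre- and post-session survey scores (hypothetical scores; assumes SciPy is available for the paired t-test):

from scipy.stats import ttest_rel

# Hypothetical matched knowledge scores (same participants, pre and post).
pre_scores  = [3, 4, 2, 5, 3, 4, 2, 3]
post_scores = [5, 6, 4, 7, 4, 6, 3, 5]

# Average gain per participant, plus a paired t-test on the matched scores.
gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
t_stat, p_value = ttest_rel(post_scores, pre_scores)

print(f"Mean gain: {gain:.1f} points (t={t_stat:.2f}, p={p_value:.3f})")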

Page 30: Evelyn Gonzalez

Primary sources:
Quantitative: surveys/questionnaires
Qualitative: focus groups, public meetings, direct observation
Qualitative: in-depth interviews with community leaders, interviews with other program planners

INFORMATION COLLECTION

Page 31: Evelyn Gonzalez

Strategies will depend on which EBP/intervention is selected.

Answer these questions:
What specific behaviors do I want my audience to acquire or enhance?

What information or skills do they need to learn to act in a new way?

What resources do I need to carry out the program?

What methods would best help me meet my objectives?

STRATEGIES

Page 32: Evelyn Gonzalez

USING MIXED DATA SOURCES/METHODS

Involves using more than one data source and/or data collection method.

Page 33: Evelyn Gonzalez

Your objectives should be measurable so that they can be evaluated.

The evaluation should be in line with your objectives.

Try not to make up new things to evaluate.

PROGRAM OBJECTIVES AND EVALUATION

Page 34: Evelyn Gonzalez
Page 35: Evelyn Gonzalez

You may want to do a pilot test in order to evaluate the effect of your program.

A pilot test is a practice run with a small group similar to your target audience.

PILOT TESTING

Page 36: Evelyn Gonzalez

Evidence-based programs have already done some type of evaluation.

Look to see how the program was evaluated before. Try to use the same methods.

You do not have to evaluate everything!

REPLICATING THE EVALUATION

Page 37: Evelyn Gonzalez

MONITORING PROGRESS

Page 38: Evelyn Gonzalez

NOW THAT YOU’VE COLLECTED THE DATA, WHAT DO YOU DO WITH IT?

Analyzing the data: who, when, and how

Interpretation of results and sharing findings

Page 39: Evelyn Gonzalez

Must be able to answer this!

Do not just look for the good outcomes

Learn from what didn’t work

Share both the positive and negative outcomes

SO WHAT?

Page 40: Evelyn Gonzalez

DEVELOPING RECOMMENDATIONS

Your evaluation’s recommendations should be:

Linked with the original goals/SMART objectives.

Based on answers to your evaluation questions.

Informed by stakeholder input.

Tailored to the end users of the evaluation results to increase ownership and motivation to act.

Page 41: Evelyn Gonzalez

SHARING RECOMMENDATIONS

Community:
Executive summary
Final report
Newsletter article(s)
Website article
Town hall meeting(s)
Radio interviews
Local newspapers

Institution & yourself:
Executive summary
Final report
Journal articles
Professional conferences
Poster sessions
Meetings with colleagues

Page 42: Evelyn Gonzalez

TIPS & CONSIDERATIONS

Consult with partners who have evaluation experience

Budget 10-15% for evaluation: staffing, building a database, analysis

Consider pilot testing your program
Pilot test your evaluation method & tool(s)

Page 43: Evelyn Gonzalez

"Trust yourself. You know more than you think you do!"

Benjamin Spock

Page 44: Evelyn Gonzalez