
Susan G. Komen Continuum of Care Evaluation Course Presentation

Janet A. Phoenix, MD, MPH, Senior Consultant

October 13 and 20, 2011

The Grant Group LLC
342 Orchard St. NW, Suite 201, Vienna, VA 22180-4141
Corporate HQ Tel: 301.325.8850 ● www.thegrantgroup-llc.com

Evaluation Overview

Definition:

Program evaluation is carefully collecting information about a program or some aspect of a program in order to make necessary decisions about the program.

Examples

• Needs Assessment
• Cost Benefit Analysis
• Formative
• Summative

Evaluation Goals

Understand and increase the impact of products or services

Improve delivery mechanisms to be more efficient and less costly

Verify that the program is achieving intended goals and objectives


Essential Elements

• Funding
• Consumer Response
• Unmet need
• Impact on design of future programs/activities
• Capacity


Considerations

• Purposes
• Audience
• Information needed for decision-making
• Information needed to better understand the program/process
• Sources
• Data collection methods
• Timeframe
• Resources for evaluation

Audiences

Funders

Board of Directors

Management

Program Staff

Constituents


Sources of Information

• Surveys
• Interviews
• Case Studies
• Focus Groups
• Existing Materials

Data Collection Methods

Questionnaires/Surveys
Pros: Quick and nonthreatening; may be able to use existing data; easy to compare
Cons: Not open ended; lacks complete information

Interviews
Pros: Gives comprehensive views
Cons: Costly; hard to administer; interviewer bias

Document Review
Pros: Doesn't interrupt the program
Cons: Can take a lot of time; inflexible


Method Selection Criteria

1. What information is needed to make current decisions about a product or program?

2. Of this information, how much can be collected and analyzed in a low-cost and practical manner, e.g., using questionnaires, surveys and checklists?


Method Selection Criteria

3. How accurate will the information be?

4. Will the methods get all of the needed information?

5. What additional methods should and could be used if additional information is needed?

Method Selection Criteria

6. Will the information appear as credible to decision makers, e.g., to funders or top management?

7. Will the nature of the audience conform to the methods, e.g., will they fill out questionnaires carefully, engage in interviews or focus groups, let you examine their documentation, etc.?

8. Who can administer the methods now or is training required?

9. How can the information be analyzed?

Process Based Evaluation

1. How do employees and/or the customers decide that products or services are needed?

2. What is needed from employees in order to deliver the product or services?

3. How are employees trained about how to deliver the product or services?

4. How do customers or clients come into the program?


Process Based Evaluation

5. What is required of customers or clients?

6. How do employees select which products or services will be provided to the customer or client?


Process Based Evaluation

7. What is the general process that customers or clients go through with the product or program?

8. What do customers or clients consider to be strengths of the program?

9. What do staff consider to be strengths of the product or program?

Process Based Evaluation

10. What typical complaints are heard from employees and/or customers?

11. What do employees and/or customers recommend to improve the product or program?

12. On what basis do employees and/or customers decide that the product or services are no longer needed?


Australian Breast Cancer National Accreditation Standards

Program AIMS

1. To ensure that the Program is implemented in such a way that significant reductions can be achieved in morbidity and mortality attributable to breast cancer.

2. To maximise the early detection of breast cancer in the target population.

3. To ensure that screening for breast cancer in Australia is provided in dedicated and accredited Screening and Assessment Services as part of the BreastScreen Australia Program.


Australian Breast Cancer National Accreditation Standards

Program AIMS

4. To ensure equitable access for women aged 50−69 years to the Program.

5. To ensure that services are acceptable and appropriate to the needs of the eligible population.

6. To achieve high standards of program management, service delivery, monitoring and evaluation, and accountability.


Australian Breast Cancer National Accreditation Standards

Program OBJECTIVES

1. To achieve, after five years, a 70 per cent participation rate in the BreastScreen Australia Program by women in the target group (women aged 50−69 years) and access to the Program for women aged 40−49 years and 70−79 years.

2. To rescreen all women in the Program at two-yearly intervals.


Australian Breast Cancer National Accreditation Standards

Program OBJECTIVES

3. To achieve agreed performance outcomes which minimise recall rates, retake images, invasive procedures, ‘false negatives’, and ‘false positives’, and maximise the number of cancers detected, particularly the number of small cancers.

4. To refer to appropriate treatment services and collect information about the outcome of treatment.


Ways that the Breast Screening Centers Differ

• Structure of the Administration
• Quality Assurance Procedures
• Protocols for Women with Family History
• Use of Clinical Staff
• Data Collection Procedures
• Forms
• Manner of Issuing Results


Challenges Addressing the Needs of Symptomatic Women

(1) BreastScreen is not resourced to investigate women with symptoms, all of whom require clinical breast examination (and often ultrasound and biopsy) in addition to mammography.

(2) Women and their general practitioners may be falsely reassured by a negative result and not pursue the essential further investigation.


Challenges Addressing the Needs of Symptomatic Women

(3) Cancers in symptomatic women may be incorrectly reported as ‘screen detected’, artificially increasing the reported cancer detection rate of BreastScreen.

(4) ‘Screen-detected’ symptomatic cancers inflate the BreastScreen average tumour size, as these cancers are likely to be larger than screen-detected asymptomatic cancers, making it more difficult to achieve accreditation standards requiring a high percentage of cancers to be small.


What Group of Women Should Not Be Receiving BreastScreen Services?

• High Risk Women due to Family History
• High Risk Women with Genetic Mutations
• Women with a History of Breast Disease
• Women with Previous Breast Cancer (due to need for annual mammography and clinical breast examination)


Colorectal Cancer Screening Guidelines

• Colonoscopy: every 10 years
• Flexible Sigmoidoscopy: every 5 years
• Fecal immunochemical test: every year
• Guaiac fecal occult blood test: every year


DIFFERENCES BETWEEN TRADITIONAL COLONOSCOPY AND CT COLONOSCOPY

• No need for sedation for CT colonoscopy
• Both require bowel preparation
• Faster to perform than traditional colonoscopy
• Lower in cost than traditional colonoscopy
• Radiation risks in CT colonoscopy


Outcome Based Evaluation: Why Perform One?

•To determine whether your program is making a difference for clients

•To supplement a process based evaluation

•To examine and quantify the impact and/or benefits for clients

•To relate investment in the program to benefits for participants


Outcome Based Evaluation Definition and Components

Definition:

Outcome based evaluation examines the effects of programs and services on clients by describing their short term, intermediate and long term impacts.

Components (a minimal code sketch follows the list below):

• Inputs
• Activities
• Outputs
• Outcomes
• Outcome Targets
• Outcome Indicators
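These components fit together as a simple logic model. As an illustration only (not part of the Komen or course materials), the following minimal Python sketch shows one hypothetical way a program might record them; all field names and example values are invented.

# Hypothetical sketch of an outcome-based evaluation logic model.
# All field names and example values are illustrative, not taken from this course.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list = field(default_factory=list)        # e.g., staff, equipment, donations
    activities: list = field(default_factory=list)    # e.g., screening, education, referrals
    outputs: dict = field(default_factory=dict)       # counts of services delivered
    outcomes: dict = field(default_factory=dict)      # counts of client benefits or changes
    outcome_targets: dict = field(default_factory=dict)  # goal percentages for each outcome

# Example: a hypothetical breast-screening program
program = LogicModel(
    inputs=["staff", "volunteers", "mobile mammography unit"],
    activities=["screening", "education", "making referrals"],
    outputs={"clients_screened": 500, "clients_received_CBE": 450},
    outcomes={"referred_to_oncologist": 40, "began_treatment": 25},
    outcome_targets={"eligible_women_screened_pct": 80.0},
)
print(program.outputs["clients_screened"])  # 500

Keeping outputs (service counts) separate from outcomes (client changes) mirrors the distinction drawn on the following slides.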


Outcome Based Evaluation Components

Inputs:

Materials and resources that the organization uses to carry out the program:

• Equipment
• Staff
• Volunteers
• Facilities
• Donations


Outcome Based Evaluation Components

Activities:

Processes that the program uses to meet clients' needs and achieve the program objectives:

• Screening
• Education
• Counseling
• Making Referrals
• Collecting data


Outcome Based Evaluation Components

Outputs:

Quantifiable units of service that are provided by your program:

• Number of clients screened

• Number of clients who received CBE

• Number of patients referred to an oncologist


Outcome Based Evaluation Components

Outcomes:

The actual benefits or changes that program participants experience during or afterwards:

• Number of patients who have lesions identified
• Number of patients who are referred to an oncologist
• Number of patients who begin treatment as a result of the program
• Number of patients whose breast cancer is identified at an early stage


Outcome Based Evaluation Components

Outcome Targets:

The number and/or percentage of participants that you have set as a goal to receive the outcome:

• 80% of eligible women will receive a mammogram
• 60% of women who have cancer identified will be at stage one
• 90% of women will receive clinical breast examination during routine visits


Outcome Based Evaluation Components

Outcome Indicators:

Observable and measurable milestones that mark progress along the path to the outcome (a worked numerical sketch follows the examples below):

•Number or percentage of participants who receive a referral to an oncologist

•Number or percentage of participants who receive diagnostic imaging services

•Number of patients who return for a post treatment follow up visit
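To make the arithmetic behind indicators and targets concrete, here is a small illustrative Python sketch in the same spirit as the logic-model example above. The counts and the 80% target are hypothetical, not data from this presentation.

# Hypothetical sketch: computing outcome indicators and comparing them to targets.
# All counts and the target value are illustrative.

def indicator_rate(numerator, denominator):
    """Return an indicator as a percentage of participants."""
    return 0.0 if denominator == 0 else 100.0 * numerator / denominator

participants = 500                 # women served by the hypothetical program
received_mammogram = 410           # output: quantifiable unit of service delivered
referred_to_oncologist = 38        # numerator for an outcome indicator
target_mammogram_pct = 80.0        # outcome target: 80% of eligible women screened

mammogram_pct = indicator_rate(received_mammogram, participants)      # 82.0
referral_pct = indicator_rate(referred_to_oncologist, participants)   # 7.6

status = "met" if mammogram_pct >= target_mammogram_pct else "not met"
print(f"Mammogram rate: {mammogram_pct:.1f}% (target {target_mammogram_pct:.0f}%, {status})")
print(f"Oncology referral rate: {referral_pct:.1f}%")

The same pattern extends to any of the indicator examples above: choose a numerator, a denominator, and a target, then report the gap.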


Susan G. Komen Evaluation Guidelines

Evaluation (limit: 3,500 characters)

Describe in detail how the organization(s) will measure the achievement of project goals/objectives and how the impact of the project on the selected priority will be assessed.

Describe the evaluation expertise that will be available for this purpose.

What resources are allocated for evaluation in the project budget?


Susan G. Komen Evaluation Approach and Components

Approach: Measure both the quality and quantity of strategy implementation and outcomes:

Components:

Impact Evaluation: Assesses the changes that can be attributed to a particular intervention, such as a project, program or policy. Impact evaluation helps us to answer key questions such as: what works, what doesn't, where, why, and for how much?

Process Evaluation: Assesses the delivery of programs. Process evaluation verifies what the program is and whether it is being implemented as designed. It answers the questions: what is delivered in reality, and where are the gaps between program design and delivery?


TIPS FOR COLLABORATION AND TEAM BUILDING

Begin With Co-Hosting

• Manage a stakeholder process with sponsorship from more than one organization


Create a Game Plan and Group Covenants

•Discuss how much time people can devote to the group process

•Communicate about alternates, attendance and logistics


Concentrate on Relationships First

•Let people get to know each other as individuals

•Share a meal

• Have people explain the impact of decisions or agreements on them and their community or organization


Be Transparent About Decision Making

•Clarify who will make decisions

• Determine how representation will be established

• Ask stakeholders to identify when they are speaking officially or unofficially


Pay Attention to Power

•Establish who has the authority for the group to control what information gets considered

• Establish who has the authority for the group to decide what's not important


Create Rituals

• Give members a sense of identity through small habits

• E.g., songs, homemade foods, celebrating birthdays


Talk About Values

• Talk about the values participants bring to the table

• Do this before talking about problems, data or potential solutions


Acknowledge Different Kinds of Knowledge

• Acknowledge and support differences in the group in communication and ways of knowing


Generate Multiple Problem Definitions

•All definitions of the problem need to be on the table

• No definition is wrong because they all reveal issues and hopes


Explore Validity and Accuracy With Care

• All information should be questioned as to its validity, accuracy, authenticity and reliability

• This is true regardless of the source, whether scientific, technical, traditional or cultural