Process Evaluation: Considerations and
Strategies
CHSC 433
Module 4 / Chapter 8
L. Michele Issel
UIC School of Public Health
Making Widgets
Count not only the widgets, but also who gets the widgets and what goes into making the widgets.
Definitions
Systematic examination of coverage and delivery
Measuring inputs into the program
Finding out whether the program has all its parts, and whether the parts are functional and operational
Program Components
Discrete interventions, or groups of interventions, of the overall program that are designed to independently or synergistically affect recipients.
Objectives per component
Types of Implementation Evaluations
Effort: quantity of input
Monitoring: use of MIS information
Process: internal dynamics, strengths and weaknesses
Component: assess distinct program parts
Treatment: what was supposed to have an effect
Purpose of Process Evaluation
Assurance and accountability
Understanding of outcomes
Mid-course corrections
Program replicability
From Perspective of:
Evaluator
Funders (Accountability)
Program management
Stakeholders and Expectations
Focus on the explicit:
• Objectives
• Program descriptions
Uncover the implicit:
• Review program theory
• Review objectives
• Role-play possible outcomes
Program Theory Components
Program theory divides into impact theory and process theory; process theory comprises the organizational plan and the service utilization plan.
Organizational Plan
How to garner, configure, and deploy resources and organize program activities so that the intended service is developed and maintained.
Service Utilization Plan
How the intended target population
receives the intended amount of
the intended intervention through
interaction with the program’s
service delivery system
Rigor in Process Evaluation
Appropriateness of the method
Sampling strategy
Validity and Reliability
Timing of data collection
Key Decision 1
How much effort to expend: that is, what data are needed to accurately describe the program.
Choose based on:
• Expected across-site variation
• Making the report credible
• Literature about the intervention
Key Decision 2
What to look for: that is, which program features are most critical and valuable to describe.
Choose based on:
• What is most often cited in the program proposal
• The budget
• What may be related to program failure
Go Back to Objectives
Process objectives per program component:
• How much
• Of what
• Will be done
• By whom
• By when
Components and Objectives
[Diagram: A goal is served by Outcome Objectives 1, 2, and 3, leading to Outcomes I and II; Components A, B, and C each carry their own process objectives.]
Possible Foci of Process Evaluation
Place: site, program
People:
• Practitioner/provider
• Recipient/participant
Processes:
• Activities
• Structure
Policy
Levels of Analysis
Individuals:
• Program participants
• Program providers
Programs:
• As a whole
Geographic locations:
• Regions and states
Types of Questions
What was done, and by whom?
How well was it done, and how much was done?
What contributed to success or failure?
How much of which resources were used?
Is there program drift?
Sources of Program Variability
Staff preferences and interests
Materials availability and appropriateness
Participants' expectations, receptivity, etc.
Site physical environment and organizational support
Roots of Program Failure
[Diagram: Three paths from program interventions. Successful program: the interventions set the causal process into motion, which led to the desired effect. Theory failure: the interventions set the causal process into motion, which did not lead to the desired effect. Process failure: the interventions did not set into motion the causal process that would have led to the desired effect.]
Causes of Program Failure
Non-program:
• No participants
• No program delivered
Wrong intervention:
• Not appropriate for the problem
Unstandardized intervention:
• Across-site and within-program variations
Program Failure (cont.)
Mismanagement of program operations
Wrong recipients
Barriers to the program
Program components unevenly delivered or monitored
Data Sources from Program
Resources used
Participant-provided data:
• Quality
• Match with the process evaluation
Existing records:
• Sampling of records
• Validity and reliability issues
Data Sources from Evaluator
Surveys and interviews of participants
Observation of interactions
Surveys and interviews of staff
Evaluating Structural Inputs
Organizational structure:
• Supervision of staff
• Place in the organizational hierarchy
Facilities and equipment
Human resources:
• Leadership
• Training
Measures of Delivery
Measures of program delivery
Measures of coverage
Measures of effectiveness
Measures of Implementation
Measures of Volume (Outputs):
• Number of services provided
Measures of Workflow:
• Client time
• Staff work time
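Volume and workflow figures like these are typically tallied from service logs. Below is a minimal sketch in Python, assuming hypothetical record fields (client_minutes, staff_minutes) rather than any actual program MIS:

```python
from dataclasses import dataclass

@dataclass
class ServiceRecord:
    client_id: str
    client_minutes: float  # time the client spent receiving the service
    staff_minutes: float   # staff work time spent delivering it

def volume(records: list[ServiceRecord]) -> int:
    """Measure of volume (output): number of services provided."""
    return len(records)

def workflow(records: list[ServiceRecord]) -> dict[str, float]:
    """Measures of workflow: total client time and total staff work time."""
    return {
        "client_minutes": sum(r.client_minutes for r in records),
        "staff_minutes": sum(r.staff_minutes for r in records),
    }

log = [ServiceRecord("c1", 45, 60), ServiceRecord("c2", 30, 40)]
print(volume(log))    # 2 services provided
print(workflow(log))  # {'client_minutes': 75, 'staff_minutes': 100}
```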
Targets, Recipients, and Coverage
[Diagram: The target audience divides into recipients and non-recipients, and each group includes people in need and people not in need.]
Measures of Coverage
Undercoverage = # recipients in need / # in need
Overcoverage = # recipients not in need / # recipients
Coverage efficiency = (undercoverage - overcoverage) x 100
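These ratios are simple arithmetic. A minimal sketch in Python follows; the counts are hypothetical and the function names are mine, not from the course materials:

```python
def undercoverage(recipients_in_need: int, total_in_need: int) -> float:
    """# recipients in need / # in need in the target population."""
    return recipients_in_need / total_in_need

def overcoverage(recipients_not_in_need: int, total_recipients: int) -> float:
    """# recipients not in need / # recipients."""
    return recipients_not_in_need / total_recipients

def coverage_efficiency(under: float, over: float) -> float:
    """(undercoverage - overcoverage) x 100."""
    return (under - over) * 100

# Example: 80 of the 200 people in need were reached, and 20 of the
# program's 100 recipients turned out not to be in need.
under = undercoverage(80, 200)           # 0.40
over = overcoverage(20, 100)             # 0.20
print(coverage_efficiency(under, over))  # 20.0
```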
Measures of Effectiveness
Effectiveness Index = % reached per program standard, per program component
Program Effectiveness Index = sum of effectiveness indexes / # of program components
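A minimal sketch of both indexes, with hypothetical component names and reach figures:

```python
def effectiveness_index(reached: int, standard: int) -> float:
    """% reached relative to the program standard for one component."""
    return reached / standard * 100

def program_effectiveness_index(indexes: list[float]) -> float:
    """Sum of the component effectiveness indexes / # of components."""
    return sum(indexes) / len(indexes)

# Hypothetical program with three components, each with a standard of 200.
components = {
    "screening": effectiveness_index(reached=180, standard=200),  # 90.0
    "education": effectiveness_index(reached=120, standard=200),  # 60.0
    "referral":  effectiveness_index(reached=150, standard=200),  # 75.0
}
print(program_effectiveness_index(list(components.values())))     # 75.0
```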
Bias in Participation
Due to self-selection
Results in under- or overcoverage
May be related to recruitment
Can be identified with good data collection (monitoring)
Measures of Efficiency
Ratio of inputs to outputs
Productivity per staff member, per cost, per hour
Cost per participant, per intervention, etc.
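These are all simple input/output ratios. A minimal sketch with entirely hypothetical figures:

```python
total_cost = 50_000.0     # dollars spent (input)
staff_hours = 1_000.0     # staff work time (input)
services_delivered = 400  # outputs
participants = 250

cost_per_service = total_cost / services_delivered    # 125.0 per service
services_per_hour = services_delivered / staff_hours  # 0.4 per staff hour
cost_per_participant = total_cost / participants      # 200.0 per participant

print(cost_per_service, services_per_hour, cost_per_participant)
```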
Evaluating Costs
Payments by the agency
Payments by secondary funders
Payments by participants
Payments versus charges!
Monitoring and CQI
Similar types of data presentation:
• Control charts
• Fishbone diagrams
• Flow charts
• Gantt charts
• etc.
Overlapping purposes
Reaching Conclusions
Compare data to objectives
Compare data to needs assessment data
Compare data to other sites or other programs
Worksheet Exercise
For each program objective:
• What is the focus and level of the process evaluation?
• What data sources are needed?
• Who collects the data?
References
Rossi, P. H., Freeman, H. E., & Lipsey, M. W. (1999). Evaluation: A systematic approach. Sage Publications.
Patton, M. Q. (1997). Utilization-focused evaluation. Sage Publications.
King, J. A., Morris, L. L., & Fitz-Gibbon, C. T. (1987). How to assess program implementation. Sage Publications.
Weiss, C. H. (1972). Evaluation research. Prentice Hall.