Post on 28-Dec-2015
Looking for Results: Principles for Evaluating
Student Success Initiatives
Presenter: Rick Voorhees
Goals for Today’s Work Together
• You’ll be able to:
– Identify key questions for evaluating interventions
– Distinguish between different types of evaluation
– Make the link between evaluation and Continuous Quality Improvement
– Identify four components of evaluation
– Visualize how a logic model can be a powerful tool for understanding interventions
The BIG Question
What types of learners change in what evident ways with which influences and resources?
Chances Are . . .
• Everything else equal, what are the chances (probability) that:
– Males and females progress and graduate at the same rate?
– Racial and ethnic groups progress and graduate at the same rate?
– Financial aid recipients and non-recipients progress and graduate at the same rate?
– Students referred to developmental education and those that aren’t progress and graduate at the same rate?
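One way to put numbers on these "chances" is a two-proportion z-test on graduation rates. The sketch below uses hypothetical counts for financial aid recipients and non-recipients; the figures and the helper function are illustrative, not drawn from the presentation.

```python
# Sketch: testing whether two student groups graduate at the same rate,
# using a two-proportion z-test. All counts below are hypothetical.
from math import sqrt, erf

def two_prop_z(success_a, n_a, success_b, n_b):
    """Return z statistic and two-sided p-value for H0: equal rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal approximation via the error function.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 180 of 400 aid recipients graduate vs 150 of 400 non-recipients.
z, p = two_prop_z(180, 400, 150, 400)
print(round(z, 2), round(p, 4))
```

A small p-value suggests the gap is unlikely to be chance alone; the same test applies to any of the group comparisons listed above.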
Formative Program Evaluation
Formative evaluation (sometimes referred to as internal evaluation) is a method for judging the worth of a program while the program activities are forming (in progress). This part of the evaluation focuses on the process. It permits faculty, staff, and students to monitor how well goals are being met. The main purpose is to catch deficiencies so that the program can readjust while it is in progress.
Source: Performance, Learning, Leadership, & Knowledge (n.d.) Retrieved March 25, 2013 at http://www.nwlink.com/~donclark/hrd/isd/types_of_evaluations.html
Summative Program Evaluation
Summative evaluation (sometimes referred to as external evaluation) is a method of judging the worth of a program at the end of the program activities (summation). The focus is on the outcome.
Note: All evaluations can be summative (i.e., have the potential to serve a summative function), but only some have the additional
capability of serving formative functions (Scriven, 1967).
Source: Performance, Learning, Leadership, & Knowledge (n.d.) Retrieved March 25, 2013 at http://www.nwlink.com/~donclark/hrd/isd/types_of_evaluations.html
The Basics of Continuous Quality Improvement
Graphic Source: www.anzca.edu.au/fpm/resources/educational-documents/guidelines-on-continuous-quality-improvement.html
(Graphic: the PDCA cycle: Plan → Do → Check → Act)
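The cycle above can be sketched as an explicit loop: plan an intervention, implement it, check results against a goal, and act (adjust) before the next pass. All four step functions and the 0.70 persistence target below are hypothetical placeholders, not part of the presentation.

```python
# Sketch of Continuous Quality Improvement as a Plan-Do-Check-Act loop.
def run_pdca(plan, do, check, act, target, max_cycles=5):
    state = plan()                    # Plan: design the intervention
    for _ in range(max_cycles):
        result = do(state)            # Do: implement it
        if check(result) >= target:   # Check: did we meet the goal?
            return result
        state = act(state, result)    # Act: adjust, then iterate
    return result

# Toy example: each adjustment raises a persistence rate by 5 points.
result = run_pdca(
    plan=lambda: {"boost": 0},
    do=lambda s: 0.55 + 0.05 * s["boost"],
    check=lambda r: r,
    act=lambda s, r: {"boost": s["boost"] + 1},
    target=0.70,
)
print(result)  # loop stops once the target persistence rate is reached
```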
Four Components of a Culture of Inquiry
Component One: “What’s Wrong”
Use disaggregated longitudinal cohort data to determine:
1) Which student groups are less successful than others (i.e., identify gaps in student success)
2) Which high-enrollment courses have the lowest success rates
Component Two: “Why”
Collect, analyze, and use data from other sources (focus groups, surveys, literature reviews) to identify the underlying factors (barriers or challenges) impeding student success.
Component Three: “Intervention”
Use data from Component Two to design new interventions, or revise current ones, to effectively address the underlying factors impeding student success. Review and consider changes to existing practices and policies that impact those factors.
Component Four: “Evaluation and Modification”
Collect, analyze, and use evaluation data to answer:
1) To what extent did the intervention (including policy changes) effectively address the underlying factors?
2) To what extent did the interventions increase student success?
Source: K.P. Gonzalez. Using Data to Increase Student Success: A Focus on Diagnosis. Retrieved March 12, 2013 at http://www.achievingthedream.org/sites/default/files/resources/ATD_Focus_Diagnosis.pdf
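Component One’s disaggregation step can be illustrated with a short script that computes success rates per student group. The cohort records, group labels, and completion flags below are invented for the example.

```python
# Sketch of Component One: disaggregating a longitudinal cohort to
# surface gaps in success rates. All records here are hypothetical.
from collections import defaultdict

cohort = [
    {"group": "Pell", "completed": True},
    {"group": "Pell", "completed": False},
    {"group": "Pell", "completed": False},
    {"group": "Non-Pell", "completed": True},
    {"group": "Non-Pell", "completed": True},
    {"group": "Non-Pell", "completed": False},
]

totals = defaultdict(lambda: [0, 0])  # group -> [completers, students]
for student in cohort:
    totals[student["group"]][1] += 1
    if student["completed"]:
        totals[student["group"]][0] += 1

rates = {g: done / n for g, (done, n) in totals.items()}
print(rates)  # Pell ≈ 0.33 vs Non-Pell ≈ 0.67: a gap worth diagnosing
```

The same grouping logic extends to any dimension the slide names (gender, race/ethnicity, aid status, developmental referral) and to course-level success rates.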
True Experimental Design: Pretest-Posttest Control Group Design
(Diagram: students are randomly assigned to Group 1, which receives the intervention and the post-test measurement, or to Group 2, which receives only the post-test measurement; comparing the groups yields true differences.)
Source: Campbell, D.T. & Stanley, J. (1963). Experimental and Quasi-Experimental Designs for Research. Wadsworth Publishing.
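Under stated assumptions (simulated scores with a fixed average treatment effect), the design above reduces to random assignment followed by a comparison of post-test means. Nothing here is real student data; the score model is a placeholder.

```python
# Sketch of a randomized design: randomly assign students, then
# compare mean post-test scores. Scores are simulated, not real.
import random

random.seed(1)
students = list(range(100))
random.shuffle(students)
treatment, control = students[:50], students[50:]  # random assignment

def post_test(student, treated):
    # Simulated score: treated students get a small average boost.
    return 70 + (5 if treated else 0) + random.gauss(0, 10)

mean_t = sum(post_test(s, True) for s in treatment) / len(treatment)
mean_c = sum(post_test(s, False) for s in control) / len(control)
print(mean_t - mean_c)  # estimate of the "true difference"
```

Because assignment is random, pre-existing differences between the groups average out, which is exactly what licenses the causal reading of the gap in means.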
Quasi-Experimental Design: Posttest Only Control Group Design
(Diagram: without random assignment, Group 1 receives the intervention and the post-test measurement, while Group 2, which could be a “historical cohort,” receives only the post-test measurement; comparing the groups estimates true differences.)
Source: Campbell, D.T. & Stanley, J. (1963). Experimental and Quasi-Experimental Designs for Research. Wadsworth Publishing.
Percentage of Students Persisting By Enrollment in a
Student Success Class
Source: Voorhees & Lee. (n.d.) Basics of Longitudinal Cohort Analysis. Retrieved April 15, 2012 at http://achievingthedream.org/sites/default/files/resources/ATD_Longitudinal-Cohort-Analysis.pdf
(Graphic: SSBTN template showing a macro-level cohort divided into micro-level cohorts, each tracked down to the intervention level.)
Causal or Correlated?
Attribution of cause and effect is difficult when working with a highly complex institution that has multiple programs and initiatives, as well as students from a wide range of backgrounds and current environmental influences.
Sources of Evaluative Data
• Administrative data systems
• Focus groups
• Faculty journaling
• Student journaling
• External surveys
• Institutionally-developed surveys
• Interactions with college services
• Matching external databases
Academic Terms, Administrative Data Systems, and What We Know About Students
(Graphic: a timeline of the academic term running from the census date to the end of term. Most of what we do know about students happens at those two data-collection points; most of what we don’t know about students happens in between, where we could do so very much better.)
Logic Modeling: Informing Planning and Evaluation
(Graphic: Resources and Inputs → Activities → Outputs → Outcomes → Impact. Resources, inputs, and activities are the planned work; outputs, outcomes, and impact are the intended results. Assumptions underlie the entire model.)
Logic Model Elements
• Assumptions: the underlying assumptions that influence the program’s design, implementation, or goals.
• Resources: human, financial, and organizational resources needed to achieve the program’s objectives.
• Activities: things the program does with the resources to meet its objectives.
• Outputs: direct products of the program’s activities; evidence that the program was actually implemented.
• Outcomes: changes in participants’ knowledge, behavior, skills, status, and level of functioning as a result of the program.
• Impact: systemic, long-term change as a result of the program (as long as seven years).
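To make the six elements concrete, a program plan can be captured in a simple data structure and reviewed element by element. The entries below are invented for illustration, not drawn from any real program.

```python
# A minimal sketch: the six logic-model elements as a plain dictionary,
# printed in causal order for review. All entries are illustrative.
logic_model = {
    "assumptions": ["Students referred to advising will attend"],
    "resources": ["2 advisors", "early-alert software"],
    "activities": ["Mandatory advising for first-term students"],
    "outputs": ["Number of advising sessions held"],
    "outcomes": ["Higher term-to-term persistence"],
    "impact": ["Closing graduation-rate gaps over the long term"],
}

ORDER = ["assumptions", "resources", "activities",
         "outputs", "outcomes", "impact"]
for element in ORDER:
    print(f"{element}: {logic_model[element]}")
```

Keeping the plan in one structure makes it easy to check that every activity traces forward to an output and an outcome, which is the discipline the worksheet at the end of this deck encourages.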
Why Use a Logic Model?
1. Program Design and Planning: serves as a planning tool to develop program strategy and enhance the ability to clearly explain and illustrate program concepts and approach to all college stakeholders
2. Program Implementation: forms the core for a focused management plan that helps identify and collect the data needed to monitor and improve programming
3. Program Evaluation and Strategic Reporting: presents program information and progress toward goals in ways that inform, advocate for a particular program approach, and teach program stakeholders
INPUTS → OUTPUTS → OUTCOMES
• Inputs (what we invest): program investments
• Outputs: activities (what we do) and participation (who we reach)
• Outcomes (what results): short-, medium-, and long-term

Logic Models Aren’t Automatically Linear
(Graphic: the same elements arranged non-linearly: inputs (your beginnings), activities and outputs (your work), and outcomes and impact (your results), linked by assumptions.)
Logic Model Worksheet