System Level Evaluation: Getting Oriented and Getting Started. PBS in Kalamazoo as an Example. Agency Wide PBS Conference, Kalamazoo RESA, October 19th, 2010.



Transcript


Slide 1: System Level Evaluation: Getting Oriented and Getting Started
PBS in Kalamazoo as an Example
Agency Wide PBS Conference, Kalamazoo RESA, October 19th, 2010
Becca Sanders, PhD, Evaluator, Kalamazoo Wraps
[email protected]

Slide 2: Outline
Getting Oriented: System Level Evaluation
- Why invest in evaluation? A normative vs. empirical question
- How to invest in evaluation? A series of Shifts with a capital S; the Big Three
Getting Started: System Level Evaluation
- Data Collection: just one of the Big Three

Slide 3: Why Invest in Evaluation?
What's the most convincing reason for a system to invest in evaluation efforts?
- Demonstrating outcomes?
- Ensuring fidelity?
- Describing services and service recipients?
- Regulatory compliance for funders/bureaucracies?
Nope to all of the above.

Slide 4: Why Invest? The Normative Pitch
The most convincing, overriding reason for investing in evaluation: systems often fail because stakeholders, at various levels and in various roles, didn't know enough about, have, or use data that could have helped to prevent such failure.
(Cannon, M. & Edmondson, A. 2004. Failing to learn and learning to fail (intelligently): How great organizations put failure to work to improve and innovate.)

Slide 5: In other words... data can help with this!

Slide 6: Or maybe this...

Slide 7: OK, That's the Why in a Nutshell; Moving On to the How
How Now #1: Two Shifts with a capital S
- Changes that need to happen within our own field (evaluation).
- Stakeholders can help by changing their expectations and perceptions regarding the roles of evaluators.
- See y'all in 10 years: this is going to take a while.

Slide 8: Shift #1: The Model Shift
Goodbye, traditional evaluation: the evaluator as "traditional onlooker," evaluating change efforts (practices/services, procedures, policies, etc.) from the sidelines.
Hello, data culture: the evaluator "in the mix for a fix," helping us do the data stuff across multiple change efforts.
[Slide diagram contrasting the traditional model with the data-culture model.]

Slide 9: Shift #2: The Onus Shift
Evaluation as beside system problems (part of the solution) versus evaluation as within system problems (part of the problem and the solution).
- Internalize evaluation functions.
- Regard evaluation as another cog on the wheel that needs fixing.
- Grow a data-guided culture across your organization/systems.

Slide 10: How Now? The Big Three: Getting Evaluation Systems in Place
1) Data Collection: Does the data exist? (Challenges: relevance/utility, quality)
2) Data Access: Can we access it? (Challenges: MIS, power structures, trust, fear, bureaucracy, agency cultures)
3) Data Dissemination: Are we using it? (Challenges: keeping it simple, timely, relevant, usable)

Slide 11: The Big Three Roadmap: A Mix of Technical and Adaptive Driving
- Technical: change in know-how (the teachables).
- Adaptive: change in social conditions, values, and beliefs (the less-teachables).
- There's a whack of adaptive work in system level evaluation.

Slide 12: The Big 3, from Highly Technical to Highly Adaptive
Across Data Collection, Data Access, and Data Dissemination, the work spans a continuum from highly technical to highly adaptive:
- Roll-out on collection
- Coaching
- Methodology
- Help with MIS systems/database set-up
- Instrument review
- Data collection/entry
- Technical assistance
- MIS system access
- Analysis
- Data system management
- MOU establishment with partnering organizations
- Catered reporting: timing, stakeholder-driven, data splits
- Interpretation of research literature
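The "catered reporting" and "data splits" items on Slide 12 lend themselves to a quick illustration. The sketch below is a minimal, hypothetical example (the field names, sites, and counts are invented, and the slides prescribe no particular tooling) of how one service dataset can be split into different views for different stakeholders.

```python
from collections import defaultdict

# Hypothetical service records; field names and numbers are illustrative only.
records = [
    {"site": "Site A", "quarter": "Q1", "youth_served": 42},
    {"site": "Site A", "quarter": "Q2", "youth_served": 51},
    {"site": "Site B", "quarter": "Q1", "youth_served": 18},
    {"site": "Site B", "quarter": "Q2", "youth_served": 27},
]

def split_by(rows, key):
    """Group records by a stakeholder-relevant field (site, quarter, etc.)."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[key]].append(row)
    return groups

# One dataset, two catered views: leadership may want totals by site,
# while a funder may want totals by quarter.
for key in ("site", "quarter"):
    print(f"Youth served, split by {key}:")
    for group, grouped_rows in sorted(split_by(records, key).items()):
        print(f"  {group}: {sum(r['youth_served'] for r in grouped_rows)}")
```

The same pattern scales to whatever splits a given audience asks for; only the grouping key changes, not the underlying data.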
Slide 13: Getting Oriented: Relevance to PBS?
- PBS recognizes the fundamental value of data.
- PBS already regards evaluation as "in the mix for a fix."
- Data functions are fully internalized and woven into daily operations.
- Numerous folks hold data roles at many tables and many levels of the system.
- Decision making happens in accordance with data at many levels of the system.

Slide 14: Same point, this time in a picture
[Slide shows the PBS model diagram: SYSTEMS supporting staff behavior, DATA supporting decision making, PRACTICES supporting youth behavior, and OUTCOMES supporting social competence and knowledge increase.]

Slide 15: Getting Started: Narrowing the Big Three Conversation
1) Data Collection: Does the data exist?
2) Data Access: Can we access it?
3) Data Dissemination: Are we using it?
Turbo Takeaway Tips for Getting Started with Data Collection, in 15 Minutes or Less.

Slide 16: Getting Started: 3 Kinds of Data (Descriptive, Process, Outcome)
Tip #1: Go for the low-hanging fruit first! Who did you serve? What did you do? Bye-bye narrative, hello aggregate coding.

Slide 17: Descriptive: An Example (Thank you, Gretchen Lemmer!)
Excerpt from the PBS Observation Form:
- Was the PBS poster hung up? Y / N
- Before the start of the session, were behavior incentives explained? Y / N
- Was a GEAR Up Award given at the end of a session? Y / N
Data internalized as part of ongoing operations; capacity building for data-based decision making; can help drive TA needs.
[email protected]

Slide 18: Descriptive: Another Example, from Family and Children Services (Thank you, Maura Alexander!)
Excerpt from a PBS individual tally sheet: each behavior expectation (Stay Safe; Take Responsibility; Everyone Be Respectful; Positive Interactions; Support Others) is tallied by setting (Kitchen; Dining Area; Great Room; Main Pool), with a total for the child.
[email protected]
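To make the "aggregate coding" idea from Tip #1 concrete alongside the two descriptive examples above, here is a minimal sketch. The item names and session values are invented stand-ins for dichotomous Y/N items like those on the PBS Observation Form, not data from the actual forms.

```python
# Each dict is one observed session; True stands for "Y", False for "N".
# Item names paraphrase the observation-form excerpt and are illustrative only.
sessions = [
    {"poster_hung": True,  "incentives_explained": True,  "award_given": False},
    {"poster_hung": True,  "incentives_explained": False, "award_given": True},
    {"poster_hung": False, "incentives_explained": True,  "award_given": True},
    {"poster_hung": True,  "incentives_explained": True,  "award_given": True},
]

def percent_yes(sessions):
    """Aggregate dichotomous Y/N items into the percent of sessions marked 'Y'."""
    items = sessions[0].keys()
    return {
        item: 100.0 * sum(s[item] for s in sessions) / len(sessions)
        for item in items
    }

for item, pct in percent_yes(sessions).items():
    print(f"{item}: observed in {pct:.0f}% of sessions")
```

The point is the shift the slide names: instead of narrative notes, each form item becomes a variable that can be rolled up across sessions, sites, or time periods.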
Slide 19: 3 Kinds of Data (Descriptive, Process, Outcome)
Descriptive: What's going on?
- Often a snapshot
- Dichotomous variables
- Checklist types of measures
Process: What's going south?
- Captures the nature of process/implementation over time
- Subscales
- Mixed constructs
Tip #2: A step ladder is advised for process work, and use the backdoor if you can find it!

Slide 20: What's the backdoor?
- When development of a measurement tool defines what a process (or outcome) should look like: the ideal state is revealed in the measure.
- Bye-bye Likerts, hello conditional clarity via items in the measurement tool.
What's the front door?
- When you already have conceptual clarity on what you're hoping to achieve (the outcome) and what happened (the intervention).
- Systems rarely use the front door; I too avoid the front door.

Slide 21: Example of a Backdoor Process Measure
PBS Benchmarks of Quality (BOQ): measures level of PBS implementation.
Benchmark 11, "Behaviors defined":
- 3 points: Written documentation exists that includes clear definitions of all behaviors listed.
- 2 points: All of the behaviors are defined, but some of the definitions are unclear.
- 1 point: Not all behaviors are defined, or some definitions are unclear.
- 0 points: No written documentation of definitions exists.
www.kalamazoowrapsevaluation.org

Slide 22: 3 Kinds of Data (Descriptive, Process, Outcome)
Tip #3: Don't climb the tree without the ladder! Psychometrics (reliability and validity) matter huge!

Slide 23: Reliability/Validity in Outcome Data
Junk in = junk out.
- Reliability: the stability and consistency of a measure. (Tip #5: Calibrate, calibrate, calibrate.)
- Validity: the ability to capture the intended construct. (Tip #6: Do the construct search.)
- Lots of cheesy outcome measures out there.
- Lots of great attempts to develop outcome measures without researchers at the table.
- Researchers create measures; evaluators bridge research and the field.

Slide 24: 3 Kinds of Data (Descriptive, Process, Outcome)
Moving from descriptive to process to outcome data gets increasingly hard: to measure/capture well, to interpret, to analyze, and on the wallet.
Grand Finale Turbo Tip #7: When it comes to PBS data collection, invest in the search for measures, not the development of measures. The PBS idea-generation machine is huge.

Slide 25: Ready to Get Started?
[A visual of the PBS Model of Evaluation: Evaluation at the center, surrounded by Relevant & Measurable Indicators; Team-Based Decision Making & Planning; Continuous Monitoring; Regular Review; Effective Visual Displays; and Efficient Input, Storage, & Retrieval.]
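As a closing illustration of Tip #5 above (calibrate, calibrate, calibrate), the sketch below computes simple percent agreement between two observers who rated the same sessions on the same Y/N items. It is a minimal example with made-up ratings, not a prescribed PBS procedure; real calibration work often also reports a chance-corrected statistic such as Cohen's kappa.

```python
# Ratings from two observers who scored the same sessions on the same Y/N items.
# The values below are made up for illustration.
observer_a = ["Y", "Y", "N", "Y", "N", "Y", "Y", "N"]
observer_b = ["Y", "N", "N", "Y", "N", "Y", "Y", "Y"]

def percent_agreement(a, b):
    """Share of paired ratings on which the two observers agree."""
    if len(a) != len(b):
        raise ValueError("Observers must rate the same number of items.")
    return 100.0 * sum(x == y for x, y in zip(a, b)) / len(a)

print(f"Percent agreement: {percent_agreement(observer_a, observer_b):.1f}%")
# Low agreement (many teams treat roughly 80% as a floor) is a cue to retrain
# observers and recalibrate before trusting the data.
```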