Ces 2013 negotiating evaluation design
Negotiating evaluation design in developmental evaluation
Heather Smith Fowler, Dominique Leonard, Neil Price
CES 2013
Who we are
Who is SRDC?
• Social Research and Demonstration Corporation (SRDC)
• Canadian non-profit research firm
• Why was SRDC selected to lead the evaluation?
– 20 years’ experience running demonstration projects
– 10+ years testing new educational programs
– Methodological + content expertise
– Presence across Canada
Raising the Grade: Description
• 35 Boys and Girls Clubs across Canada
• Funded by Rogers Youth Fund
• Started in Oct 2012
• To date, just under 300 youth enrolled
• National program BUT focus on local flexibility to adapt
• Academics + tinkering with tech!
• http://www.raisingthegrade.ca/
Raising the Grade: major program components
1. Academic support
• 1:1 mentorship
• Personal development plans (MyBlueprint), e-portfolios
• Engaging projects and activities (driven by youth interests)
• Access to curated, high-quality academic resources
2. Rogers Tech Centres
3. Scholarship program
4. Evaluation and continuous improvement
RTG evaluation
• Formative, then summative
• Logic model development
• Theory-driven?
• Collaboration on program development
• Program monitoring architecture
What were the clues that this was really DE…?
Hierarchy of evidence
Developmental evaluation characteristics
• Focused on and responsive to social innovation (dynamic, flexible, context-specific, unique, interested in differences, timely)
• Rooted in positive, trusting relationships built over time (not always typical of evaluation)
• Timely engagement & rapid feedback
• Evaluator as a critical friend
• Flexible methods to fit research questions
• Still interested in process and quality, but NOT interested in a linear relationship between goals and outcomes
Evaluation questions
• Is the program effective at…? Does it work?
• How is it being implemented across 25-35 Clubs? i.e., what are they doing?
• Is the program a breeding ground for innovation? (What’s new? How did it come about? What are the contextual factors that contribute to innovation?)
• What are the critical ingredients of the program in reality?
• How are youth responding to the program? (What do they like? What do they think needs to change? What are they doing?)
• How are Ed Managers and mentors responding to the program?
Selection criteria for methods and tools
• Permit timely feedback
• Inclusive
• Engaging and fun
• Support digital & connected learning principles
• Empowering
• Accessible, youth-friendly and authentic (face validity)
• Available at low cost
• Comparison data
• Support capacity-building goals
… + valid and reliable
Criteria for evaluation designs
• Permit timely feedback
• Match overarching evaluation questions
• Inform decision-making at different stages
• Support program principles – equitable, social, participatory; engagement, empowerment, discovery
• Authentic and meaningful
• Low-cost
• Comparison data
… + rigorous
Insights from our first year
DE can be relevant for certain aspects of a program, while others lend themselves more to tried-and-true methods. We need to discuss with the client which program components are suited to which types of methods.
DE has helped us be creative in imagining new ways to collect program information and reflect it back to program staff & participants, including about their experiences with the program (e.g., participant self-portrait – how to make it evolving and forward-looking?)
Research questions are time sensitive and relative to the stage of the program
Challenges
Being sufficiently critical – it doesn’t come naturally to us (too Canadian!), especially during the stage of building close relationships
Trying to plan workloads and stay on budget when the program is evolving and tasks are changing in response to emerging findings and practice
How can we show our added value as evaluators without concrete, rigorous methods that make definitive statements about the program?
How to determine if social innovation is effective? How much can the vision change?
More challenges
How not to slide into old habits/traditional methods and ways of working?
Developing, learning, and using innovative evaluation methods
How to mirror back personal experiences/information about changes to youth without violating confidentiality?
Keeping a long-term perspective
Next steps
Separating out implementation-level questions from overarching questions
What are the priority questions right now (and down the road), and what are the best designs to address those?
Defining the inquiry framework - Appreciative Inquiry, Actual vs Ideal Comparison
What? So what? Now what?