CES 2013: Negotiating evaluation design

Negotiating evaluation design in developmental evaluation. Heather Smith Fowler, Dominique Leonard, Neil Price. CES 2013.

Description

Concurrent session #6 (Tuesday, June 11, 15:45-17:15). Negotiating evaluation design in developmental evaluation: an emerging framework for shared decision-making, by Heather Smith Fowler, Dominique Leonard, & Neil Price

Transcript of CES 2013: Negotiating evaluation design

Page 1: CES 2013, Negotiating evaluation design

Negotiating evaluation design in developmental evaluation
Heather Smith Fowler, Dominique Leonard, Neil Price

CES 2013

Page 2

Who we are

Who is SRDC?
• Social Research and Demonstration Corporation (SRDC)
• Canadian non-profit research firm
• Why was SRDC selected to lead the evaluation?
  – 20 years' experience running demonstration projects
  – 10+ years testing new educational programs
  – Methodological + content expertise
  – Presence across Canada

Page 4

Raising the Grade: Description

35 Boys and Girls Clubs across Canada
Funded by Rogers Youth Fund
Started in Oct 2012
To date, just under 300 youth enrolled
National program BUT focus on local flexibility to adapt
Academics + tinkering with tech!
http://www.raisingthegrade.ca/

Page 5

Raising the Grade: major program components

1. Academic support
  1:1 mentorship
  Personal development plans (MyBlueprint), e-portfolios
  Engaging projects and activities (driven by youth interests)
  Access to curated, high-quality academic resources

2. Rogers Tech Centres

3. Scholarship program

4. Evaluation and continuous improvement

Page 6

RTG evaluation

Formative, then summative
Logic model development
Theory-driven?
Collaboration on program development
Program monitoring architecture

What were the clues that this was really DE…?

Page 7

Page 8

Hierarchy of evidence

Page 9

Developmental evaluation characteristics

• Focused on and responsive to social innovation (dynamic, flexible, context-specific, unique, interested in differences, timely)

• Rooted in positive, trusting relationships built over time (not always typical of evaluation)

• Timely engagement & rapid feedback
• Evaluator as a critical friend
• Flexible methods to fit research questions
• Still interested in process and quality, but NOT interested in a linear relationship between goals and outcomes

Page 10

Evaluation questions

• Is the program effective at…? Does it work?
• How is it being implemented across 25-35 Clubs? i.e., what are they doing?
• Is the program a breeding ground for innovation? (What's new? How did it come about? What are the contextual factors that contribute to innovation?)
• What are the critical ingredients of the program in reality?
• How are youth responding to the program? (What do they like? What do they think needs to change? What are they doing?)
• How are Ed Managers and mentors responding to the program?

Page 11

Selection criteria for methods and tools

Permit timely feedback
Inclusive
Engaging and fun
Support digital & connected learning principles
Empowering
Accessible, youth-friendly, and authentic (face validity)
Available at low cost
Comparison data
Support capacity-building goals

… + valid and reliable

Page 12

Criteria for evaluation designs

Permit timely feedback
Match overarching evaluation questions
Inform decision-making at different stages
Support program principles: equitable, social, participatory; engagement, empowerment, discovery
Authentic and meaningful
Low cost
Comparison data

… + rigorous

Page 13

Insights from our first year

DE can be relevant for certain aspects of a program, while others lend themselves more to tried-and-true methods. We need to discuss with the client which program components are suitable for which types of methods.

DE has helped us be creative in imagining new ways to collect program information and reflect it back to program staff & participants, including information about their experiences with the program (e.g., the participant self-portrait: how do we make it evolving and forward-looking?)

Research questions are time-sensitive and relative to the stage of the program

Page 14

Challenges

Being sufficiently critical – it doesn’t come naturally to us (too Canadian!), especially during the stage of building close relationships

Trying to plan workloads and stay on budget when the program is evolving and tasks are changing in response to emerging findings and practice

How can we show our added value as evaluators without concrete, rigorous methods that make definitive statements about the program?

How to determine if social innovation is effective? How much can the vision change?

Page 15

More challenges

How do we avoid sliding into old habits, traditional methods, and familiar ways of working?

Developing, learning, and using innovative evaluation methods

How to mirror back personal experiences/information about changes to youth without violating confidentiality?

Keeping a long-term perspective

Page 16

Next steps

Separating out implementation-level questions from overarching questions

What are the priority questions right now (and down the road), and what are the best designs to address those?

Defining the inquiry framework: Appreciative Inquiry, actual vs. ideal comparison

What? So what? Now what?