Design for complexity, using evaluative methods


Transcript of Design for complexity, using evaluative methods

Page 1: Design for complexity, using evaluative methods

How to design programs that work better in complex adaptive systems

Presented at Australasian Evaluation Society Annual Meeting, Perth, September 2016

Ann Larson, PhD, ann.larson@socialdimensions.com.au

Page 2: Design for complexity, using evaluative methods

Presented as part of a session titled:

Effective proactive evaluation: How can the evidence base influence the design of interventions?

John Owen
Ann Larson
Rick Cummings

AES Annual Conference, Perth, 2016

Page 3: Design for complexity, using evaluative methods

Outline

1. Good design principles for creating change in complex settings derived from evaluation findings

2. Reasons why so many designs do not incorporate complexity-sensitive design features

3. Examples of good and bad strategies to incorporate into designs

4. Last word on the role of evaluators

Page 4: Design for complexity, using evaluative methods

What works in creating positive change in complex settings?

Evaluations tell us what really happened. The accumulation of evaluation findings amounts to an evidence base that is more sensitive to context than research findings.

Page 5: Design for complexity, using evaluative methods

• Obtain flexible, long-term funding – because change is neither linear, continuous nor predictable

• Situate new behaviour in relevant history and in the salience of local priorities and concerns – because cultures and organisations remain committed to their established ways of working

• Build coalitions around a vision for change – because external shocks will require new partners to support long-term change

• Understand different actors' motivations for behaviour change: introduce accountability and incentives

• Start small, be flexible and experiment before attempting wide-scale change

• Balance the need for fidelity with opportunities for local initiative, to promote genuine institutionalisation

• Monitor, review and act in a timely manner to be adaptive

Page 6: Design for complexity, using evaluative methods

If it were easy, everyone would do it.

Page 7: Design for complexity, using evaluative methods

Organisational blind spots make these factors difficult to integrate into designs

• Command-and-control culture of central planning

• Structural limitations in processing and responding to large amounts of data with nuanced implications

• Epistemologies, especially in the health sector

Page 8: Design for complexity, using evaluative methods

Designing interventions to change systems is more difficult when there are profound distances between the designer and the intended beneficiaries – whether that distance is geographic, cultural, economic, religious or linguistic.

Evaluation methods can ‘translate’ beneficiaries’ voices so that decision makers can understand them

And human nature is also a barrier

Page 9: Design for complexity, using evaluative methods

Some common approaches for dealing with complexity … that are not working

Page 10: Design for complexity, using evaluative methods

• Careful planning does not reduce the likelihood of programs encountering unexpected obstacles and opportunities

• Reliance on monitoring and evaluation plans, with high-level annual reviews, to guide program implementation and oversight does not facilitate timely adaptation when programs are not working well

• Emphasis on celebrating success rather than learning from failure makes it hard to recognise what needs to be changed

• Use of short time frames and rigid budgets to reduce risk actually makes it more difficult to achieve outcomes

Page 11: Design for complexity, using evaluative methods

And some examples of good design trends that use evaluation skills and knowledge

Page 12: Design for complexity, using evaluative methods

Heavily invest in regular review, such as the reviews employed by TAF/DFAT and USAID’s active monitoring

… but this may require a large investment of time and may be overly structured for simple projects or projects that do not usually encounter problems

Page 13: Design for complexity, using evaluative methods

Determine where the bottlenecks for adoption are and delegate responsibility at that level, giving lots of local autonomy, coaching and fostering self-organisation

May require tailoring approaches for different areas or different types of facilities

Page 14: Design for complexity, using evaluative methods

Learn from good pilots and use what they reveal to inform expansion, while continuing to provide necessary support

One such example is on the next two slides …

Page 15: Design for complexity, using evaluative methods

Traditional expansion

Sarriot et al. (2011), Health Policy and Planning

Page 16: Design for complexity, using evaluative methods

vs staged expansion capitalising on lessons learned

Page 17: Design for complexity, using evaluative methods

Payment by results shifts responsibility to implementing agencies and communities to find their own solutions, rewarding people for doing what they already want to do and giving implementers an incentive to experiment until they find a successful strategy. It only works when there is untapped capacity and a meaningful outcome to be achieved

Also requires investment in verification
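As a purely illustrative sketch of why verification matters (not any agency's actual rules – the function name, tolerance and rate are all hypothetical), a payment-by-results rule can be reduced to: pay only for independently verified results, and withhold payment when reported and verified figures diverge too far.

```python
def payment_due(reported, verified, rate_per_unit, tolerance=0.05):
    """Hypothetical payment-by-results rule: pay only for
    independently verified results, and withhold payment entirely
    if reported figures diverge from verified ones beyond a tolerance."""
    if verified == 0:
        return 0.0  # nothing verified, nothing paid
    error = abs(reported - verified) / verified
    if error > tolerance:
        return 0.0  # fails verification: reported and verified diverge too far
    return verified * rate_per_unit
```

The sharp cut-off is one (deliberately simple) design choice; real schemes often taper payments instead, but the point stands that the verification step is what makes the incentive credible.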

Page 18: Design for complexity, using evaluative methods

Modelling of likely scenarios during design and at critical junctures using:

• Human-centred design, or

• Agent-based modelling, or

• Complex system modelling

All approaches need insights from evaluation.
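As a hedged illustration of the agent-based option (all names and parameters here are invented for this sketch), the toy model below lets each agent's chance of adopting a new behaviour grow with the current share of adopters – producing the kind of nonlinear uptake the earlier slides describe, rather than a steady trend.

```python
import random

def simulate_adoption(n_agents=100, n_rounds=20, base=0.02,
                      influence=0.5, seed=1):
    """Toy agent-based model of behaviour change: each round, a
    non-adopter's chance of adopting rises with the current share
    of adopters, so uptake is nonlinear rather than steady."""
    rng = random.Random(seed)
    adopted = [False] * n_agents
    adopted[0] = True  # one early adopter seeds the change
    history = []
    for _ in range(n_rounds):
        share = sum(adopted) / n_agents
        for i in range(n_agents):
            if not adopted[i] and rng.random() < base + influence * share:
                adopted[i] = True
        history.append(sum(adopted))
    return history

curve = simulate_adoption()
```

Even a sketch this small shows the design implication: early rounds look like failure, then uptake accelerates – which is why short funding cycles can kill programs just before they work.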

Page 19: Design for complexity, using evaluative methods

A simpler approach is to model path dependency, looking at the conditions necessary for the design to work
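The path-dependency idea can be sketched as a simple conjunction of necessary conditions: the design works only if every condition on the path holds, and the first failure shows where the pathway breaks. The condition names below are hypothetical examples drawn from the design principles earlier in the deck.

```python
def design_feasible(conditions):
    """Path-dependency check: the design works only if every
    necessary condition along the path holds; the first failure
    identifies where the pathway breaks."""
    for name, holds in conditions:
        if not holds:
            return (False, name)
    return (True, None)

# Hypothetical scenario built from the design principles above.
scenario = [
    ("flexible long-term funding secured", True),
    ("local coalition committed to the vision", True),
    ("timely monitoring data available", False),
    ("frontline staff have autonomy to adapt", True),
]
ok, broken_at = design_feasible(scenario)
```

Walking designers through such a checklist at design time, and again at critical junctures, is a cheap way to surface the assumption most likely to break.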

Page 20: Design for complexity, using evaluative methods

They sentenced me to twenty years of boredom
For trying to change the system from within
– Leonard Cohen

Last words about the role of evaluators

Evaluators do not need to become implementers internal to an organisation to promote designs that are responsive to complex contexts.

As external actors, we can point to evidence that flexible, adaptive approaches are more likely to achieve sustainable change for the better.