Two Men & Some Tinnies
Reflections and conversations about internal evaluation practice
Brendan Moore & John Stoney
Disclaimer
• The following presentation reflects our own perspectives and does not necessarily reflect the position of Baptist Community Services or the Department of Education, Employment and Workplace Relations
Introduction
Two elements:
1. Brendan & John reflect on their experiences as internal evaluation practitioners, with a focus on making evaluation more effective and influential within their organisations, including:
• Context
• Drivers for change
• Organisational responses
• Practitioner challenges/issues; and (most importantly)
• Learnings
2. In the spirit of participatory evaluation, have a conversation and discussion with participants about the practice and art of internal evaluation.
About your speakers…
• Brendan – Research and Development Manager, Strategy and Risk Division, Baptist Community Services
• John – Assistant Director, Research and Evaluation Unit, Office of Early Childhood Education and Care
Organisational Contexts
Baptist Community Services
• two operating divisions – BCS AgeCare and BCS LifeCare – providing care for thousands of people across NSW and the ACT through more than 160 facilities and programs
• employs more than 3,600 staff and has 1,000 volunteers
• $200m-plus organisation
Organisational Contexts
Baptist Community Services – Then
• R&D Unit staffing of 2.2 FTE
• low levels of understanding of the value the Unit could impart
• evident challenges in converting the information generated into action and strategy – no system for dissemination of findings and improvement
Organisational Contexts
Drivers for Change
• Stakeholder engagement
• Systems thinking
• Monitoring and quality control
• Incorporation of tools such as logic models and evaluation toolkits into BCS practices
• Broader reputation strong, so trust created
• Leadership change – greater scrutiny of detail
• External analysis of ‘competitors’
Organisational Contexts
Response
• R&D Unit developed a strategy to build influence and achieve a symbiotic interdependence with our care, support and enabling systems
• initiated changes such as educating and leading BCS’ managers in developing their knowledge of how evaluation can benefit them as decision makers
Organisational Contexts
Baptist Community Services – Now
• R&D Unit within BCS is now supporting a large number of evaluations with complex methodological approaches of increasing sophistication
• The dangers of ‘expert’ language in change efforts
• ‘Squeaky wheels’ distract from evidence – retreat from objectivity
• Now have an FTE of 6
Organisational Contexts
Australian Government
• Environment of:
• higher expectations of government
• rapid change
• tight fiscal pressures; and
• increasing pressure to deliver policy and programs in restricted timeframes (Moran 2010)
Organisational Context
• "The goal is to transform the APS into a strategic, forward-looking organisation, with an intrinsic culture of evaluation and innovation" (Ahead of the Game, page xi)
• And yet:
• There are concerns that key skill-sets in research and analysis are in short supply (Banks 2009); and
• There is a view that there is insufficient capacity either internal or external to the APS to undertake the evaluative work that is required (Tune 2010).
Organisational Contexts
‘Then’
• Prior to organisational changes, the operational environment was characterised by:
• Ad-hoc (mainly research) projects
• Evaluations usually initiated by formal (e.g. budget) processes
• Staff undertaking ‘program improvement’ projects without realising their R&E nature
Organisational Context
Drivers for Change
• These vary depending on the department and context but include:
• Assurance to the Minister
• Strong desire to be able to deliver evidence-based policy and programs
• Significant commitment of staff resources to support implementation of major programs/initiatives
• Dearth of staff with capacity and/or capability in research and analysis
• Most of the evidence base being overseas rather than Australian generated
Organisational Context
Responses
• Establishment of formal, department-wide procedures regarding the commissioning and approval of research and evaluation projects
• Development of a Research and/or Evaluation Plan/Strategy
• Identification and communication of research and evaluation priorities
Organisational Context
Responses
• Existence/establishment of a team of research and evaluation specialists – role varies, either:
• Secretariat/gatekeeper/vetting; and/or
• Provision of an internal R&E consultancy service for policy and program areas, often:
• Underpinned by a ‘participatory’ approach – projects are still owned and managed by the program or policy area, and skill transfer and utilisation (hopefully!) occur
Organisational Context
Dynamics at the Program level
• An evaluation is a formal requirement (hard-wired via budget processes or an agreement)
• Critical thinking/review is being undertaken on the program by current program managers, leading to research/evaluation activity being initiated
• Philosophical commitment – current program managers have an overarching/inherent interest in, belief in and commitment to research and evaluation being undertaken
Organisational Context
Dynamics at the Program level
• Assistance is sought because:
• Internal processes suggest/encourage it
• The internal ‘expert’ is seen as a resource:
• Good practice to access and utilise
• Staff feel they do not have sufficient capability (knowledge/experience/expertise)
• Staff feel they do not have sufficient capacity
• Knows what else might be going on
• Previous experiences of using an internal consultant have been positive ones
Reflections
BCS – Organisational change
• organisational change theory as it relates to researchers and evaluators, who may be seen to exist at the margins and periphery of ‘core business’
Reflections
• An implication for evaluation practice is how to shift the thinking of leaders towards understanding the unique and valued role that evaluation, as an interdependent function, can play in effective organisational management and strategic planning processes.
Reflections
Practitioner challenges
• Role duality/multiple roles/role clarity and uncertainty
• the majority of colleagues not having evaluation or research methodology experience/training
• often strong resource and workload pressures; an environment long on work, short on resources
• trying to find a balance between being participative in approach and getting work done – but also
• “Letting them make their own mistakes”
Reflections
• not overextending my authority (making decisions that may not be OK with the program owners)
• influencing key internal and external stakeholders around technical/practice issues
• Succinctly communicating concepts & knowledge
• the sometimes short transition between not being helpful enough and being too helpful
Reflections
• not being both a program manager and evaluator – no longer immediately aware of program and stakeholder dynamics
• occasionally possibly being seen as something of an outsider rather than insider
• ‘Expert’ expectations/designation (both good and bad!)
• Finding sufficient time (particularly ‘think’ or ‘reflective’ time)
Discussion – Implications for Practice
• Some lessons we have learnt:
• How to change things when change is hard
• ‘Create a sense of urgency’
• Seize ‘targets of opportunity’
• Meet people where they are at – participatory approaches to evaluation
• Being a necessity, not a luxury
Discussion – Implications for Practice
• ‘Guests in the organisation’ – the internal consultant role
• Be always alert to both opportunities and threats
• Being ‘ambassadors’ for evaluation
• Constant need for judgement and relationship building
Discussion – Implications for Practice
• What issues, thoughts and experiences do you have?
• Approaches/techniques?
• Strategies?
• Dilemmas?
• Successes?
• How can external evaluators help support the growth of evaluation practice within organisations?