Transcript of "Basic questions of policy evaluation design and tuning: a quick reminder"

Page 1

Basic questions of policy evaluation design and tuning: a quick reminder

Presented by: Vincent Spenlehauer – Ecole des Ponts ParisTech - IFRIS
Discussion: Mireille Matt – INRA - GAEL

Page 2

Outline of the discussion

Objective: to discuss the similarities and specificities of the basic questions related to public policy evaluation vs. research result or S&T policy evaluation.

• Brief summary of the paper
• Similarities with assessing the impact of public research / S&T policy
• Specificities of assessing the impact of public research / S&T policy

Page 3

Brief summary (1)

• Design and implementation of the interministerial evaluation of the national road safety policy (control and sanctions for departures from the Highway Code) => drastically reduce the number of deaths on the road

• A very detailed chronological account with a socio-analytical dimension (1999-2003)

• Narrates how, from his position as a social scientist, he was able to orient and structure an uncertain and chaotic policy evaluation process and influence the subsequent reforms

Page 4

Brief summary (2)

• Phase 1: struggles with different departments of the French ministries (Dpt for modernization and deconcentration, Dpt for security and road circulation…) and with the National Council for Evaluation – back-and-forth => definition of the object to evaluate, the objectives, the perimeter of the evaluation, and the public authorities and actors concerned

• Phase 2: implementation of the evaluation project: design and specifications of the evaluation, methods to be used, external scientific expertise – managing the interaction between the scientific hemisphere (the questioning of the evaluation) and the politico-administrative hemisphere (governance of the evaluation process) – Ministries = major stakeholders

Page 5

Brief summary (3)

• Phase 3: epilogue – change of the head of the National Instance of Evaluation (budget, definition of specifications, call for proposals…) – choice of the winning proposal – steering committee – aim = define a new control-and-sanction policy and document all the possibilities – a wise and persuasive entrepreneur of public policy design => changes in public road safety policies in 2003

• The results of this evaluation are due to the head of the National Instance of Evaluation, his political strategies, and the involvement of social scientists in the evaluation process

• The political dimension of an evaluation is present along the whole process. How should social scientists involved in this type of process behave in the face of this omnipresent political dimension?

Page 6

Similarities with assessing impact of public research

Page 7

Evaluation at different steps of the process of public actions

[Diagram: the policy process unfolding over time – analysis of the environmental conditions, definition of goals, design of the institutional arrangement (tools, modalities, rules…), implementation, realization, conclusion – relating objectives, the institutional arrangement and effects.]

Page 8

THE DIFFERENT LEVELS OF EVALUATION: a complex process analysed by VS

[Diagram: objectives, institutional arrangement and effects, embedded in environmental (economic, social, political, etc.) conditions, linked by the evaluation levels E1 to E9 described below.]

E1 - E4: Relevance (content) and "quality" of conception (decision process)
E1: Relevance of objectives
E2: Coherence of objectives
E3: Relevance and coherence of the "institutional arrangements"
E4: Coherence between objectives and institutional arrangements

E5 - E7: Implementation and results
E5: Programme management (cost-timing-quality…) = monitoring
E6: Effects / outputs / impacts = effectiveness
E7: Match between effects and objectives = efficacy (1)

E8 - E9: Efficiency and legitimization
E8: Match between objectives, institutional arrangements and effects/outputs/impacts: do the same / do better another way = assessing adequate funding, management and contractual behaviour in order for objectives to be achieved in a cost-effective manner = efficiency
E9: Ex-post relevance of objectives, given the results of the other evaluations = assessing whether the initial objectives are still valid in the light of evolving RTD, societal and environmental conditions = efficacy (2)

Page 9

General issues towards a « good » impact evaluation system

• Impact assessment is only one of the levels of evaluation

• Complexity: no single best method; multi-disciplinarity / quantitative and qualitative information; with alignment between the (implicit or explicit) theoretical basis and the impacts to be evaluated

• Guaranteeing the scientific value of evaluation methods (robustness, repeatability, appropriability, transparency, independence of the evaluators, confidentiality, sampling…)

• Balance between systematic / standardized / simple approaches and exploratory studies

=> systematic issues: evaluation vs monitoring, time fit between R&D activity and evaluation, availability of data, evaluation “fatigue”, cost of evaluation

Page 10

General issues towards a « good » impact evaluation system

• Ensuring that evaluation takes place on a programmed and properly resourced basis

• Providing « easy to understand », « usable », « credible » results while avoiding meaningless lists of indicators / scoreboards

• Providing a mechanism for feeding the results back into policy making (learning policy makers)

• Interactions between academics, practitioners, policy makers and research actors for a better understanding of scope, relevance and needs => evaluation is a « social process »

Page 11

General issues towards a « good » impact evaluation system

Evaluation as a « social process » (cf. L. Georghiou)
Motives / interests of the actors involved in the evaluation process:

• Those « being » evaluated: justification/legitimation, learning at the operational level, gaining new support from public sources

• Those who are the audience of the evaluation: accountability, resource allocation, learning at the policy level (pro-active evaluation)

• Those performing the evaluation: academic interest, consulting business

=> Make the purpose and context-dependency clear before choosing an approach

Page 12

Specificities of assessing impact of public research

Page 13

Challenges for an « ideal » evaluation system

Trends in S&T policy making:
• Multi-level, multi-agent decision making
• Variety of goals
• Variety of institutional arrangements
• Coupling with other policies
• International collaboration / integration (EU)
• Flexibility / adaptive, learning policy makers
• Development of competition-based programmes

Trends / challenges for evaluators:
• Complexity / separability / attribution
• Multiple stakeholders
  – Coherence of objectives
  – Weight of the different dimensions of the evaluation
  – Handling / using the results
• Evaluation of policy mixes
• Mix of « evaluation cultures »
• Evaluation of shifts
• Legitimacy, project fallacy

Page 14

Challenges for an « ideal » evaluation system

Trends in research activity:
• Globalization
• Problem-solving orientation
• Interdisciplinarity
• Cooperation, S-I linkages
• Increasing knowledge content
• IPR regulations

Trends / challenges for evaluators:
• International dimension, benchmarking
• Bias toward short-term, market-type outputs
• Limits of peer review
• Separability / complexity / attribution, network evaluation
• Evaluation of knowledge / competences / capabilities / capacity
• Limits of bibliometrics

Page 15

ASIRPA Challenges

• Evaluation of an organization with multiple missions/objectives

• Case study approach with quantification
• Identifying the cases within the organization
• Considering a variety of impacts (measurement)
• The observed impact is the result of a complex process involving a heterogeneous set of actors evolving over a long period of time => impact pathway

Page 16

ASIRPA Challenges

• Attribution vs contribution
• Problem of project fallacy
• Develop a « standardized » method for presenting and analysing the cases (impact pathway, chronology, impact vector, transversal analysis)

Page 17

Questions?

• What could we learn from your experience?
  – In terms of the method used to evaluate the impact of the road safety policy (by consultants)?
  – In terms of the actors performing the evaluation of impacts? Consultants vs academic actors
  – In terms of the mechanisms used to feed the results back into policy actions?
  – In terms of transferring the method developed?

Page 18

THANK YOU FOR YOUR ATTENTION !