Evidence for Social Policy and Practice


Perspectives on how research and evidence can influence decision making in public services

April 2011


    Foreword

From criminal justice and children's services to poverty reduction, this report contains essays from organisations using different methodologies and approaches to generate evidence and influence policy and practice in a number of service areas.

The idea that policy and practice should be underpinned by rigorous evidence is internationally accepted, yet there is recognition that the level of rigour in evaluating what works in social policy remains limited. In a time of public service reform and more decentralised decision making, the need for timely, accessible and reliable evidence is becoming ever more important.

At NESTA we both use and produce evidence through a combination of practical programmes and research projects. We are keen to learn from the organisations featured here as well as others working in the field to help strengthen the evidence base and improve the sharing of this knowledge.

We recognise that with local autonomy comes an enormous opportunity for innovative approaches to thrive, but in order for this to happen practitioners, commissioners, users and other decision makers must know what works and what doesn't. Sharing evidence will help improve outcomes whilst helping to prevent duplicated efforts and wasted resources.

Different policy areas will undoubtedly need different support and development in the production, dissemination and use of research and evidence, with many organisations having a key role to play. At NESTA we are working to forge connections with a range of organisations and explore possible ways forward. As you will see from the essays there is much activity happening and, beyond the examples included here, there is a great deal of interest and enthusiasm in progressing this agenda.

    We would welcome your thoughts.

    Ruth Puttick

    Policy Advisor, Public and Social Innovation, NESTA

    April 2011

NESTA is the UK's foremost independent expert on how innovation can solve some of the country's major economic and social challenges. Its work is enabled by an endowment, funded by the National Lottery, and it operates at no cost to the government or taxpayer.

NESTA is a world leader in its field and carries out its work through a blend of experimental programmes, analytical research and investment in early-stage companies. www.nesta.org.uk


    Contents

Evidence for Social Policy and Practice

Perspectives on how research and evidence can influence decision making in public services

Part 1: Project Oracle

Mat Ilic, Greater London Authority, and Stephen Bediako, The Social Innovation Partnership

Part 2: Combining research and practice: The Center for Court Innovation's approach to justice reform

Greg Berman and Aubrey Fox, Center for Court Innovation

Part 3: Washington State Institute for Public Policy

    Summary produced by NESTA

Part 4: Using impact evaluations to inform policy

    Iqbal Dhaliwal and Caitlin Tulloch, J-PAL

Part 5: RAND's Promising Practices Network

    Rebecca Kilburn, RAND Corporation

Part 6: The Obama Administration's evidence-based social policy initiatives: an overview

Ron Haskins, Advisory Board, and Jon Baron, President, Coalition for Evidence-Based Policy


Part 1: Project Oracle

Mat Ilic, Greater London Authority, and Stephen Bediako, The Social Innovation Partnership

1. Introduction

This paper has been written against a backdrop of change in public services, taking place at an unprecedented speed and scale. The Government has committed to a series of wide-ranging reforms, with a renewed focus on achieving better social outcomes at a reduced cost. Some of the more challenging debate now is about what constitutes reliable evidence of social change, and how outcomes or results of specific interventions can be verified.

In the United Kingdom there is rare cross-party consensus on the benefits of early intervention. For example, Iain Duncan Smith MP and Graham Allen MP have progressed the 'what works' agenda.

The Allen Review1 recommended a portfolio of evidence-based, early intervention programmes aimed at improving outcomes for children, young people and their families;

The Centre for Social Justice identified in a recent report '...a fatal failure at the heart of government spending decisions',2 namely the absence of adequate evidence in the construction of social policies and programmes, in turn leading to poorly designed provision with little or no proof of positive change.

In addition, with the onset of an era of localism (or 'mass localism'3) where individuals and communities need to come together to resolve their problems with local solutions, one is entitled to ask how communities and individuals will obtain the knowledge and evidence they need to make sound decisions.

As a response, the Greater London Authority (GLA), with support from The Social Innovation Partnership, contributes this paper on Project Oracle. The project, having emerged from the Mayor's proposals in Time for Action (his programme to tackle serious youth violence), seeks to 'understand and share what really works'4 in supporting young people and preventing youth violence. It does this by combining, for use by service providers, funders and policymakers:

    A tried and tested programme evaluation methodology.

A technical platform: the Project Oracle website.

A supportive centralised officer resource.



2. The origins of Project Oracle

Project Oracle seeks to further the aims of evidence-based policymaking by stimulating collaboration between government, academia and the wider social intervention community. It was established in recognition of four key factors:

1. There is currently no clear understanding of what programmes5 work, in what conditions they work, and whether they therefore represent value for money.

2. There needs to be a sustainable body of evidence so that the knowledge base evolves for future policymaking.

3. Evidence needs to be grown from somewhere using a consistent method, requiring a stimulus and a mechanism for providers to develop continuously.

4. The root causes of social problems require a deep understanding that should be captured and documented, particularly given the pace of social change.

Project Oracle emerged as one of the six work streams from Time for Action. The project seeks to establish a standard for evidence-informed decision making on children and young people's policy in London. The Mayor believes that he is 'strategically best placed to initiate joint work to evaluate programmes that have greatest benefits and to identify those that don't...'6 Recognising the vital contribution that partners in policy, funders, service providers and academics can make, the project's Delivery Board represents cross-sector leadership.

The challenge for Project Oracle is how to build the evidence from London-based activities, avoiding the temptation of importing proven interventions from other countries and contexts without further validation.

    3. Project Oracle: an overview

    How Oracle works (a summary):

Standards of Evidence for London were commissioned by the GLA and delivered by Dartington Social Research Unit, in collaboration with international experts such as Delbert Elliot (University of Colorado) and Steve Aos (Washington State Institute for Public Policy).

The Standards document has been translated into a set of practical activities that can be conducted by providers of social programmes. These activities are part of a continuous improvement process that involves providers going through an online self-assessment, which tests the robustness of the evidence in support of their programme(s).

The self-assessment is carried out by the provider alongside a practitioner guidebook. Depending on how their project scores (Levels 1-5, with 1 being entry level), practitioners are expected to undertake an improvement plan aimed at increasing the level of evidence that supports their claim to positive outcomes.

This self-assessment activity requires a degree of human interaction and peer challenge, which in the current pilot stages is facilitated by a dedicated project officer from the GLA, with support from Masters students from social research disciplines. The GLA also runs monthly workshops and will engage in online collaboration once the Oracle website has benefited from further development (April 2011).


4. Standards of evidence

In the context of youth violence prevention studies, the current state-of-the-art standards of evidence methodology can be traced back to initiatives from the Center for the Study and Prevention of Violence, University of Colorado, USA, where the Blueprints research and evaluation programme was begun. In 2009, the GLA commissioned Dartington Social Research Unit to take forward the experience of evidence-based programmes from the United States, working with experts in the field to put together the standards needed for London.

In the absence of any standards it is likely that many social interventions for children and families are being used in ignorance; worse, some are being implemented in spite of indications that they can be harmful. Better information, communicated and used well, should mean that resources are confidently directed toward more effective projects to address the right problem. Widespread use of the London Standards should bring greater consistency to judgments about how to support young Londoners and prevent violence.

This is a long-term ambition. The GLA recognises that the London Standards will need their own body of knowledge to reflect the capital's complex social and political environment and culture.

The London Standards of Evidence can be summarised in the following way:

Table 1: The Oracle levels of evidence

Level 1: Sound theory of change or logic model, with clear plans for evaluation
Level 2: Demonstrating emerging evidence of impact
Level 3: Effective comparison group (ideally random controlled trials), statistical analyses, effect sizes
Level 4: Model analysis of dosage and fidelity of implementation
Level 5: System-ready (multiple independent replication evaluations, and cost-benefit analysis)

NB: These standards have been summarised for the purpose of this report.

    5. Making Oracle practical

The academically rigorous framework proposed by Dartington is valuable, but its ambitious goals for random controlled trials, longitudinal studies, and so on, are a long way away from the realities of projects aimed at children and young people in London.

Translating the Standards document for the use of mostly frontline providers not trained in evaluation had to be a creative process in itself. It required consultation with a very wide range of groups across London, as well as input from external advisers, notably Professor Betsy Stanko and other members of the Oracle Challenge Panel, Community Links, and The Social Innovation Partnership.

The Standards, with further development, have introduced the concept of a 'theory of change', an approach to planning for articulating social change developed in the US over the last 25 years. It is an interactive method for obtaining group and community consensus on the long-term goal, preconditions, and assumptions behind proposed social programmes. The GLA tested this approach in sessions with Oracle pilot projects (see below), and in ten workshops with 110 participants from 78 organisations from across London.

Ongoing support is necessary because achieving improved evidence of effectiveness is a developmental process, which requires those involved in commissioning and delivering social change projects to work together. To date this support has been provided by a single project officer and Masters students on work placements, but it is anticipated that as the volume and quality of evidence from providers improves, the quality of these resources must also be enhanced.

    6. The pilot approach

In September 2010, the Oracle Challenge Panel7 agreed to initial work with ten Oracle pilot projects. These were current, mostly voluntary sector projects, varying in both size and prominence, but all working with the aims of combating youth crime and violence.

The pilot has involved working intensively with these organisations, and has allowed the GLA to consider the benefits of Project Oracle through the eyes of funders, practitioners and academics. It has pointed to common issues that evidence and evaluation-building exercises face in the real world. This includes factors such as a recognised lack of proactive funding for research and evaluation, as well as poorly designed (and therefore unpersuasive) research.

7. Oracle's applications

Three scenarios have been identified in which Project Oracle and its methodology can make a difference:

1. When commissioners are buying new services or developing new products in-house (thus the GLA is working with London Funders to move towards commissioning along Oracle lines).

2. Helping statutory organisations to reconfigure their operations or programmes (this has already started with GLA family organisations, including the Metropolitan Police Service).

3. Improving projects or programmes directly and championing a single standard of evaluation (where Oracle works directly with providers to improve their evidence base).

Throughout the development of the project, the GLA has focussed on simplicity and clarity, to ensure that all concerned can successfully make the journey to improved outcomes for young Londoners.

8. Activities proposed for 2011-2012

The GLA expects to embed the principles of this project across organisations which are a part of the GLA group, thus leading by example. In 2008-2009, youth engagement and programme expenditure in the GLA group was estimated to be in the region of £40 million, which makes any effort to make it better a worthwhile exercise in its own right.

Through its London-wide Delivery Board, the GLA aims to build a sustainability strategy for Project Oracle with the intention of advancing the initiative through the support of its key partners across London. This work will detail the lessons learned from the project and explore whether and how the initiative can be sustained in the context of a changing political and social environment.

    9. Conclusions

NESTA has put forward for consideration Ten Steps to Transformation,8 one of which is that 'innovation culture only comes from practice'. This is not about creating an innovative culture for its own sake, but a culture where people feel empowered and supported to effect change and adapt their practice. Empowering involves providing advice and practical support. In this way, Project Oracle offers a pathway from practice to evidence, enabling projects to reflect on what made them (or can make them) succeed and why.

An important question to conclude with is: what is the purpose of knowledge in seeking social change? When demonstration projects are funded as a test for national solutions (as NESTA and others do), how does the learning get retained for future decision-making activity? If a promising initiative is identified, what are the chances that it will be sustained?

The GLA aims for Project Oracle to be a source of information, evidence and inspiration for city-wide policymaking and decision making. The model may become one that other cities could adopt. There is no reason why, with time, the wider public cannot be empowered to scrutinise investment in social programmes by being given access to information about what works. If future resources are directed in this way, the outcomes for young Londoners should improve more predictably over time.

    Endnotes

1. Allen, G. (2011) Early Intervention: The Next Steps. London: Cabinet Office.

2. Centre for Social Justice (2011) Outcome-Based Government. How to improve spending decisions across government. London: Centre for Social Justice.

3. Bunt, L. and Harris, M. (2010) Mass Localism: a way to help small communities solve big social challenges. London: NESTA.

4. Mayor of London (2008) Time for Action: Equipping Young People for the Future and Preventing Violence: the Mayor's proposals and call to partners. London: Greater London Authority.

5. Specifically London-based programmes aimed at benefiting children and young people.

6. Mayor of London (2008) Time for Action: Equipping Young People for the Future and Preventing Violence: the Mayor's proposals and call to partners. London: Greater London Authority.

7. An advisory body to the project consisting of expert practitioners and academics.

8. Bunt, L. and Puttick, R. (2010) Ten Steps to Transformation. NESTA Blog. See: www.nesta.org.uk/blogs/ten_steps_to_transformation


Part 2: Combining research and practice: The Center for Court Innovation's approach to justice reform

Greg Berman and Aubrey Fox, Center for Court Innovation

1. Summary

The Center for Court Innovation is a public/private partnership dedicated to reforming the justice system through demonstration projects, original research, training and technical assistance. In New York, the Center for Court Innovation functions as the State court system's independent research and development arm: studying chronic problems, devising new solutions and testing their feasibility. After the Center has tested a model and documented its effectiveness, it then looks to disseminate the lessons more widely, helping reformers around the world adopt new ways of doing business.

    2. Background

Figure 1: The Center for Court Innovation's business model: Demonstration (field-testing new ideas) → Documentation (tracking the impacts of experiments) → Replication (helping government officials adopt lessons from demonstration projects) → Institutionalisation (new practices become embedded within the justice system).


The Center's first demonstration project, the Midtown Community Court, was created in 1993 to address low-level offending in the Manhattan neighborhoods around Times Square. The Midtown Court combines punishment and help, sentencing misdemeanor offenders to perform community service and receive social services. The goal is to reduce both crime and the use of incarceration. Independent evaluators from the National Center for State Courts confirmed that the community court helped to curb street crime: prostitution arrests dropped by 56 per cent and illegal vending by 24 per cent after the court opened. Other results included improved compliance with court orders and reductions in case processing time. In addition, more than two out of three local residents surveyed in a telephone poll said that they would be willing to pay additional taxes to support the community court.1

    3. Current activities

The Center's next experiment was the Brooklyn Treatment Court. The first such court in New York City, Brooklyn Treatment Court worked with more serious cases: felony offenders with long histories of addiction. Following a model originally established in Florida, participants were linked to long-term drug treatment in lieu of incarceration. Progress in treatment was regularly monitored by a judge using a system of sanctions and rewards. Researchers from the Center for Court Innovation tracked the performance of drug court participants for three years after they left the program and found statistically significant reductions in both substance abuse and recidivism. Based in no small part on these findings, the New York State Court System made an institutional commitment to spread the drug court model statewide.

The drug court model also attracted the attention of the executive and legislative branches of government in New York. In April 2009, the Governor of New York signed into law a significant revision of the infamous Rockefeller Drug Laws, long regarded as the toughest in the United States for mandating prison for non-violent drug offenders. One of the explicit goals of this reform, which was celebrated by the Governor with an event at the Brooklyn Treatment Court, was to increase the number of defendants who would participate in drug court.

Figure 2: Problem-solving courts in New York (source: Center for Court Innovation). As of January 2010, projects in operation and planning included 192 drug courts, 55 integrated domestic violence courts, 41 domestic violence courts, 28 mental health courts, 8 community courts and 8 sex offender management courts; all 62 counties have at least one problem-solving court.


While community courts and drug courts are the most prominent of the programs that the Center for Court Innovation has implemented, they are far from the only ones. The Center has also been responsible for establishing New York's first mental health court, domestic violence court, reentry court and others. While each of these projects is unique, they have come to be known collectively as 'problem-solving courts' for their efforts to address the underlying issues that bring defendants into the justice system.2 Over the past 15 years, problem-solving courts have been widely replicated throughout New York State (see Figure 2). Most of these programs have either been based on models created by the Center for Court Innovation or created with training and technical assistance from the Center's team of expert consultants.

4. Structure of the Center for Court Innovation

The Center for Court Innovation is a non-governmental organization that is operated as a project of a larger non-profit agency, the Fund for the City of New York. In fiscal year 2010, the Center had a budget of $17.6 million which was underwritten by a range of funders at the city, state and federal level (87 per cent of the Center's revenues come from government grants, 13 per cent from private foundations and fee-for-service contracts). The Center for Court Innovation seeks to combine action and reflection, doing and thinking. It brings together professionals from an array of disciplines (attorneys, social workers, mediators, victim advocates, social scientists, journalists, technologists and others) to promote justice reform. Broadly speaking, the Center's 175 full-time employees work in four principal areas:

1. Demonstration projects. As described above, the Center's primary business is the planning, implementation and operation of demonstration projects, and it has been responsible for creating 20 different model projects. These vary in size (for example, the Red Hook Community Justice Center hears several thousand cases each year while the Brooklyn Mental Health Court hears fewer than 100) and focus (for example, the Brooklyn Domestic Violence Court handles serious felony cases involving intimate abuse while the Harlem Youth Court deals with minor offenses committed by juveniles).

2. Research. The Center is committed to the idea of action research and employs a team of in-house researchers that regularly monitors the work of the Center's demonstration projects, tracking impacts on street crime, substance abuse, sentencing practice, levels of neighborhood fear and public trust in justice. Researchers from the Center also conduct independent evaluations with national policy implications, including a multi-year evaluation of court-mandated drug treatment, a randomized trial of the effectiveness of batterer intervention programs in domestic violence cases and an examination of the extent to which a judge's practice is changed after assignment to a problem-solving court.

3. Strategic communication. The Center uses a number of strategies to spread information, including a website3 that attracts 82,000 unique visitors per month, and clearly written how-to manuals and best-practice guides for criminal justice officials (more than 50,000 publications are downloaded each month). In addition, the Center produces a monthly podcast, New Thinking, which features interviews with leading criminal justice thinkers. Staff from the Center have published numerous books,4 articles in the popular press, trade publications and academic journals; and regularly make presentations at national conferences and workshops.

4. Technical assistance/training. The Center provides hands-on, expert assistance to reformers – judges, attorneys, probation officials, community organizers and others – from around the world, offering guidance on assessing public safety problems and crafting workable, practical solutions. Structured site visits to the Center's demonstration projects are a key part of the organization's approach to technical assistance; over the past five years, they have hosted an average of 625 visitors per year. The Center is currently working with innovators both in the United States and abroad to help create new responses to problems like drugs, domestic violence, delinquency, and neighborhood disorder.


5. Lessons from the Center for Court Innovation

The Center for Court Innovation's approach to promoting evidence-based justice sector reform is multi-faceted. Key elements include:

Balancing independence and access. The Center for Court Innovation has sought to work closely with government while being formally independent from it. Working with government decision makers helps to ensure the relevance of the organization's work. Yet the Center's independence from government grants it the freedom it needs to think beyond the electoral cycle and to pursue a long-term vision of justice reform. The Center's independence from government also means that it does not operate under some of the institutional constraints (e.g. civil service regulations, union rules) that hamper efforts to create an entrepreneurial culture within government. Finally, the Center's independence gives it the room it needs to issue findings that are less than positive. For example, the Center's randomized trial that examined the use of batterers' intervention programs in the Bronx found no evidence of impact on the behavior of offenders. Although this finding directly questioned the common practice of local judges, the Center's study was not suppressed. Rather, it was featured in a front-page story in the New York Law Journal and in multiple journals.

Combining research and practice. Having researchers work alongside criminal justice practitioners has multiple benefits for the Center. Firstly, it forces practitioners to think more rigorously. In particular, it encourages those who plan and implement demonstration projects to be more disciplined about articulating measurable goals and objectives for their work. On the other side of the coin, researchers benefit because they become grounded in the messy realities of day-to-day implementation, which makes their work more nuanced and their writing easier to read. Unlike some academic evaluators, researchers from the Center also know enough not to hold new programs to unrealistic standards of performance.

Bridging the local and the national. The Center for Court Innovation has always had one foot in the world of local practice and one foot in the world of national policy. The Center's sustained engagement on the ground in New York has given it credibility and enabled it to build trust with local practitioners and policymakers. But the Center has a broader worldview than the typical local organization, and its national reach (and connections) means that it can bring ideas from across the country to New York. For example, the Center recently adapted the Ceasefire anti-violence program, which has shown success in reducing gun crime in Chicago, to the Brooklyn neighborhood of Crown Heights.

Using multiple methods of analysis. While many researchers extol the importance of randomized trials, these trials rarely happen in the world of criminal justice. Instead, the Center believes that randomized trials are not the only kinds of studies that have value and that there is a lot to learn from other types of research, including quasi-experimental designs, qualitative studies and process evaluations. The Center is also committed to moving beyond a pass-fail approach to evaluating social programs, involving not just tracking the impact on crime rates but examining a much wider set of program outcomes, such as compliance with court orders, impacts on system efficacy, public confidence in justice, and whether court litigants feel the justice system is treating them fairly.

6. Wider influence and impact

The idea of problem-solving justice has spread far beyond the borders of New York. Mainstream American legal organizations, including the American Bar Association and the Conference of Chief Justices, have endorsed the concept. On the campaign trail, Barack Obama explicitly endorsed drug courts, making reference to the Center's research, saying: 'Drug courts have proven successful in dealing with non-violent offenders. These courts offer a mix of treatment and sanctions, in lieu of traditional incarceration... The success of these programs has been dramatic: One New York study found that drug court graduates had a re-arrest rate that was on average 29 per cent lower than comparable offenders who had not participated in the drug court program. These programs are also far cheaper than incarceration.' In the FY2012 executive budget recently submitted to Congress, President Obama allocated $57 million to support problem-solving courts.

International interest in the Center for Court Innovation and its demonstration projects is growing. Staffers from the Center have worked with criminal justice reformers in 50 countries, including aiding the development of new community courts in England, South Africa, New Zealand, Australia and Canada. Australia has recently launched its own Center for Court Innovation, operating out of Monash University. Several organizations (including the Young Foundation and Policy Exchange) have recently issued calls for a Center for Court Innovation in the United Kingdom.

    Endnotes

1. See Sviridoff, M. (2000) Dispensing Justice Locally: The Implementation and Effects of the Midtown Community Court. New York: Harwood Academic Publishers.

2. See Berman, G. and Feinblatt, J. (2005) Good Courts: The Case for Problem-Solving Justice. New York: The New Press.

    3. See: www.courtinnovation.org

4. For instance Berman, G. and Fox, A. (2010) Trial and Error in Criminal Justice Reform: Learning from Failure. Washington: Urban Institute Press.


Part 3: Washington State Institute for Public Policy

This summary was prepared by NESTA and, unless otherwise indicated, is based upon documents provided by the Washington State Institute for Public Policy.1

    1. Summary

This summary covers the development of the Washington State Institute for Public Policy and the role it currently performs, as set out by the 2009 Legislature. The Institute's 4-step approach to research is outlined to show how different interventions are evaluated, and how the findings generated are presented to Washington State to enable research to influence policy decisions. This note also covers the current work to translate the Institute's model to the UK.

    2. Roles and responsibilities

The Washington State Legislature created the Washington State Institute for Public Policy (WSIPP) in 1983 to carry out practical, non-partisan research at legislative direction on issues of State importance. A Board of Directors representing the Legislature, the Governor, and public universities governs the Institute and guides the development of all activities. Alongside an in-house research team of 13, the Institute collaborates with universities and other experts.2

The role of the Institute has evolved since it was established. In the mid-1990s, the Legislature directed the Institute to identify evidence-based juvenile justice programs that could lower crime. The Institute built its first benefit-cost analytical tool in 1997 to help the Legislature select sound investments, identifying several programs that could reduce crime and save Washington taxpayers money.3 In subsequent sessions the Legislature used the results to begin a series of policy reforms.4 Then in the early 2000s, the Legislature began to direct the Institute to apply the same benefit-cost approach to other public policy areas, including education, child welfare, adult mental health and substance abuse.

The current areas of staff expertise include education, criminal justice, welfare, children and adult services, health, utilities and general government. WSIPP also collaborates with universities to extend its capacity into other subject areas.

    3. Current policy areas

The 2009 Legislature directed WSIPP 'to calculate the return on investment to taxpayers from evidence-based prevention and intervention programs and policies'. The Institute was instructed by the Legislature to produce a 'comprehensive list of programs and policies that improve [...] outcomes for children and adults in Washington and result in more cost-efficient use of public resources'. The following public policy areas are focussed upon:

Crime, child maltreatment, education, employment, housing, mental health, public assistance, public health and substance abuse.

The Legislative cycle is two years long, with the next Legislature due in April 2011.5 A full report covering the 2009 period will be published in June 2011, and will provide a ranking of programs currently implemented in Washington, alongside a broader scope of available policy options.

    4. Approach to research and evaluation: The 4-Step Model

WSIPP conducts research using its own policy analysts and economists, specialists from universities, and consultants. Over the last decade, and continuing in the current legislature, WSIPP has continued to develop and improve its 4-step research approach. This includes the use of a cost-benefit model which is used to calculate policy costs against likely economic returns, alongside predicting the impact of competing investment options. WSIPP's cost-benefit model is used as part of a four-step program to identify what works:

Step 1: What works?

A systematic review of the available literature, from the US and elsewhere, is assessed against high standards of scientific evidence to identify those interventions that have best achieved the outcomes (and which ones have not). The empirical approach follows a meta-analytic framework to assess systematically the entire research literature on a given topic. Research studies with strong, credible evaluation designs are prioritised, whilst those with weak designs are discarded.
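To make the pooling step of such a meta-analytic framework concrete, here is a minimal sketch in Python of one simple flavour of it: fixed-effect, inverse-variance-weighted pooling of effect sizes. The study names and numbers are invented for illustration; WSIPP's actual models are considerably more elaborate.

```python
import math

# Hypothetical (label, effect size, standard error) triples from rigorous
# evaluations of one intervention; a negative effect means a reduction in crime.
studies = [
    ("Study A", -0.15, 0.06),
    ("Study B", -0.10, 0.04),
    ("Study C", -0.22, 0.09),
]

# Fixed-effect pooling: weight each study by the inverse of its variance,
# so more precise studies count for more.
weights = [1 / se ** 2 for _, _, se in studies]
pooled = sum(w * es for (_, es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled effect size: {pooled:.3f} (SE {pooled_se:.3f})")
```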

Step 2: What makes economic sense?

WSIPP then calculates the costs and benefits for Washington State. A cost-benefit analysis is undertaken by answering two questions:

1. How much does it cost to produce the effect found in Step 1?

    2. How much is it worth to people in Washington State to achieve the outcome?

To answer these questions, WSIPP developed an economic model that provides estimates presented from three distinct perspectives: the benefits that accrue solely to programme participants, those received by the taxpayers, and any other measurable (non-participant and non-taxpayer) benefits. The sum of these perspectives provides a 'Total Washington State' view on whether a programme's benefits exceed its costs.
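As a minimal illustration of this 'Total Washington State' view (a sketch, not WSIPP's actual model), the snippet below sums hypothetical per-participant benefits from the three perspectives and compares them with programme cost:

```python
def total_state_view(participant, taxpayer, other, cost):
    """Return (net benefit, benefit-cost ratio) per participant."""
    total_benefits = participant + taxpayer + other  # sum of the three perspectives
    return total_benefits - cost, total_benefits / cost

# All dollar figures below are invented for illustration.
net, ratio = total_state_view(participant=6000, taxpayer=4500, other=1500, cost=9000)
print(f"Net benefit: ${net:,.0f} per participant; benefit-cost ratio: {ratio:.2f}")
```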

Step 3: Impacts on State-wide outcomes

The findings from Step 1 and Step 2 are then used to prepare 'Consumer Reports' of what works and what does not, ranked by benefit-cost estimates. In Step 3, assessing the degree to which a portfolio of policies is likely to affect big-picture state-wide outcomes, such as crime or high school graduation rates, produces a ranking of public options that reveals the pros and cons of different options.6
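At its simplest, a 'Consumer Reports' style ranking of this kind reduces to sorting policy options by their benefit-cost estimates. The programs and figures below are hypothetical:

```python
# Hypothetical benefit-per-dollar estimates carried over from Step 2.
programs = {
    "Program A": 2.88,
    "Program B": 1.10,
    "Program C": 0.70,
}

# Rank options from best to worst return per dollar spent.
for name, bc in sorted(programs.items(), key=lambda kv: kv[1], reverse=True):
    verdict = "benefits exceed costs" if bc > 1 else "costs exceed benefits"
    print(f"{name}: ${bc:.2f} of benefit per dollar spent ({verdict})")
```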

Step 4: Assessing risk

The final analytical step involves testing the robustness of the findings. Considerable uncertainty can exist in any estimate of benefits and costs; therefore it is important to understand how conclusions might change when assumptions alter. To test risk, WSIPP performs a Monte Carlo simulation7 in which the key factors used in the calculation are varied to re-estimate the results of the analysis. The purpose is to test the riskiness of decision making: to determine the probability that costs would outweigh benefits if a particular policy were adopted.
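The sketch below is a toy version of such a Monte Carlo risk test, written under assumed (purely illustrative) distributions for the key inputs; it re-estimates net benefits many times and reports the share of runs in which costs outweigh benefits:

```python
import random

def simulate_net_benefit():
    # Each run re-draws the key factors from assumed (illustrative) distributions.
    effect = random.gauss(-0.12, 0.04)               # pooled effect size (uncertain)
    value_per_unit = random.gauss(100_000, 20_000)   # $ value of a one-unit outcome change
    cost = random.gauss(9_000, 1_500)                # program cost per participant
    return (-effect) * value_per_unit - cost         # monetised benefit minus cost

random.seed(1)  # reproducible illustration
runs = [simulate_net_benefit() for _ in range(100_000)]
p_loss = sum(r < 0 for r in runs) / len(runs)
print(f"Probability that costs outweigh benefits: {p_loss:.1%}")
```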


The cost-benefit model has been used to inform policy decisions for over 13 years and can be adapted for a variety of policy areas.8

5. Dissemination of findings

The Institute produces evaluation reports covering the 4-step model outlined above to show what does and doesn't work, and the rate of return on taxpayer investments. An overview of two evaluations is provided below:

1. Benefits and costs of prevention and early intervention programs for youth9

This evaluation set out to investigate whether prevention pays; specifically, whether for each dollar the State spends there will be more than one dollar's worth of benefits generated. The programmes included in the evaluation span education, youth development, child welfare, mentoring and youth substance abuse prevention programmes, and were analysed with monetary values assigned to any observed changes in education, crime, substance abuse, child abuse, teen pregnancy and public assistance outcomes. The report concluded that there is credible evidence that well-implemented programs can achieve significantly more benefit than cost. The programmes identified included Nurse Family Partnership for low-income women, which was found to generate $2.88 per dollar spent, and Project Towards No Tobacco Use, which generates $55.84 per dollar spent.10

2. To reduce future prison construction, criminal justice costs and crime rates11

In 2006 it was forecast that Washington State would need two new prisons by 2020, and possibly a third by 2030. Since a new prison costs around $250 million to build and a further $45 million a year to operate, the Washington State Legislature expressed an interest in identifying alternative evidence-based options that could reduce the future need for prison beds, save money for state and local taxpayers and contribute to lower crime rates. To address this, the Institute conducted a systematic review of 571 rigorous comparison group evaluations of adult corrections, juvenile corrections and prevention programmes, most of which were operating in the US, to estimate the benefits and costs of evidence-based options. Programmes evaluated include vocational education in prisons, Teen Courts, and Adult Drug Courts. The evaluation concluded that if Washington successfully implemented a moderate-to-aggressive portfolio of evidence-based options, involving expansion of programmes to reach between 20 and 40 per cent of the remaining eligible population, then a significant level of future prison costs could be avoided, with taxpayers saving about $2 billion and crime rates reduced.

6. International influence: Transferring WSIPP's Model to the UK

The Institute is currently working with the UK-based Dartington Social Research Unit (SRU)12 to transfer the Washington State Institute's Economic Model to the UK. SRU will translate the model into the UK context by replacing the US data in the model with UK inputs. The first results will be presented as a cost-benefit analysis of interventions selected in policy areas relevant to child well-being. The programme, supported in the UK by Birmingham and Manchester City Councils, and by the Greater London Authority, will result in software available to all local and health authorities that predicts the costs and benefits over a child's lifetime of competing investments in services.

Following the translation of the model into child well-being policy and interventions, the full model will then be translated into ten further policy areas affecting both adults and children.13 The final software package will be freely available and enable both central and local government commissioners to generate custom reports based on the resources and needs in their area, and calculate the costs and benefits of competing investment options. As well as showing what works, the software is also planned to reveal harmful, and therefore expensive, policies and programmes.14

The software is due to be ready from January 2012 onwards. Whilst the software is being developed, the Social Research Unit is using existing data in the policy areas of youth justice and child welfare to respond to immediate requests for investment advice.


The Social Research Unit cited three reasons for selecting and developing the WSIPP model:15

1. It is cautious in its estimates of potential savings.

2. It can be consistently applied across a range of policy areas.

3. It has been proven to influence major policy decisions, including the shifting of resources from prisons to prevention.

    Endnotes

1. Washington State Institute for Public Policy (2010) Return on (Taxpayer) Investment: Evidence-Based Options to Improve Statewide Outcomes. Olympia, WA: Washington State Institute for Public Policy.

2. For further information about the Institute, see: www.wsipp.wa.gov

3. Aos, S., Barnoski, R. and Lieb, R. (1998) Watching the Bottom Line: Cost-Effective Interventions for Reducing Crime in Washington. Olympia, WA: Washington State Institute for Public Policy (98-01-1201).

4. Barnoski, R. (2004) Outcome Evaluation of Washington State's Research-Based Programs for Juvenile Offenders. Olympia, WA: Washington State Institute for Public Policy.

5. For further details on the Washington Legislature process see: www.leg.wa.gov/legislature/Pages/Overview.aspx

6. The Social Research Unit (2011) An Economic Model to Inform Investment Decisions made by Central and Local Government. Dartington: The Social Research Unit.

7. There is no single approach. Instead, Monte Carlo simulations (or experiments) involve algorithms that rely upon repeated random sampling to compute results, typically when it is infeasible to compute an exact result with a deterministic algorithm. See Hubbard, D. (2007) How to Measure Anything: Finding the Value of Intangibles in Business. New Jersey: John Wiley & Sons.

8. Graham Allen MP made repeated references to the model in his review of early intervention. Allen, G. (2011) Early Intervention: The Next Steps. An independent report to Her Majesty's Government. London: Cabinet Office.

9. Washington State Institute for Public Policy (2004) Benefits and Costs of Prevention and Early Intervention Programs for Youth. Olympia, WA: Washington State Institute for Public Policy.

10. In 2003 US dollars.

11. Washington State Institute for Public Policy (2006) Evidence-Based Public Policy Options to Reduce Future Prison Construction, Criminal Justice Costs and Crime Rates. Olympia, WA: Washington State Institute for Public Policy.

    12. See: www.dartington.org.uk

13. The Social Research Unit (2011) An Economic Model to Inform Investment Decisions made by Central and Local Government. Dartington: The Social Research Unit.

14. The Social Research Unit (2011) Unit's Econometric Model Gets Praise. Available at: www.dartington.org.uk/projects

15. The Social Research Unit (2011) An Economic Model to Inform Investment Decisions made by Central and Local Government. Dartington: The Social Research Unit.


Part 4: Using impact evaluations to inform policy

Iqbal Dhaliwal, Global Director of Policy, and Caitlin Tulloch, Policy Analyst, J-PAL, Department of Economics, Massachusetts Institute of Technology (MIT)

1. Introduction

A. Mission

Billions of dollars are spent every year on development policies and programs, but there is relatively little rigorous evidence on the true impact these programs have on the lives of the poor. Rigorous scientific evidence on what programs or policies work is hard to come by, in part because it is so difficult to attribute changes in people's lives to a program, rather than other external factors. This scarcity of rigorous evidence on program impact, and the technical language in which the little evidence that does exist is presented, makes it inaccessible to policymakers, many of whom then rely on intuition and anecdotal evidence in deciding which programs to fund and implement.

The Abdul Latif Jameel Poverty Action Lab (J-PAL) is a network of 54 affiliated professors who use randomized evaluations to answer questions critical to poverty alleviation. With offices on five continents, J-PAL works to reduce poverty by ensuring that policy is based on scientific evidence.

Besides undertaking research, J-PAL also helps build the capacity of policymakers to evaluate the impacts of programs, and widely disseminates the results of field research to policymakers, to ensure that policy is based on scientific evidence and effective programs are replicated and scaled up.

B. J-PAL's history

The Poverty Action Lab was founded in 2003 by MIT professors Abhijit Banerjee, Esther Duflo, and Sendhil Mullainathan. The concept behind J-PAL was not only to facilitate a larger number of evaluations by affiliated professors and partners, but also to exponentially increase their impact by promoting the sharing of methods and results. In 2003, the NGO Development Innovations, run by affiliate Dean Karlan, changed its name to Innovations for Poverty Action (IPA) and began an ongoing partnership with J-PAL. J-PAL offices are located at universities and provide access to academic resources, while IPA country offices are not based at universities. In 2005, J-PAL had affiliated professors at only four universities running 20 evaluations. Since then the number of affiliates has grown to 54, and the number of projects has increased to more than 250.

C. Structure of J-PAL

There are two main components of J-PAL: the network of affiliated professors, and the staff who support their work at J-PAL offices. Affiliated professors are based at universities around the world, where they pursue their own research agendas in diverse areas of development including education, health, finance, environment, governance, and agriculture. In their field projects, researchers evaluate programs and policies that are being implemented by governments, NGOs, international development organizations, and foundations. Contrary to commonly held perceptions that evaluations are forensic fact-finding exercises, J-PAL's affiliates often work closely with the implementing organizations to design new and innovative approaches informed by economic theory and past research findings, to help solve intractable problems in development. Researchers bring a high degree of rigor, objectivity, and independence to the measurement of program impact, and many of these evaluations go on to be published in top peer-reviewed academic journals like the Quarterly Journal of Economics and The American Economic Review.


J-PAL's regional offices are centers within universities in Africa (University of Cape Town), Europe (the Paris School of Economics), Latin America (Pontificia Universidad Catolica in Chile), and South Asia (the Institute for Financial Management and Research), while the headquarters is housed in the Economics Department of the Massachusetts Institute of Technology. These offices perform three main functions: research assistance and management support for local projects, training and capacity building for local policymakers in impact evaluations, and policy outreach to regional organizations about evidence-based policy and important research results. Regional offices are headed by Executive Directors, and technical leadership is provided by Scientific Directors, usually researchers who work intensively in that region.

J-PAL's seven research programs in agriculture, education, environment and energy, finance, health, labor markets, and political economy and governance provide leadership and direction to research and policy outreach in their respective areas. These programs are led by co-chairs: J-PAL affiliates whose work focuses on that particular area, and who provide intellectual leadership for research and outreach within that theme. The programs' activities include performing comprehensive reviews of the literature to identify what researchers have found to work in development policy and what important research and policy questions remain. Co-chairs may also raise grants to fund more research on those identified questions, provide guidance for cost-effectiveness analysis, or advise on any question where a deep understanding of the literature is necessary. Together, the executive directors and scientific directors of the regional offices and the co-chairs of the seven programs constitute the J-PAL Board of Directors. This Board meets annually to discuss common issues faced by J-PAL affiliates and staff working in all regions and thematic areas, and to define J-PAL's broad strategy. Ongoing supervision, oversight, and strategic guidance to J-PAL staff worldwide is provided by the Board's Executive Committee, which consists of four permanent members (J-PAL's three Directors and the Global Director of Policy) and three rotating members from among the affiliates on the Board.

    D. Policy outreach

In 2009, J-PAL created a dedicated Policy Group that works to create new materials for policy outreach, disseminates knowledge about what works to foundations, NGOs, international development organizations and governments, and works with these organizations to scale up programs that have been found to be effective by J-PAL research. There are many channels to inform policy effectively, and J-PAL's Policy Group focuses on a few of these methods, as described below. This does not mean that these are the only ways to influence policy, or necessarily the most effective, but they are the ones that J-PAL believes are best suited to its mission of ensuring that policy is informed by research, its organizational structure, and the countries in which it works.

2. Generating evidence to influence policy

    A. Using randomized evaluations to generate rigorous evidence

Evaluations that accurately measure the impact of a program on target participants can provide policymakers with very useful evidence in deciding which programs to fund or discontinue. This is done by comparing the outcome of interest (e.g. test scores in schools) of beneficiaries who received a program (the 'treatment' group) to another group (the 'comparison' group) that is similar in all respects except that it did not receive the program (e.g. free textbooks for students). Measuring outcomes in this comparison group is as close as one can get to measuring how program participants would have fared without the program. Therefore, we can only trust an estimate of impact when we believe that the treatment and comparison groups were equivalent at the start of the program.

There are many methods of creating a comparable comparison group, but some methods do a better job than others. Randomized evaluations are often considered the gold standard of impact evaluations because they generate a comparison group that is statistically identical to the treatment group, by randomly assigning the targeted population into the treatment and comparison groups. Random assignment prevents possible differences between the treatment and comparison groups, which could arise if, for example, a certain type of person chose to opt into a program, or if either the treatment or comparison groups had very specific unobservable characteristics that systematically differentiated them from each other. This random assignment is the same methodology used in clinical trials to determine who gets a drug versus a placebo when testing the effectiveness of new medicines. Because members of the treatment and comparison groups do not differ systematically at the outset of the experiment, in both observable and unobservable characteristics, any difference that subsequently arises between them can be attributed to the program rather than to other factors.
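As a minimal sketch of this logic (simulated, hypothetical data, not any particular J-PAL study), the following randomly assigns a target population to treatment and comparison groups and estimates impact as the difference in mean outcomes:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# 1,000 hypothetical eligible students, randomly split 50/50.
population = list(range(1000))
random.shuffle(population)
treatment, comparison = population[:500], population[500:]

def observed_outcome(treated):
    # Stand-in for a measured outcome such as a test score; the program is
    # simulated here as adding 5 points on average for the treatment group.
    return random.gauss(60, 10) + (5 if treated else 0)

t_scores = [observed_outcome(True) for _ in treatment]
c_scores = [observed_outcome(False) for _ in comparison]

# Because assignment was random, the difference in means can be attributed
# to the program rather than to pre-existing differences between the groups.
impact = statistics.mean(t_scores) - statistics.mean(c_scores)
print(f"Estimated program impact: {impact:.1f} points")
```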

This randomized method of evaluation helps researchers and policymakers identify programs that work and those that do not, so that effective programs can be promoted and ineffective ones can be discontinued. In this way, evidence can help policymakers get a bigger effect for their development spending, and this in turn may lead to greater support and funding commitments for programs that have been evaluated and found to be effective.

    B. Promoting policy-relevant evaluations

The randomized evaluation methodology has long been used in medical trials as well as social programs in the United States, but an important feature of work by J-PAL affiliates is that it addresses key questions that are relevant for the selection and implementation of effective anti-poverty policies in the field. In many cases this means testing a program, or a variation of a program, which could be scaled up to large numbers of users, but it could just as well mean examining some psychological or behavioral mechanism that influences the lives of the poor. J-PAL affiliates set their own research agendas, which are often driven by issues of interest to policymakers.

In addition, J-PAL tries to promote active dialogue between researchers and policymakers to identify the issues that are considered most pressing for both. J-PAL hosts 'matchmaking' conferences to provide opportunities for relationships to be built between researchers and implementing organizations with programs (or questions) that they need evaluated. J-PAL staff identify organizations engaged in an area of work from around the world, and talk to them about their research priorities and programs that they need evaluated. Staff spend extensive time screening out organizations that do not have specific research questions or would not be able to adopt the randomized methodology, and then invite the remaining group to a conference. Simultaneously, researchers whose work focuses on related questions are invited to attend and give presentations on their research interests and past partnerships. At the end of the conference, researchers and policymakers are able to share their priorities for new research and find common questions from which new evaluations can proceed.

Another way that J-PAL helps bring the policy perspective into research is through the creation of special initiatives in focus areas, like the adoption of agricultural technologies or ways to reduce corruption in public programs. These initiatives are driven by funding organizations, policymakers, and J-PAL affiliates identifying key areas for future research, and inviting proposals from researchers working with field partners to propose evaluations that attempt to answer these questions. To encourage dialogue about the most pressing policy questions facing a Latin American country, J-PAL helped set up a commission of academics and policymakers. This commission met several times to present and discuss various policy questions, and collaboratively designed an evaluation to examine solutions to these problems, which the government committed to fund (both the program and the evaluation). Similarly, J-PAL affiliates have been involved with the creation of evaluation funds and units within state and national governments, to build the capacity (both intellectual and financial) of countries to evaluate their own pressing questions.

C. Partnering in the field to improve program design

The close interaction between researchers and implementing partners that takes place during the evaluation process is another key means of influencing policy. The structure of such a partnership can affect the way in which evidence can subsequently be used, because it significantly helps determine the design of the program being examined and the scale at which it is evaluated. Often people assume that program evaluations are a forensic, ex post evaluation of whether a program worked or not, but a typical J-PAL evaluation begins long before a program is implemented. Researchers and implementing organizations will have an extensive discussion of the underlying problem (e.g. unauthorized absenteeism among government healthcare workers) and various possible solutions (e.g. higher salaries or better monitoring), along with the associated theory of change (e.g. the role of incentives versus penalties in changing behavior and intrinsic motivation).


As a part of this process, researchers may share the results of previous evaluations, and work with implementers to try to identify promising interventions to test. Early small-scale pilots of such interventions can further improve program design by gathering input from the field, even before the program is launched. Once a program starts, multiple variations of the treatment can help policymakers understand which components of the program work best, while qualitative surveys and diaries provide continuous feedback on why some components of the program worked better than others. In this way, evaluations build partnerships in the field that can have a profound impact on program design, help organizations question the rationale behind every step of their processes, and become better at rigorously measuring the impact of their programs.
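To make the mechanics of such a multi-arm design concrete, here is a minimal sketch of how units might be randomly assigned across several treatment variations. The arm labels, unit names, and group sizes are illustrative assumptions, not taken from any actual J-PAL study.

    import random

    def assign_arms(unit_ids, arms, seed=42):
        """Randomly assign units (e.g. clinics or villages) to treatment arms
        in near-equal proportions, so that comparisons across arms identify
        the effect of each program variation."""
        rng = random.Random(seed)  # a fixed seed makes the assignment reproducible and auditable
        shuffled = list(unit_ids)
        rng.shuffle(shuffled)
        # Deal shuffled units out to the arms in turn, like dealing cards
        return {unit: arms[i % len(arms)] for i, unit in enumerate(shuffled)}

    # Hypothetical arms for a healthcare-worker absenteeism program
    arms = ["control", "higher_salary", "monitoring", "salary_plus_monitoring"]
    assignment = assign_arms(["clinic_%d" % n for n in range(120)], arms)

With 120 units and four arms, each variation receives 30 units, so each component of the program can be compared against the control and against the other variations.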

While partnering on a randomized evaluation always involves significant ex ante interaction with the implementer, these partnerships can take many forms. It can be advantageous to partner with governments because they offer the chance to implement programs on a large scale in a sustainable way, but it may be harder to convince politicians and civil servants to implement new programs that may not be well aligned with their interests, or to rigorously evaluate others that may have large potential for benefiting their favored constituencies.

Work with governments can also involve long and cumbersome approval processes and the risk of programs being orphaned and discontinued when key civil servants are transferred or politicians are voted out. On the other hand, partnerships with NGOs may limit the program scale, but they can be faster and more flexible in implementing new approaches and offer very dedicated staff. Large foundations tend to focus on one particular issue, such as health or education, but they offer highly professional staff and possible funding for scale-ups of successful programs. And international development agencies, though they often have a narrower geographic focus corresponding to their home country's strategic interests, can bring significant funding for the program and offer the chance to implement at a large scale.

3. Using evidence to influence policy

    A. Promoting evidence-based policy

J-PAL's Policy Group spends a significant amount of time communicating with funders and implementers about the need for evidence-based policy, stimulating demand by explaining how evidence can be incorporated into the policymaking process, as well as the pros and cons of the various kinds of evidence produced by different types of evaluations. In many cases this can serve as an introduction to the presentation of specific evidence from field projects, but in others it can mean providing policymakers the tools to evaluate their own programs and to become more informed consumers of evidence.

    B. Making research accessible

One of the most effective ways to promote evidence-based policy is by communicating the results from field evaluations around the world. However, the results of impact evaluations are most often presented in academic working papers or journals, and these papers tend to be written in a very technical way which can limit their potential audience among policymakers. One of the primary responsibilities of the J-PAL Policy Group is to make research more accessible by extracting the most compelling results from longer papers and reports and presenting them in non-technical language. There are more than 250 ongoing and completed evaluations by J-PAL affiliates, and another 100 or more are in the process of being launched.

For all of these evaluations conducted by J-PAL researchers, the Policy Group creates an evaluation summary: a two-page synopsis of the relevant policy questions, the program being evaluated, and the results of the evaluation.1 These summaries are targeted at a non-academic audience, while still presenting all relevant information. They clearly outline testable research questions, explain the context and design of an evaluation, and present the results as clearly as possible, including the implications these research results have for identifying which policies and programs are effective. These evaluation summaries are made available on the J-PAL website, and are particularly useful for putting together reports of the current research on particular topics: for example, all J-PAL evaluations examining business training for micro-entrepreneurs, or all projects looking at improving immunization rates.


Sometimes an evaluation is particularly compelling and policy-relevant, so J-PAL creates an expanded publication about the project. These briefcases, around six pages in length, provide a longer summary of the project and allow outreach to a larger audience due to their print run. They expand upon the policy questions that the evaluation addresses, and provide more detail on the program being evaluated and how the evaluation was designed.2 These longer publications also provide additional information on the context in which the program was implemented and a discussion of how the results can be extrapolated to other contexts.

C. Synthesizing information from multiple evaluations

In addition to communicating the results of academic evaluations to a non-technical audience, the Policy Group also synthesizes the results from multiple evaluations to draw out general lessons about what works in a given area of anti-poverty policy. This kind of analysis takes two main forms: cost-effectiveness analyses and longer printed publications called bulletins.3

One way to analyze results from multiple evaluations examining the same policy question or program is to combine them in a cost-effectiveness analysis. In simple terms, a cost-effectiveness analysis calculates the ratio of the amount of effect a program achieves to a given amount of cost incurred, or, conversely, the amount of cost required to achieve a given impact. This ratio, when calculated for a number of alternative programs addressing the same policy goal, conveys the relative impacts and costs of these alternative programs in an easy and intuitive way.4 However, relatively few studies published in academic journals include cost data on the programs they are evaluating, and what data is available is presented in a wide variety of formats that do not allow for easy comparison between programs.

J-PAL's Policy Group collects results and cost data from programs which aim to achieve one policy goal (for example, reducing diarrheal disease in children), and provides cost-effectiveness analyses as a tool to inform policymakers about these different programs. One challenge in this analysis is to strike the right balance in the trade-off between a format which is easy and intuitive to understand, and a representation of the programs which reflects all of the nuances of their implementation and results.
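To illustrate the arithmetic behind such a comparison, the sketch below computes cost per unit of impact, and impact per $1,000 spent, for a handful of hypothetical programs pursuing the same goal. The program names and figures are invented for the example and are not J-PAL results.

    # Minimal cost-effectiveness comparison; all figures are hypothetical.
    # Each entry records total cases of disease averted and total program cost (USD).
    programs = {
        "chlorine_dispensers": {"cases_averted": 1200, "cost": 30000},
        "handwashing_promotion": {"cases_averted": 450, "cost": 27000},
        "piped_water_subsidy": {"cases_averted": 900, "cost": 90000},
    }

    # Rank programs from most to least cost-effective (lowest cost per case first)
    for name, p in sorted(programs.items(),
                          key=lambda kv: kv[1]["cost"] / kv[1]["cases_averted"]):
        cost_per_case = p["cost"] / p["cases_averted"]          # cost to avert one case
        cases_per_1000 = p["cases_averted"] / p["cost"] * 1000  # impact per $1,000 spent
        print("%s: $%.2f per case averted, %.1f cases per $1,000"
              % (name, cost_per_case, cases_per_1000))

Computing both ratios from the same data mirrors the point above: the comparison can be framed either as cost per unit of impact or as impact per unit of cost, whichever is more intuitive for the decision at hand.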

The results of these cost-effectiveness analyses, as well as syntheses of evaluations which cannot necessarily be quantified in cost-effectiveness terms, are presented in longer print publications called bulletins. For example, the Policy Group is currently working on a bulletin synthesizing the results from ten evaluations which examine the effects of charging a price for preventive health products and other goods that have social benefits. From the results of these evaluations, J-PAL finds that distributing products for a nominal fee significantly reduces take-up relative to free distribution, and does not appear to generate significant revenue for implementing organizations. Bulletins include information on each of the evaluations they cover, and also present general lessons that can be drawn from the collective body of evidence.

These bulletins, while longer than an evaluation summary or briefcase, are still targeted at a policy audience and written in accessible language. Because they tend to cover evaluations of programs from multiple contexts in which multiple variations have been tested, they are also able to provide more information on the sensitivity of this type of program to different contexts and assumptions, such as population density or the cost of program inputs.

    D. Disseminating evidence

In addition to creating individual evaluation summaries and synthesizing multiple results for a policy audience, the Policy Group also reaches out to the organizations and policymakers best placed to improve development programs and policies on the basis of rigorous evidence. One of the main tools for such evidence dissemination is the J-PAL website, where all evaluation summaries, printed publications, and cost-effectiveness analyses can be found. This website, currently available in English and Spanish and shortly to be available in French, is intended to be a clearing house for the evidence generated by J-PAL researchers, and to provide the policy lessons that have been derived from that evidence base.

J-PAL also hosts outreach conferences to communicate the results of its affiliates' evaluations. These conferences, generally organized around a particular geographic region or research theme, target funders and decision makers who are in a position to use that evidence to effect policy changes. A recent example of such a conference was held in Bihar state in India, to communicate research results that could inform policy in that state. The conference consisted of presentations by researchers and the organizations with whom they had partnered to evaluate programs, and focused on issues as diverse as informational campaigns to improve voter knowledge and de-worming schoolchildren to reduce absenteeism. In this way it served two purposes: to explain pertinent results from field evaluations, and to provide examples of how researchers and policymakers can collaborate to generate useful lessons to improve future decisions.

E. Scale-ups

J-PAL also works with implementing agencies like governments, NGOs, foundations, and international development organizations that may be interested in replicating or scaling up a particular program found to be effective by a randomized evaluation. In such cases, J-PAL works in close partnership with the implementing organization to understand the underlying problem and context-specific factors like infrastructure, the severity of the problem, and the capacity of implementing agencies. If this study reveals that the program has the potential to be successfully replicated in the new context, J-PAL works with the implementing organization to design a policy pilot that introduces the program in a small area. The aim of the pilot is not only to measure the size of the impact of the program in this new context, but also to test the delivery mechanism for the program using the new implementing agency's resources. Such a pilot provides useful information to help policymakers decide whether or not to scale up the program, and if so, what challenges and issues they may need to consider to ensure a successful launch.

    4. Conclusion

There are many ways to effectively influence policy, and J-PAL does not believe that there is just one right way of doing so. J-PAL's strategy is to inform development and social policy in countries around the world by first using randomized evaluations to generate evidence on the impact of the programs and policies that are most critical for development practitioners, and then effectively disseminating this evidence among policymakers. This paper provides an overview of J-PAL's policy strategy; in a more detailed paper to be issued later, we hope to illustrate our approach with actual examples from our work.

    Endnotes

    1. For an example, see: www.povertyactionlab.org/evaluation/primary-school-deworming-kenya

2. For an example, see: www.povertyactionlab.org/sites/default/files/publications/2011.3.4_GPI_Global.pdf

3. For an example, see: www.povertyactionlab.org/sites/default/files/publications/pissuetwo.pdf

    4. For an example, see: www.povertyactionlab.org/policy-lessons/health/child-diarrhea


Part 5: RAND's Promising Practices Network
Rebecca Kilburn, RAND Corporation

    1. Introduction

This document summarizes RAND's Promising Practices Network on Children, Families and Communities. The RAND Corporation operates the Promising Practices Network (PPN), a project that presents evidence-based information on what works to improve outcomes for children and families in the United States. The main goal of this paper is to outline lessons drawn from PPN in order to help inform the exploration of how best to strengthen the evidence base for UK social policy.

    2. Project overview and genesis

PPN1 provides information via the Internet on programs and practices that credible research indicates are effective in improving outcomes for children, youth, and families. The information on the PPN website pertains to children from the prenatal period to age 18, as well as the families and communities in which they live. The site provides useful information to decision makers, practitioners, and program funders who must choose among many possibilities for improving results for children, youth, and families.

PPN was founded in 1998 by four state-level intermediary organizations: the California Foundation Consortium, the Colorado Foundation for Families and Children, the Georgia Family Connection Partnership (formerly the Georgia Academy), and Missouri's Family and Community Trust. These organizations started PPN because they recognized the value of providing better access to evidence-based information on improving outcomes for children and families, and they saw an opportunity to realize efficiencies by pooling resources to undertake this common activity. The RAND Corporation joined PPN as the operating partner in October 2000. RAND is now responsible for all operations, fundraising and marketing. The original Governing Board for the project included RAND plus the four founding partners. This has been replaced by a Board of Advisors consisting of a representative from each of the Network members.

3. Mission and distinguishing features

With the input of the PPN Board of Advisors, RAND developed the following PPN mission:

To be the premier source of evidence-based information used by those working to improve outcomes for children and families.


These are some features of PPN that distinguish it from other websites and best practices sites:

• Targets policymakers, practitioners, and other decision makers rather than only parents or researchers.

• Includes only evidence-based information screened for objectivity and scientific credibility.

• Provides information from the full range of pertinent sources, not just from the organizations sponsoring the site.

• Offers information in a form that the intended users find useful and easy to understand.

• Covers the breadth of topics related to children, not just one issue such as violence or health.

• Provides free use to anyone with access to the Internet rather than only being available to members.

    4. Audience and products

PPN's primary product is the website: www.promisingpractices.net. The website is user-friendly, focusing on providing brief, easy-to-understand information for busy non-researchers. The primary audience for the website includes policymakers, practitioners, the media, and other decision makers who may not have research training. Data indicate that secondary audiences also use the site, such as scholars and professors teaching courses.

The site content is organized into five main areas:

• Proven and Promising Programs. Provides brief descriptions of programs that have evidence showing they improve outcomes for children and families. RAND teams write these descriptions expressly for PPN. Hundreds of programs have been reviewed in detail for inclusion on the site, and we continue to review programs on an ongoing basis.

• Research in Brief. Provides links to hundreds of easy-to-understand summaries of reliable research information from a broad range of sources such as the National Center for Education Statistics, Child Trends, the Office of Juvenile Justice and Delinquency Prevention, the Urban Institute, the journal Pediatrics, and dozens of other sources. Information in this section is also organized by outcome area and is continually updated.

• Resources and Tools. This section includes links to databases, fact sheets, screening tools, seminal reports, and a variety of other resources that are among the best research-based materials available on children and families. The PPN team reviews material for this new section on a continuous basis, and we welcome recommendations for additional resources.

• Expert Perspectives. Leading scientists provide insights into pressing policy issues via multimedia such as videos and archived webinars, answers to users' questions from our Ask the Expert feature, or other products.

• Partner Pages. Links to information on PPN and other resources related to hot topics in child and family policy in that state or organization. This feature is only available to organizations that are members of the Network. The member organizations select the hot topics, which are updated on a monthly basis.

The content in the major sections is organized in several ways to facilitate users finding what they need. Much of the information is organized by broad Outcome Areas (Healthy and Safe Children, Children Ready for School, Children Succeeding in School, and Strong Families), 13 Indicators (such as Children not using tobacco, alcohol or illegal drugs, and Students graduating from high school), and four types of Topic categories (Age of Child, Type of Outcome Improved, Type of Setting, and Type of Service).


Another product associated with the website is the monthly e-mail newsletter, which is sent to individuals who sign up. The newsletter announces content that has been added to the site since the last newsletter.

    5. Review criteria

All content posted on the site must meet pre-established evidence criteria, and two PPN team members and at least one outside reviewer (who can be a RAND or non-RAND researcher) must agree that the research meets these criteria. All information listed on the site must meet the Promising designation at a minimum, and programs listed in the Programs that Work section can be designated either Promising or Proven, where the latter designation is reserved for programs with evaluations meeting the highest standards of rigor. The primary review criteria are displayed in Table 2. Note that the complete review criteria include about 20 additional issues that reviewers consider, such as whether the evaluator was also the program developer and the rate of attrition in longitudinal studies.

    6. Use and marketing

PPN engages in the following ongoing marketing activities:

• Issuing a monthly e-mail newsletter, which reminds subscribers to visit the site and usually results in new users as the newsletter gets forwarded to others.

• Distributing hard-copy materials at conferences around the US and through ad hoc direct mailings.

• Face-to-face meetings.

• Optimizing the site so as to maximize hits from search engines (e.g. Google, Yahoo).


Table 2: Primary PPN Evaluation Criteria

The table compares three designations. Not Listed on Site: if a program meets any of the listed conditions, it will not be listed on the site. Promising Program: a program must meet all of the listed criteria to be listed as Promising. Proven Program: a program must meet all of the listed criteria to be listed as Proven.

Type of Outcomes Affected
• Not Listed: Program impacts an outcome that is not related to children or their families, or for which there is little or no evidence that it is related to a PPN benchmark (such as the number of applications for teaching positions).
• Promising: Program may impact an intermediary outcome for which there is evidence that it is associated with one of the PPN benchmarks.
• Proven: Program must directly impact one of the benchmarks used on the site.

Substantial Effect Size
• Not Listed: No outcome is changed more than 1 per cent.
• Promising: Change in outcome is more than 1 per cent.
• Proven: At least one outcome is changed by 20 per cent, 0.25 standard deviations, or more.

Statistical Significance
• Not Listed: No outcome change is significant at less than the 10 per cent level.
• Promising: Outcome change is significant at the 10 per cent level.
• Proven: At least one outcome with a substantial effect size is statistically significant at the 5 per cent level.

Comparison Groups
• Not Listed: Study does not use a convincing comparison group; for example, it relies only on before-and-after comparisons of the treatment group.
• Promising: Study has a comparison group, but it may exhibit some weaknesses, e.g., the groups lack comparability on pre-existing variables or the analysis does not employ appropriate statistical controls.
• Proven: Study design uses a convincing comparison group to identify program impacts, including randomized controlled trials (control-experimental design) or some quasi-experimental designs.

Sample Size
• Not Listed: Program evaluation includes fewer than ten individuals in the treatment or comparison sample.
• Promising: Sample size of evaluation has more than ten individuals in both treatment and comparison groups.
• Proven: Sample size of evaluation exceeds 30 individuals in both treatment and comparison groups.

Availability of Program Evaluation Documentation
• Not Listed: Distribution is restricted, for example only to the sponsor of the evaluation.
• Promising: Must be publicly available.
• Proven: Must be publicly available.
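To show how these primary thresholds combine in practice, here is a minimal sketch that applies the Table 2 criteria to a hypothetical evaluation record. The field names and the example record are invented for illustration; the actual PPN review also weighs the roughly 20 additional issues noted above, which this sketch omits.

    def ppn_designation(ev):
        """Classify a hypothetical evaluation record against the primary
        Table 2 criteria. Returns 'Proven', 'Promising', or 'Not Listed'."""
        if not ev["publicly_available"]:
            return "Not Listed"  # restricted distribution fails the availability criterion
        proven = (ev["impacts_benchmark_directly"]
                  and (ev["pct_change"] >= 20 or ev["effect_size_sd"] >= 0.25)
                  and ev["p_value"] <= 0.05
                  and ev["convincing_comparison_group"]
                  and ev["min_group_n"] > 30)
        if proven:
            return "Proven"
        promising = (ev["related_to_benchmark"]
                     and ev["pct_change"] > 1
                     and ev["p_value"] <= 0.10
                     and ev["has_comparison_group"]
                     and ev["min_group_n"] > 10)
        return "Promising" if promising else "Not Listed"

    # Invented example: a large, well-identified study with a 25 per cent improvement
    study = {"publicly_available": True, "impacts_benchmark_directly": True,
             "related_to_benchmark": True, "pct_change": 25, "effect_size_sd": 0.30,
             "p_value": 0.03, "convincing_comparison_group": True,
             "has_comparison_group": True, "min_group_n": 45}
    print(ppn_designation(study))  # prints: Proven

Note how the designations nest: a study that misses any Proven criterion is then tested against the weaker Promising thresholds before being excluded altogether.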

    Endnotes

    1. See: www.promisingpractices.net


Part 6: The Obama Administration's evidence-based social policy initiatives: An overview
Ron Haskins, Advisory Board, and Jon Baron, President, Coalition for Evidence-Based Policy

    Summary

This paper outlines the Obama Administration's plan to strengthen the evidence base for US social policy.

The Obama Administration has created the most expansive opportunity for rigorous evidence to influence social policy in the history of the US government.1 No president or budget director for a president has ever been so intent on using evidence to shape decisions about the funding of social programs as President Obama and former Budget Director Orszag.2 The Obama plan to create evidence-based social policy initiatives turns the normal relationship between policy decision making and the use of social science evidence on its head. Instead of evidence being on the outside of the decision making process trying to get in, Obama brings evidence inside from the beginning. The Administration must still convince others that the use of evidence will improve policymaking and program outcomes, but the argument that evidence deserves a prime role in policymaking is being made by people inside the Administration, who are arguing to retain an evidence-based approach as a fundamental part of the President's legislative agenda rather than fighting from the outside to insert evidence-based policies into the decision making process.3 Although less emphasized, the Obama plan for basing program decisions on rigorous evidence can be useful for cutting spending as well as funding new programs.

Even as early as his inaugural address, the President made it clear that an important goal of his Administration would be to expand programs that work and eliminate programs that don't.4 Based on interviews with Administration officials and advocates, it is clear that from the earliest days of the Administration senior officials at the Office of Management and Budget (OMB) were planning several initiatives to advance the use of evidence-based program models and to generate high-quality evidence on new program models. When President Obama took office, career officials at the OMB, who are often the origin of ideas for increasing government efficiency, were already involved in a formal attempt to encourage federal agencies to conduct high-quality evaluations of their programs. Building on this effort, by the end of the second year of the Obama Administration there were six evidence-based initiatives underway.5

In the current age of fiscal austerity, cuts in many social programs are almost inevitable and opportunities for new program spending will be limited. In this environment, it will be far better for the national welfare if the President and Congress cut programs that have minimal or no (or even negative) impacts rather than successful programs or programs that show promise. The nation's social programs are unlikely to be improved until we learn to enact programs based on evidence-based models, to improve existing programs based on evidence, and to shut down failing programs, again based on evidence from high-quality program evaluations. Reliable evidence on program effects can be put to good use both in expanding and in cutting programs.

But even when ineffective programs have been identified, it does not follow that the Administration or Congress will take action. According to Isabel Sawhill and Jon Baron, since 1990 there have been ten instances in which a large-scale federal social program was evaluated by a scientific research design. In nine of these ten cases (including Job Corps, Upward Bound, 21st Century Community Learning Centers, and Head Start), popular programs were shown to have modest or no impacts on their participants.6 So far, only the evaluation of Head Start has resulted in significant program changes, and even here the changes are only in the initial stages of surviving the Washington policy maze; not a single Head Start program has been directly affected yet.7

These examples show that the federal government needs to find a better way to spend money on social intervention programs. The Obama initiatives for funding social programs are the most important attempts so far to find this better way, and they could have a major impact on how social programs are funded in the future by elevating the role of program evaluations in program expansion or contraction. Moreover, if the initiatives work, the average impact of social intervention programs on the well-being of children and families will increase and the nation will be better off.

Based on several interviews with members of the Obama Administration and others inside and outside Congress knowledgeable about the Obama initiatives, we think the following outline captures the key components of the President's evidence-based initiatives (although not all the initiatives follow every component of the outline):

1. Select an important social problem that would make individual citizens and the nation better off if the problem could be successfully addressed by social policy.

2. Identify model programs addressed to the problem that have been shown by randomized trials or other rigorous research to significantly reduce the problem.

3. Obtain funds from Congress to scale up evidence-based programs of this type that attack the problem in accord with the verified models.

4. Make the funds available to government or private entities with a track record of good performance to replicate the successful model programs and to develop new model programs.

5. Continuously evaluate the projects as they are implemented to ensure they are faithfully implementing the model program and producing good results.

The Obama team at the OMB came into office with strong views on the value of rigorous program evaluation. With a team of powerful OMB officials fully committed to the value of experimental evaluations, the Obama Administration lost little time in launching its initiative to expand evidence-based social programs.8 What follows is an overview of the major characteristics and state of play of the Administration's six evidence-based initiatives. Table 3 provides an overview and comparison of the major characteristics of the six initiatives.

Home visiting. Home visiting is a service strategy to help families in one or more of three domains: maternal and child health, early childhood development, and family functioning.9 Several home-visiting model programs have been shown by random-assignment studies to produce significant impacts on a variety of parenting behaviors and, less often, child outcomes.10 The impacts on mothers include reduced smoking, increased employment, and improved child-rearing practices. The Obama home-visiting initiative, for which Congress has approved $1.5 billion over the 2010-2014 period, is awarding funds in a three-stage process. In the first stage, which has been completed, all states were eligible for a share of funding if they submitted proposals that met Administration requirements, primarily that they present a plan for conducting an assessment of the need for home visiting programs in their state. Forty-nine states, the District of Columbia, and five territories were awarded funds to enter the second stage. In the second stage, states are required to complete their needs assessment and submit the results. The specific requirements for the third-stage submission have not yet been published, but it seems possible that most or all states will receive some money and that the best applications will receive more money.

Teen pregnancy prevention. The teen pregnancy prevention initiative has proceeded in almost complete accord with the components of the Obama initiative outlined above. Teen pregnancy



Table 3: The Obama plan for expanding evidence-based programs

Teen Pregnancy Prevention
• Administering Agency: Health and Human Services
• Review of Literature: Completed by Mathematica
• Amount of Awards: $100 million awarded ($75 million to replicate existing programs, $25 million to test new strategies)
• Review Panel Selection: Panels included both expert peer reviewers and federal staff
• Selection of Proposals: 75 applicants awarded grants to replicate existing programs; 27 grantees awarded grants to test new strategies

i3 (Investing in Innovation Fund)
• Administering Agency: Education
• Review of Literature: Although there was no formal review of literature, the i3 evidence tiers were based on a process for reviewing evidence developed by IES over several years, primarily through its work on the What Works Clearinghouse, and strengthened
• Amount of Awards: Up to $650 million across the three types of grants (development, validation, scale-up)
• Review Panel Selection: 315 peer reviewers were selected from 1,400 experts in both subject matter and research/evaluation; reviewers assigned to panels of five people
• Selection of Proposals: 49 applications chosen as highest-rated; all secured matching funds (20 per cent of grant amount)

Home Visiting
• Administering Agency: Health and Human Services
• Review of Literature: Completed by Mathematica
• Amount of Awards: $88 million awarded in year one, $1.5 billion over five years
• Review Panel Selection: Applications were reviewed by grants management officials and program staff
• Selection of Proposals: 49 state governments, DC, and 5 territories applied and were awarded funding for planning; second stage of funding to follow

Social Innovation Fund
• Administering Agency: Corporation for National and Community Service (CNCS)
• Review of Literature: None
• Amount of Awards: $49.2 million awarded
• Review Panel Selection: A total of 60 experts were drawn from a pool of experts/professionals and the CNCS reviewer database of 2,300 people; reviewers assigned to panels of 2-4 people
• Selection of Proposals: 11 grantees chosen; must designate subgrantees within six months

TAA Community College and Career Training Program
• Administering Agency: Departments of Labor and Education
• Review of Literature: No
• Amount of Awards: $500 million a year for four years
• Review Panel Selection: Not public yet. Technical revie…