Page 1

Using Randomized Evaluations to Build the Evidence Base for Complex Care Programs

Please select your table for an upcoming group activity:

Orange tables: New to randomized evaluations
Blue tables: Some familiarity with randomized evaluations
Green tables: Currently brainstorming/designing a randomized evaluation

1

Page 2

Using Randomized Evaluations to Build the Evidence Base for Complex Care Programs

Wes Yin, UCLA Luskin School of Public Affairs, J-PAL affiliate

Anna Spier & Sophie Shank, J-PAL North America

Page 3

Objectives

• Share our enthusiasm for evaluation
  – Why evaluate? Why randomize?
  – What is evaluation?
  – How do we measure impact?

• Develop intuition for designing impact evaluations and provide opportunities to practice randomized evaluation design

• Share experience designing and running randomized evaluations and answer questions

3

Page 4

J-PAL’s mission is to reduce poverty by ensuring that policy is informed by scientific evidence

4

Page 5

I. Why Evaluate?

II. What is Evaluation?

III. Measuring Impact

IV. Recent Experiences with RCTs

V. Group Work

Page 6

Oregon Health Insurance Experiment

In 2008, the state of Oregon expanded its Medicaid program. Slots were allocated using a random lottery.

6

Page 7

Clear, credible results
Medicaid expansion in Oregon…

Increased the use of health-care services

[Chart: Visits to the doctor. Control: 5.5; Control + Medicaid effect: 8.2]

7

Page 8

Clear, credible results
Medicaid expansion in Oregon…

Diminished financial hardship

[Chart: Catastrophic medical expenditure (percent). Control: 5.5%; Control + Medicaid effect: 1.0%]

8

Page 9

Clear, credible results
Medicaid expansion in Oregon…

Reduced rates of depression and improved self-reported health, but did not have a statistically significant effect on physical health measures

9

[Chart: Depression rates (percent). Control: 30.0%; Control + Medicaid effect: 20.9%]

Page 10

10

Media response

Page 11

I. Why Evaluate?

II. What is Evaluation?

III. Measuring Impact

IV. Recent Experiences with RCTs

V. Group Work

Page 12

What is evaluation?

12

Evaluation

Program Evaluation

Impact Evaluation

Page 13

Program evaluation

13

Evaluation

Program Evaluation

Impact Evaluation

Page 14

14

Components of program evaluation

Needs Assessment – What is the problem?
Theory of Change – How, in theory, does the program fix the problem?
Process Evaluation – Is the program being implemented correctly?
Impact Evaluation – What is the causal effect of the program on outcomes?
Cost-Effectiveness Analysis – Given the cost, how does it compare to alternatives?

Page 15

I. Why Evaluate?

II. What is Evaluation?

III. Measuring Impact

IV. Recent Experiences with RCTs

V. Group Work

Page 16

Measuring impact

16

Impact is defined as a comparison between:

1. What actually happened, and
2. What would have happened had the program not been introduced (i.e., the “counterfactual”)
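In standard potential-outcomes notation (added here for clarity; this notation does not appear on the slide), the definition reads:

impact_i = Y_i(program) − Y_i(no program)

Only the first term is ever observed for a participant; the second is the counterfactual that the comparison group is meant to approximate.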

Page 17

What is the impact of this program?

[Chart: Positive drug tests over time; the program start is marked, and the impact is measured relative to the counterfactual trend]

17

Page 18

What is the impact of this program?

[Chart: Positive drug tests over time, with the program start marked and the impact labeled]

18

Page 19

19

The counterfactual

The counterfactual represents what would have happened to program participants in the absence of the program

Problem: Counterfactual cannot be observed

Solution: We need to “mimic” or construct the counterfactual—the comparison or control group

Page 20

20

Selecting the comparison group

Idea: Select a group that is exactly like the group of participants in all ways except one—their exposure to the program being evaluated

Goal: To be able to attribute differences in outcomes to the program (and not to other factors)

Page 21

21

Why randomize?

• Mathematically, it can be shown that random assignment makes it very likely that we are making an apples-to-apples comparison
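A minimal simulation of this point (illustrative only; none of the numbers come from the slides): with a reasonably large sample, randomly formed groups end up nearly identical on characteristics measured before the program.

import numpy as np

rng = np.random.default_rng(0)

# A pre-existing characteristic, e.g., a baseline health score, for 10,000 people
baseline_health = rng.normal(loc=50, scale=10, size=10_000)

# Randomly assign half to treatment and half to control
treated = rng.permutation(np.repeat([True, False], 5_000))

# The two groups look alike before the program, so later differences
# in outcomes can be attributed to the program itself
print(baseline_health[treated].mean(), baseline_health[~treated].mean())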

Page 22

22

Methods as tools

Pre-post · Simple difference · Difference-in-differences · Regressions · Randomized evaluation

(Icon credits: Wrench by Daniel Garrett Hickey and Screwdriver by Chameleon Design, from the Noun Project)

Since randomization requires fewer assumptions, we can be more confident that the impact we estimate reflects the actual impact of the program.

Page 23

Overview of RCT design

Needs Assessment → Theory of Change → Research Question → Methodology/Study Design → Assessing Feasibility → Refining Design and Piloting

23

Page 24

Overview of RCT design

Needs Assessment → Theory of Change → Research Question → Methodology/Study Design → Assessing Feasibility → Refining Design and Piloting

24

Page 25

Problem

25

Uninsured populations face worse health outcomes, higher healthcare costs, and significant financial strain.

Page 26

Overview of RCT design

Needs Assessment → Theory of Change → Research Question → Methodology/Study Design → Assessing Feasibility → Refining Design and Piloting

26

Page 27

Theory of Change

Needs Assessment: Poor health outcomes, higher healthcare costs, significant financial strain
Intervention: Medicaid coverage
Assumptions: Lottery winners enroll in and receive Medicaid
Outcomes: Health care utilization; financial strain; depression and self-reported health; a few measures of physical health

Page 28

Overview of RCT design

Needs Assessment → Theory of Change → Research Question → Methodology/Study Design → Assessing Feasibility → Refining Design and Piloting

28

Page 29

Research question

29

• What is the intervention?

• What are the primary outcomes?

• Who is your study population?

What is the effect of expanding Medicaid on health, health care costs, and financial strain among low-income adults in Oregon?

Page 30

Overview of RCT design

Needs Assessment → Theory of Change → Research Question → Methodology/Study Design → Assessing Feasibility → Refining Design and Piloting

30

Page 31

Medicaid randomization

31

1. Identify eligible participants through lottery sign-up (75,000)
2. Hold random lottery: Provide Medicaid (30,000) vs. No Medicaid (45,000)
3. Measure outcomes
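A minimal sketch of how such a lottery could be drawn in code; the ID list, seed, and labels are illustrative, and the 30,000/45,000 split simply mirrors the slide.

import random

def run_lottery(signup_ids, n_winners, seed=2008):
    # Shuffle a copy of the sign-up list with a fixed seed so the draw is reproducible
    rng = random.Random(seed)
    ids = list(signup_ids)
    rng.shuffle(ids)
    winners = set(ids[:n_winners])
    # Everyone not drawn forms the comparison group (no Medicaid offer)
    return {pid: ("medicaid_offer" if pid in winners else "no_offer") for pid in ids}

# Illustrative use: 75,000 sign-ups, 30,000 offers drawn at random
assignments = run_lottery(range(75_000), n_winners=30_000)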

Page 32

Overview of RCT design

Needs Assessment → Theory of Change → Research Question → Methodology/Study Design → Assessing Feasibility → Refining Design and Piloting

32

Page 33

33

Assessing feasibility

• Data sources for primary outcomes?

• Process metrics?

• Sample size?

• How many people in treatment and control groups? (see the sample-size sketch after this list)

• Whose buy-in do you need? Who are the stakeholders?
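A rough sample-size sketch for the bullet above, using the standard two-arm formula for comparing means; the effect size, standard deviation, and target power are placeholders, not values from any study here.

from scipy.stats import norm

def n_per_arm(delta, sigma, alpha=0.05, power=0.80):
    # Standard approximation for a two-arm comparison of means:
    # n per arm = 2 * (z_{1-alpha/2} + z_{power})^2 * (sigma / delta)^2
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    return 2 * (z_alpha + z_beta) ** 2 * (sigma / delta) ** 2

# Placeholder numbers: detect a 3-unit change in an outcome with a standard deviation of 10
print(round(n_per_arm(delta=3, sigma=10)))  # about 174 people per arm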

Page 34

Overview of RCT design

Needs Assessment → Theory of Change → Research Question → Methodology/Study Design → Assessing Feasibility → Refining Design and Piloting

34

Page 35

Recent Experiences with RCTs

Wes Yin, UCLA and J-PAL

Page 36

Overview

• Keys to developing collaborations

• Examples of recent experiences
  – Successful RCT collaboration with Covered California
  – …and a cautionary tale

• Ongoing RCT opportunities in health care

36

Page 37

Policy/Science
• Compelling policy question
• Tests based in theory
• Publishable

Improves Operations
• Study addresses identifiable and important issue
• Results are actionable

Feasibility
• Capacity
• Logistics and research integrity
• Optics of study and potential results

37

Page 38

Transparency and buy-in matter

• Buy-in from key stakeholders
  – An “enthusiastic CEO” is insufficient
  – All relevant key leadership needs to see the utility
  – Need sufficient time/staffing capacity
  – Legal/optics
  – Risk: failure to implement on time; failure to adhere to protocol or RCT design; even project abandonment

• Researchers need to be sensitive to operations and institutional goals
  – Not just for design, but for results and dissemination
  – Data security
  – Risk: design doesn’t permit learning; failure to translate results into actions to improve operations; surprises in how the study and results are framed/disseminated

38

Page 39

Clarity on RCT nuts and bolts

• Understanding the purpose of, and commitment to, the RCT design
• Mapping research design, data, tests, and findings to action
  – And how different findings will inform policy and/or science
• Sufficient sample size
• Conduct a pilot
  – Helps determine the sample size needed to measure an impact
  – For logistics and capacity

39

Page 40

Recent Experiences: Covered California

Covered California
– 1.4 million covered in 2017
– 12 carriers, dominated by the big four (Anthem, Blue Shield, Kaiser, Health Net)
– Generally stable premium growth and robust competition, with spotty areas of problems with exits/premium hikes

Active purchaser
– Negotiates premiums and entry
– Standardizes plans (with the exception of networks)
– Leans on carriers to maintain network adequacy

40

Page 41

Challenge facing Covered California

Take-up in Covered CA
– ~40% of the subsidy-eligible don’t enroll
– Evidence of even lower enrollment among Latinos and African-Americans

Plan choice
– Some CSR-eligible consumers choose “dominated” gold and platinum plans over enhanced silver

Questions:
– What strategies can help consumers take up plans?
– What strategies can help consumers choose plans better?

41

Page 42

Two studies on consumer information

Renewal study: target current enrollees
– Different letter interventions provide different information about plan options
– One made salient the number of physicians and nearest large hospitals for each plan
– Search costs, especially on network attributes

Take-up study: target uninsured
– Different letter interventions provide different information about deadlines, subsidies, penalties, plan options

42

Page 43

Renewal study: a cautionary tale

• Week before launch, the study was pulled
• Carrier sensitivities over the accuracy of their reported network data
• Root-cause analysis
  – Tight timelines meant Plan Management buy-in was not solid
  – Study coincided with negotiations with carriers on an unrelated matter
  – Lack of sensitivity meant no “escape valves” were built into the design (e.g., dropping only one RCT arm)

43

Page 44

Take-up study: Success

Sample: ~100,000 households
– People who initiated an eligibility determination but did not enroll

5 equal-sized treatment arms
– Control: no treatment
– Basic Reminder Letter: “value of insurance,” deadlines, Covered California website/telephone
– Subsidy + Penalty (SP): basic letter plus personalized subsidy/penalty estimates
– Price Compare: SP plus a comparison of bronze/silver plans (net price)
– Price/Star Compare: SP plus a comparison on both net price and quality rating

Stratified by income bracket, race, and language (see the assignment sketch below)
Letters in Spanish and English

44
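A minimal sketch of stratified assignment into five equal arms, assuming a hypothetical household table with income-bracket and language columns; the column names, arm labels, and data are illustrative, not the study’s actual variables.

import numpy as np
import pandas as pd

ARMS = ["control", "basic_reminder", "subsidy_penalty", "price_compare", "price_star_compare"]

def assign_arms(df, strata_cols, seed=0):
    rng = np.random.default_rng(seed)
    out = df.copy()
    out["arm"] = None
    # Within each stratum, shuffle households and deal them out evenly across the arms
    for _, group in out.groupby(strata_cols):
        shuffled = rng.permutation(group.index.to_numpy())
        for i, row_label in enumerate(shuffled):
            out.loc[row_label, "arm"] = ARMS[i % len(ARMS)]
    return out

# Illustrative data: 20 households, two income brackets, two languages
households = pd.DataFrame({
    "hh_id": range(20),
    "income_bracket": ["under_200_fpl"] * 10 + ["over_200_fpl"] * 10,
    "language": ["en", "es"] * 10,
})
assigned = assign_arms(households, ["income_bracket", "language"])
print(assigned["arm"].value_counts())  # four households per arm in this toy example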

Page 45

Uniform Front

45

Page 46

Basic Reminder Letter (Arm 2)

46

Page 47

Subsidy/Penalty (SP) (Arm 3)

47

Page 48

Plan Price Compare (Arm 4)

• Plans shown
  – For CSR-eligible consumers, show only silver plans
  – For all others, show bronze and silver plans
• Report net-of-subsidy premium

48

Page 49

Price/Star Compare (Arm 5)

• Plans shown
  – For CSR-eligible consumers, show only silver plans
  – For all others, show bronze and silver plans

• Report net premium and star rating

49

Page 50

50

Page 51

51

Page 52

Risk Factor Among the Enrolled: Full Sample

[Figure: Kernel density of average risk factor among the enrolled, comparing the control group with the letter intervention group]

52

Page 53

Risk Factor Among the Enrolled: 133 < FPL < 180

[Figure: Kernel density of average risk factor among the enrolled, comparing the control group with the letter intervention group]

53

Page 54

Key takeaways

• Target both general and individualized letters
  – Reminders are important for everyone
  – Spanish language important
  – Information about subsidies is beneficial for those at the lowest incomes, but not at higher incomes, where expectations may be higher

• Sustainable
  – Administration fees more than pay for the total cost of the letter intervention
  – The marginal person induced to enroll tends to be healthier, improving risk pools—a further benefit of marketing or outreach to improve uptake

54

Page 55

Challenges in health care

• Consumer choice
  – Choice increasingly devolved to the patient
• Spending risk
  – With higher cost sharing, increased inequality, and rising health care costs, patients face greater spending and health risk, and medical debt
• Provider consolidation means better bargaining, not better care
  – Scope to improve care coordination
  – Point-of-service care vs. population management
  – Inputs into care go beyond medical services

55

Page 56

RCT opportunities in health care

• Consumer empowerment
  – Plan/physician choice, health behaviors, and care seeking in response to information, defaults/nudges, and financial and non-financial incentives
  – Debt management
• Provider behaviors
  – Prescribing behaviors, protocol adherence, and referrals in response to information, defaults/nudges, and financial (e.g., P4P) and non-financial incentives
• System-wide reforms
  – Payment reform and risk sharing (e.g., ACOs)
  – Team-based care and care in non-traditional settings (e.g., telemedicine, in-community food and medical support, transportation services)
  – Hotspotting: intensive case management for HNHC patients (e.g., Camden & J-PAL)
  – Integrating social services (criminal justice, permanent housing, behavioral health) with medical care

56

Page 58

Group Work

Needs Assessment → Theory of Change → Research Question → Methodology/Study Design → Assessing Feasibility → Refining Design and Piloting

58

Page 59

Group Work

Orange tables: New to randomized evaluations
Blue tables: Some familiarity with randomized evaluations
Green tables: Currently brainstorming/designing a randomized evaluation

59

Page 60

Appendix

Page 61

Different Methodologies Can Give Different Estimates of Impact:

Example of Case Management Program Following Heart Failure

Page 62

62

1. Pre-Post

Average readmissions (per 100 patients) before the program began

26.4

Average readmissions (per 100 patients) after the program ended

19.3

Impact Estimate: -7.1

Page 63

63

2. Simple Difference

Average readmissions (per 100 patients) for people in the program

19.3

Average readmissions (per 100 patients) for people who are not in the program

29.5

Impact Estimate: -10.2

Page 64

3. Randomized Evaluation

64

Average readmissions (per 100 patients) for people randomly assigned to the program

25.0

Average readmissions (per 100 patients) for people not randomly assigned to the program

28.2

Impact Estimate: -3.2

Page 65

65

Summary

Method: Impact Estimate
(1) Pre-Post: -7.1*
(2) Simple Difference: -10.2*
(3) Randomized Evaluation: -3.2*

* Statistically significant at the 5% level
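The arithmetic behind these three estimates can be reproduced directly from the readmission rates on the preceding appendix slides; a small sketch (variable names are mine, the numbers are the slides’):

# Readmissions per 100 patients, taken from the preceding appendix slides
before_program = 26.4        # pre-period average
after_program = 19.3         # post-period average among program participants
non_participants = 29.5      # people not in the program
randomized_treatment = 25.0  # randomly assigned to the program
randomized_control = 28.2    # randomly assigned to the control group

print(after_program - before_program)             # pre-post: about -7.1
print(after_program - non_participants)           # simple difference: about -10.2
print(randomized_treatment - randomized_control)  # randomized evaluation: about -3.2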

Page 66

66

Why randomize?

• Mathematically, it can be shown that random assignment makes it very likely that we are making an apples-to-apples comparison

Page 67

When to consider randomization

• When your program is…
  – Over-subscribed
  – Being rolled out (“Smart-piloting”)
  – Expanding (e.g., moving into a new location or service area)
  – Adding a new component

67

Page 68

68

When is a randomized evaluation the right method?

Consider the…

Existing evidence

Program’s capacity

Size of the target population

Maturity of the program

Existence of reliable data

Child Question by Christopher Smith from the Noun Project

Page 69

When to consider randomization

69

New program · New service · New people · New location
Oversubscription · Undersubscription · Admissions or eligibility cutoff

Page 70

I. Phase-in design

II. Encouragement design

III. Randomization among the marginally ineligible

Page 71

Phase 0: No one treated yet; all control

71

Page 72

Phase 1: 1/4 treated, 3/4 control

72

Page 73

Phase 2: 1/2 treated, 1/2 control

73

Page 74

Phase 3: 3/4 treated, 1/4 control

74

Page 75

Phase 4: All treated, no control (experiment over)

75
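A minimal sketch of setting up the phase-in schedule illustrated on the preceding slides, assuming a hypothetical list of sites rolled out in four phases; sites in later phases serve as the comparison group until their phase arrives.

import numpy as np

def phase_in_schedule(site_ids, n_phases=4, seed=1):
    # Randomly order the sites, then split them into equal-sized phase cohorts
    rng = np.random.default_rng(seed)
    shuffled = rng.permutation(list(site_ids))
    cohorts = np.array_split(shuffled, n_phases)
    # In phase k, cohorts 1..k have been treated; the rest remain controls until their turn
    return {f"phase_{k + 1}": sorted(cohort.tolist()) for k, cohort in enumerate(cohorts)}

# Illustrative use: 20 sites rolled out in 4 phases of 5
print(phase_in_schedule(range(20)))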

Page 76

I. Phase-in design

II. Encouragement design

III. Randomization among the marginally ineligible

Page 77

77

Randomly assign an encouragement

Page 78

78

Compare entire treatment group to the entire control group to measure impact
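A minimal sketch of that comparison (often called the intent-to-treat estimate), assuming a hypothetical dataset with an encouragement indicator and an outcome; comparing the full randomized groups keeps the randomization intact even though not everyone who is encouraged actually takes up the program.

import pandas as pd

def intent_to_treat(df, assignment_col="encouraged", outcome_col="outcome"):
    # Compare mean outcomes across the *entire* encouraged and non-encouraged groups,
    # regardless of who actually took up the program
    means = df.groupby(assignment_col)[outcome_col].mean()
    return means[1] - means[0]

# Illustrative data: 1 = received the encouragement (e.g., an outreach letter)
data = pd.DataFrame({
    "encouraged": [1, 1, 1, 0, 0, 0],
    "outcome":    [2.0, 3.0, 4.0, 1.0, 2.0, 3.0],
})
print(intent_to_treat(data))  # 1.0 in this toy example

Dividing this intent-to-treat difference by the difference in actual take-up rates between the two groups gives the usual estimate of the effect on those induced to participate by the encouragement.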

Page 79

I. Phase-in design

II. Encouragement design

III. Randomization among the marginally ineligible

Page 80

80

Strict eligibility criteria

[Figure: Distribution of people along a bundle of eligibility criteria (more strict to less strict), with a cutoff separating the eligible from the ineligible]

Page 81

81

Relax eligibility criteria

[Figure: The same distribution, with the cutoff moved to a new, less strict position]

Page 82

82

Hold lottery among the marginally ineligible

[Figure: People between the old and new cutoffs form the study sample; those who remain eligible or remain ineligible are not in the study]
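A minimal sketch of this design in code, assuming a hypothetical eligibility score where lower values mean stricter criteria are met; the cutoffs, column names, and data are illustrative.

import numpy as np
import pandas as pd

def lottery_in_band(df, score_col, old_cutoff, new_cutoff, seed=7):
    # Below the old cutoff: still eligible (served as usual, not in the study).
    # Above the new cutoff: still ineligible (not in the study).
    # In between: the "marginally ineligible," who enter the lottery.
    rng = np.random.default_rng(seed)
    out = df.copy()
    in_band = (out[score_col] >= old_cutoff) & (out[score_col] < new_cutoff)
    out["status"] = np.where(out[score_col] < old_cutoff, "eligible (not in study)",
                     np.where(in_band, "study sample", "ineligible (not in study)"))
    # 50/50 random treatment assignment within the band
    out.loc[in_band, "treated"] = rng.integers(0, 2, size=int(in_band.sum())).astype(bool)
    return out

# Illustrative data: eligibility scores from 0 (most strict criteria met) upward
people = pd.DataFrame({"score": np.arange(0, 100, 5)})
print(lottery_in_band(people, "score", old_cutoff=40, new_cutoff=60))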

Page 83

83

Prior to the expansion

Page 84

84

Expansion of the eligibility criteria

Page 85

85

Lottery among the “newly” eligible