Evaluating medical evidence for journalists


Description: How to evaluate medical evidence for journalists, AHCJ 2012

Transcript of Evaluating medical evidence for journalists

Page 1: Evaluating medical evidence for journalists

Evaluating Medical Evidence for Journalists

Ivan Oransky, MD, Executive Editor, Reuters Health

Association of Health Care Journalists, Atlanta

April 19, 2012

Page 2: Evaluating medical evidence for journalists

Can You Trust Journal Studies?

• How good is peer review?
• Positive publication bias:
  "Publish a trial that will bring US$100,000 of profit or meet the end-of-year budget by firing an editor." -- Former BMJ editor Richard Smith
• Over-reliance on embargoed studies
• How often it turns out to be wrong

Page 3: Evaluating medical evidence for journalists

Embargoes and the Ingelfinger Rule

By the late 20th century, journals needed to compete not just with each other but with newspapers and other media…In 1969, the Journal articulated this relationship in its Ingelfinger Rule, a policy against publishing anything that had already appeared elsewhere. Other journals followed suit. This rule, combined with embargo policies, has led to a carefully choreographed production in which medical journals and the popular press work cooperatively and competitively to influence the news cycle.

-- NEJM, April 19, 2012

Page 4: Evaluating medical evidence for journalists

Even Without Embargoes, We’d Still Have Ingelfinger

Page 5: Evaluating medical evidence for journalists

How Often Are Studies Wrong?

Ioannidis JPA. PLoS Med 2005; 2(8): e124

Page 6: Evaluating medical evidence for journalists

How Often Are Studies Wrong?

Page 7: Evaluating medical evidence for journalists

Retractions on the Rise

-- The Wall Street Journal

Page 8: Evaluating medical evidence for journalists

Retractions on the Rise

-- Neil Saunders

Page 9: Evaluating medical evidence for journalists

The Unofficial Record Holder

Page 10: Evaluating medical evidence for journalists

This is Transparency?

Page 11: Evaluating medical evidence for journalists

Conference Pitfalls

Page 12: Evaluating medical evidence for journalists

Conference Pitfalls

• Conferences select presenters based on abstracts of fewer than 1,000 words
• Urologists at the University of Florida and Indiana University studied 126 randomized controlled trials presented in 2002-2003

Page 13: Evaluating medical evidence for journalists

Conference Pitfalls

• RCTs are the “gold standard” of medical evidence
• But the quality of that evidence wasn’t pretty
• No abstract said how trial subjects were randomly assigned to different treatments or placebos
• None told how the study ensured that neither the researchers nor their doctors knew which they got
• Only about a quarter said how long researchers followed the subjects in the trial

Page 14: Evaluating medical evidence for journalists

Just Say No

Sometimes, it’s better not to cover something. But if you must…

Page 15: Evaluating medical evidence for journalists

Always Read the Study

Writing about a study after reading just a press release or an abstract – without reading the entire paper – is journalistic malpractice.

Page 16: Evaluating medical evidence for journalists

How to Get Studies

• www.EurekAlert.org for embargoed material
• Association of Health Care Journalists membership includes access to the Cochrane Library, Health Affairs, JAMA, and many other journals: www.healthjournalism.org
• ScienceDirect (Elsevier) gives reporters free access to hundreds of journals: www.sciencedirect.com
• Open access journals (e.g., Public Library of Science, www.plos.org)
• Ask press officers, or the authors

Page 17: Evaluating medical evidence for journalists

How Good Was The Study?

• Was it:
  – Peer-reviewed?
  – Published? Where?
• Was it in humans?
  – It’s remarkable there are any mice left with cancer, depression, or restless leg syndrome
• Size matters
• Was it well-designed?

Page 18: Evaluating medical evidence for journalists

From Covering Medical Research, Schwitzer/AHCJ

Page 19: Evaluating medical evidence for journalists

What’s Your Angle?

• Are you trying to help readers, listeners, and viewers make better health care decisions?
• Covering a study because it has a good business angle, or it’s about a local project, is perfectly OK, but it doesn’t mean readers deserve less evidence and skepticism

Page 20: Evaluating medical evidence for journalists

Who Could Benefit?

• How many people have the disease?
• Keep potential disease-mongering in mind

Page 21: Evaluating medical evidence for journalists

How Effective Is the Treatment?

• Clinically significant endpoints, or surrogates – does this matter?
• Preventing complications? How many?
• Always remember to quantify results, not just “patients improved”

Page 22: Evaluating medical evidence for journalists

What Are The Side Effects?

• Every treatment has them
• Where to look:
  – Go beyond press releases and abstracts
  – Look at tables, charts, and results sections

Page 23: Evaluating medical evidence for journalists

Who Dropped Out?

• Why did they leave the trial?
• Intention-to-treat analysis
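
The intention-to-treat point is easiest to see with numbers. Below is a hypothetical back-of-the-envelope sketch in Python (the counts are invented for illustration, not from the presentation) showing how analyzing only the patients who finished a trial can flatter a treatment compared with an intention-to-treat analysis that counts everyone who was assigned.

```python
# Hypothetical numbers, purely for illustration (not from the talk).
# 100 patients are assigned to a new drug; 20 drop out (e.g., because of
# side effects) and 40 of the 80 who finish respond to treatment.
assigned = 100
completed = 80
responders = 40

# Per-protocol: only completers are analyzed, so dropouts vanish.
per_protocol_rate = responders / completed      # 0.50

# Intention-to-treat: everyone assigned is analyzed; here dropouts are
# (conservatively) counted as non-responders.
itt_rate = responders / assigned                # 0.40

print(f"Per-protocol response rate: {per_protocol_rate:.0%}")    # 50%
print(f"Intention-to-treat response rate: {itt_rate:.0%}")       # 40%
```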

Page 24: Evaluating medical evidence for journalists

How Much Does It Cost?

• If it’s ready to be the subject of a story, someone has projected the likely cost and market.
  – At least ask.

Page 25: Evaluating medical evidence for journalists

Who Has an Interest?

• Disclose conflicts
• PharmedOut.org
• Dollars For Docs series: http://projects.propublica.org/docdollars/

Page 26: Evaluating medical evidence for journalists

Are There Alternatives?

• Did the study compare the new treatment to existing alternatives, or to placebo?
• What are the advantages and disadvantages (and costs) of those existing alternatives?

Page 27: Evaluating medical evidence for journalists

Don’t Rely Only on Study Authors

• Find outside sources. Here’s how:

Page 28: Evaluating medical evidence for journalists

Use Anecdotes Carefully

• Is the story representative?
• Does the source of the story have any conflicts?

Page 29: Evaluating medical evidence for journalists

Watch Your Language

• Lifestyle/diet studies – are they randomized controlled trials, or just observational?
• If observational, make the language fit the evidence:
  – YES: “tied,” “linked”
  – NO: “reduces,” “causes”

Page 30: Evaluating medical evidence for journalists

Absolute vs. Relative Risk

• Consider the risk for blindness in a patient with diabetes over a five-year period
• The risk for blindness is 2 in 100 (2%) in people who get the conventional treatment and 1 in 100 (1%) with a new drug
• The absolute difference is derived by subtracting the lower risk from the higher risk: 2% - 1% = 1%

From Covering Medical Research, Schwitzer/AHCJ

Page 31: Evaluating medical evidence for journalists

Absolute vs. Relative Risk

• Expressed as an absolute difference, the new drug reduces the five-year risk for blindness by 1%.
• The relative difference is the ratio of the two risks.
• Given the data above, the relative difference is: 1% ÷ 2% = 50%
• Expressed as a relative difference, the new drug cuts the risk of blindness in half.

From Covering Medical Research, Schwitzer/AHCJ
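
For readers who want to check the arithmetic, here is a minimal sketch in Python (purely illustrative; the risk figures are the slides’ hypothetical example, not trial data) that reproduces both numbers:

```python
# The slides' hypothetical five-year risks of blindness.
conventional_risk = 0.02   # 2 in 100 with conventional treatment
new_drug_risk = 0.01       # 1 in 100 with the new drug

# Absolute difference: subtract the lower risk from the higher risk.
absolute_difference = conventional_risk - new_drug_risk     # 0.01, i.e., 1 percentage point

# Relative difference: the ratio of the two risks (as on the slide).
relative_difference = new_drug_risk / conventional_risk     # 0.5, i.e., the risk is cut in half

print(f"Absolute risk difference: {absolute_difference:.0%}")              # 1%
print(f"Relative risk (new vs. conventional): {relative_difference:.0%}")  # 50%
```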

Page 32: Evaluating medical evidence for journalists

Number Needed To Treat

• Same concept as number needed to screen
• Can be calculated from absolute risk:
  – 100 ÷ absolute risk difference (as a percentage)
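
Continuing the blindness example from the previous slides, a minimal sketch of that formula (the figures are the same hypothetical ones, not real data):

```python
# NNT from the absolute risk difference, using the hypothetical 1% figure above.
absolute_difference_pct = 1.0          # 2% - 1%, expressed as a percentage

nnt = 100 / absolute_difference_pct    # 100 patients treated to prevent one case of blindness

# Equivalent form with the difference as a proportion: 1 / 0.01 = 100.
print(f"Number needed to treat: {nnt:.0f}")
```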

Page 33: Evaluating medical evidence for journalists

A Dirty Little Secret

Keep a biostatistician in your back pocket

Photo by Peyri Herrera, on Flickr

Page 34: Evaluating medical evidence for journalists

Keep Yourself Honest

• Use HealthNewsReview.org

Page 35: Evaluating medical evidence for journalists

Acknowledgement/Contact

• Nancy Lapid, Reuters Health
• [email protected]
• Twitter: @ivanoransky