Association of Health Care Journalists workshop 2015, Santa Clara, Schwitzer & Oransky


Lessons from HealthNewsReview.org

Gary Schwitzer, Publisher, HealthNewsReview.org

Center for Media Communication & Health, University of Minnesota School of Public Health

@garyschwitzer / @HealthNewsRevu

gary@healthnewsreview.org

Association of Health Care Journalists, April 23, 2015

In health care news, we have our own version of the Rolling Stone rape story every day

• “The failure encompassed reporting, editing, editorial supervision and fact-checking.”

• “failing to state where important information had come from.”
• “There is a tension between crafting a readable story - a story that flows - and providing clear attribution of quotations and facts.”
• “The editors invested Rolling Stone’s reputation in a single source.”
• “The problem of confirmation bias - the tendency of people to be trapped by pre-existing assumptions and to select facts that support their own views while overlooking contradictory ones - is a well-established finding of social science.”

• Other critics: “The writer wanted it to be true, and the editor and fact-checker failed to push.” – “There’s no substitute for skepticism.”

• HealthNewsReview.org reviews news stories that include claims about interventions: treatments, tests, products, procedures

• We regularly check: Boston Globe, Los Angeles Times, New York Times, Philadelphia Inquirer, USA Today, Wall Street Journal, Washington Post, Associated Press, Bloomberg News, the websites of ABC, CBS, CNN, Fox, NBC, the NPR health & science page, HealthDay, Reuters Health, Vox.com, Slate.com, FiveThirtyEight.com, BuzzFeed.com, and the websites of TIME, Newsweek, and U.S. News & World Report

As of last week, we also systematically review health care news releases from many sources.

• 13 journalists
• 8 AHCJ members
• 13 MDs
• 7 PhDs
• 5 communication scholars and/or academic public information officers
• 4 public health grad students
• 3 women with breast cancer (one with a PhD, two with Master’s degrees)

Every story & release is analyzed by 3 reviewers, each applying the same 10 criteria

Our criteria: Does the story explain…

• What’s the total cost?

• How often do benefits occur?

• How often do harms occur?

• How strong is the evidence?

• Are there alternative choices?

• Is the condition exaggerated?

• Is this really a new approach?

• Is it available?

• Who’s promoting this?

• Do they have a financial conflict of interest?

[Chart: percent unsatisfactory after 1,980 story reviews, with values of 69%, 66%, 65%, 61%, and 57% for the criteria shown]
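The per-criterion percentages above come from tallying individual reviews against the 10 criteria. Below is a minimal sketch of how such a “percent unsatisfactory” figure could be computed, assuming a simple satisfactory / unsatisfactory / not-applicable rating per criterion; the rating scale, criterion names, and code are illustrative assumptions, not part of the presentation.

```python
# Illustrative sketch: the rating scale and tallying logic are assumptions,
# not taken from the presentation.
from collections import Counter

CRITERIA = [
    "cost", "benefits", "harms", "evidence", "alternatives",
    "disease-mongering", "novelty", "availability", "sources", "conflicts",
]

def percent_unsatisfactory(reviews: list[dict]) -> dict:
    """Share of applicable ratings judged unsatisfactory, per criterion."""
    result = {}
    for criterion in CRITERIA:
        counts = Counter(review.get(criterion, "n/a") for review in reviews)
        applicable = counts["satisfactory"] + counts["unsatisfactory"]
        result[criterion] = (
            round(100 * counts["unsatisfactory"] / applicable) if applicable else None
        )
    return result

# Two hypothetical story reviews (each maps criteria to a rating):
reviews = [
    {"cost": "unsatisfactory", "benefits": "unsatisfactory", "harms": "satisfactory"},
    {"cost": "unsatisfactory", "benefits": "satisfactory", "harms": "n/a"},
]
print(percent_unsatisfactory(reviews))
# {'cost': 100, 'benefits': 50, 'harms': 0, 'evidence': None, ...}
```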

Most common flaw: Conveying certainty that doesn’t exist

• Exaggerating effect size – relative, not absolute, data (a worked example follows this list)
• Using causal language to describe observational studies
• Idolatry of the surrogate: failing to explain limitations of surrogate markers/endpoints
• Tyranny of the anecdote: telling success stories but rarely profiling dropouts, the dissatisfied, or those who choose a conservative route or lifestyle change instead of treatment
• Single-source stories with no independent perspective
• Failing to independently analyze the quality of evidence
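The “relative, not absolute” problem in the first bullet is easiest to see with numbers. The short sketch below uses hypothetical trial figures, not data from the talk, to show how a 1-percentage-point absolute benefit can be headlined as a 50% relative risk reduction:

```python
# Hypothetical numbers chosen for illustration; not data from the presentation.

def risk_summary(control_risk: float, treated_risk: float) -> dict:
    """Express the same trial result in relative and absolute terms."""
    arr = control_risk - treated_risk            # absolute risk reduction
    rrr = arr / control_risk                     # relative risk reduction
    nnt = 1 / arr if arr > 0 else float("inf")   # number needed to treat
    return {
        "relative_risk_reduction": f"{rrr:.0%}",  # the figure headlines tend to quote
        "absolute_risk_reduction": f"{arr:.1%}",  # the figure readers actually need
        "number_needed_to_treat": round(nnt),
    }

# Risk falls from 2% in the control group to 1% in the treated group:
print(risk_summary(control_risk=0.02, treated_risk=0.01))
# {'relative_risk_reduction': '50%', 'absolute_risk_reduction': '1.0%',
#  'number_needed_to_treat': 100}
```

Reporting the absolute difference and the number needed to treat alongside the relative figure is one way to avoid conveying a certainty, or a benefit, that the data do not support.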

Exaggerate or Emphasize Benefits

Ignore or Minimize Potential Harms

This is avoidable ignorance

Dr. Richard Lehman has published reviews of journal articles on the BMJ website for 8 years

“I too was once a conclusion-of-the-abstract reader, and was quite smug that I had even got that far. It took me some years to become aware of perhaps the most important principle of critical reading: never believe the stated bottom line without confirming it from the data. And beware of the limitations of the data.”

Advice from a crusty but trusted senior colleague: “You have been doing yeoman work fighting multiple battles. Is it time to tackle the larger war? What constitutes evidence? A case report? A journal publication? A presentation?

The tenor of the times proclaims that we are practicing evidence-based medicine despite the fact that the level of evidence is scant. Many studies have high risk of bias & the strength of evidence is weak.

You are proposing some variant of educational immunization to help journalists resist the poor information they receive.

Another strategy is to address fundamental problems. Should not journal editors be urged to improve the review process to weed out the large amount of low-quality work? Should we do more to expose the low level of evidence?”

gary@healthnewsreview.org

@garyschwitzer / @healthnewsrevu

Thanks to the Laura and John Arnold Foundation for grant support

But It Was Peer-Reviewed – It Must Be Right!

AHCJ 2015, Santa Clara

Ivan Oransky
Vice President, Global Editorial Director, MedPage Today
Co-Founder, Retraction Watch

@ivanoransky

Most Of What You Read Is False

But It Was Peer-Reviewed!

“Fifty-three papers were deemed 'landmark' studies. It was acknowledged from the outset that some of the data might not hold up, because papers were deliberately selected that described something completely new, such as fresh approaches to targeting cancers or alternative clinical uses for existing therapeutics. Nevertheless, scientific findings were confirmed in only 6 (11%) cases. Even knowing the limitations of preclinical research, this was a shocking result.”

Faked Peer Reviews

Retractions on the Rise

http://pmretract.heroku.com/byyear

Which Journals Retract?

– Infection and Immunity, 2011

What Scientists Are Doing About It

http://blog.scienceexchange.com/

http://centerforopenscience.org/

What The Government Is Doing About It: The White House Office of Science and Technology Policy

Contact Info/Acknowledgements

ivan-oransky@erols.com

http://retractionwatch.com

@ivanoransky

Thanks:

The MacArthur Foundation

Nancy Lapid, Reuters Health