News media are often criticized for exaggerating science stories and deliberately sensationalizing the news. However, researchers argue that sensationalism may begin at the source: the press departments of academic research centers.
The accusation comes from a study in Annals of Internal Medicine, in which researchers reviewed press releases from 20 medical centers. The centers' PR departments had issued an average of nearly one release each week.
Among 200 randomly selected releases analyzed in detail, 87 (44%) promoted animal or laboratory research, and 64 of those (74%) explicitly claimed relevance to human health. Yet the paper later points out that "two-thirds of even highly cited animal studies fail to translate into successful human treatments."
Among 95 releases about primary human research, 22 (23%) omitted the study size and 32 (34%) failed to quantify results. Among all 113 releases about human research, few (17%) promoted randomized trials or meta-analyses; 44% reported on uncontrolled interventions, samples of fewer than 30 participants, studies with surrogate primary outcomes, or unpublished data. And 58% lacked relevant cautions that would have tempered the findings.
The researchers even chastised exaggerated quotes from study authors (although the paper didn't clarify whether the scientists actually made those statements or whether PR staff had somehow spun them). They concluded that academic press releases often promote research with uncertain relevance to human health without acknowledging important cautions or limitations.
Acknowledged. But mainstream journalists still shoulder some of the burden of knowing these caveats so they can unspin press releases and report medical news more accurately.
One solution the researchers offered was to issue fewer releases about preliminary research, especially unpublished presentations at scientific meetings, to reduce the chance that journalists and the public are misled. Unpublished presentations can change substantially or fail to hold up under subsequent research, and 40% of meeting abstracts, including 25% of abstracts that garner media attention, are never published as full reports.
The newspaper staff here at ACP can take advantage of some key resources when we choose what to cover. We have clinicians who help with the editing process. Physicians on staff have shared with us the same training they give medical students on how to interpret research studies and write about them. And we keep ACP resources on writing about and reporting medical statistics at our desks.
But probably our greatest resource has been our readers, who don't hesitate to contact us when they feel our coverage is askew. That's perhaps the main difference between writing for doctors and writing for the lay public: we have a check and balance in our audience, and the mainstream media doesn't.