Thursday, December 25, 2008

Mass Media Science Reporting with reference to misleading statistics about statins

Making Sense of Science Reporting

By Deborah Howell
Sunday, December 7, 2008; B06

The job of science reporters is to take complicated subjects and translate them for readers who are not scientifically sophisticated. Critics say that the news media oversimplify and aren't skeptical enough of financing by special interests.

That led me to review papers that are to be published soon as part of a project sponsored by the American Academy of Arts and Sciences on how the media cover science and technology, and to interview a half-dozen experts, from scientists to teachers of science writing. Here's my take:

· Look for the evidence. News organizations should give weight to scientific evidence, whether it is about global warming or what the medical establishment says about Lyme disease.

Post science reporter David Brown, who is also a physician, talked about this in a recent speech at the University of Iowa. It will be published next year. "In science, there is a natural tension between evidence and opinion, and evidence always wins. What authority figures have to say about anything in science is ultimately irrelevant. Unfortunately, in a lot of science reporting, as in a lot of reporting in general, that isn't the case" (my italics).

Science reporters should give readers enough information to judge "the strength of a claim" and report "how the news fits into what's already known about the subject," Brown said. "It isn't always easy to boil down research findings to a few numbers that capture the essence" of a study. "Sometimes it can't be done or can't be done on deadline," he said. So follow-ups are important.

Brown recommends noticing how much space in an article is devoted to describing the evidence of the newsworthiness of the story and how much is devoted to someone telling you what to think about it. "If there isn't enough information to give you, the reader, a fighting chance to decide for yourself whether something is important, then somebody isn't doing his job, or hers."

· Look for context. Are the results preliminary? Does the research conflict with or confirm earlier work? Has it been published in a reputable science journal or been presented at a science meeting?

· Look beyond the lead paragraph and headline. Remember that antioxidants were touted to prevent all sorts of disease; research proved that not to be true. One recent Page 1 story, by veteran Post science reporter Rob Stein, attracted comment and criticism. Stein wrote that a study produced "powerful evidence" that a blood test designed to monitor inflammation could identify "seemingly healthy people who are at increased risk for a heart attack or stroke" and that a widely used statin drug offered "potent protection against the nation's leading killers." The story quoted the study's author and other prominent experts as calling the findings a "breakthrough," a "blockbuster" and "absolutely paradigm-shifting."

The Foundation for Integrative AIDS Research (FIAR) -- which has a stake in the issue because AIDS drugs can raise "bad" cholesterol levels -- said stories about the study reflected "shoddy boosterism for the pharmaceutical industry rather than a careful and balanced analysis."

FIAR Director George M. Carter's chief complaint was that stories emphasized a change in "relative risk" -- a 44 percent fall in the number of heart attacks, strokes and surgical procedures among people taking the statin, compared with those in the placebo group. He said the fact that everyone in the study had an extremely low "absolute risk" for heart problems should have been emphasized more. About 1.36 percent of people taking the placebo suffered a heart attack or stroke; that fell to 0.8 percent among those taking the statin. That means that nearly 97 percent of the people using the drug would not see any benefit, he said.
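The arithmetic behind Carter's complaint can be checked directly from the percentages quoted above. A minimal sketch (the variable names are mine, and the event rates are the approximate figures given in the story):

```python
# Event rates quoted in the story
placebo_risk = 0.0136   # ~1.36% of the placebo group had a heart attack or stroke
statin_risk = 0.008     # ~0.8% of the statin group did

# Relative risk reduction: the headline-friendly percentage
rrr = (placebo_risk - statin_risk) / placebo_risk

# Absolute risk reduction: the per-person drop in risk
arr = placebo_risk - statin_risk

# Number needed to treat: how many people must take the drug
# for one of them to avoid an event
nnt = 1 / arr

print(f"Relative risk reduction: {rrr:.0%}")   # ~41%
print(f"Absolute risk reduction: {arr:.2%}")   # ~0.56%
print(f"Number needed to treat:  {nnt:.0f}")   # ~179
```

The gap between the two numbers is the whole dispute: a roughly 41 percent relative reduction (close to the 44 percent figure reported for the broader composite endpoint) corresponds to an absolute reduction of about half a percentage point, meaning well over a hundred low-risk people would need to take the statin for one to benefit.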

Stein quoted a skeptic in the ninth paragraph and noted near the story's end that "the actual risk reduction for an individual would be very small, given the relatively low risk for most middle-age people, so that the benefits easily could be outweighed by the costs of thousands more people taking tests, drugs and being monitored by doctors."

Stein said, "While I would have liked to have explored many of the nuances of this study more fully, I feel confident we struck a responsible balance. I think it's crucial to provide readers with both the evidence supporting new claims and enough context and interpretation to help them gauge its significance." Independent experts, he said, concluded the study was "a very well done, very convincing piece of research."

One of the issues in science reporting is that most readers aren't schooled in statistics. Harold Varmus, former director of the National Institutes of Health, recommends looking more deeply into the numbers. "The percentages may be high, but what is the risk of an event in the first place? If the risk is low, there's a much smaller benefit." Varmus, a Nobel laureate, is chief executive of Memorial Sloan-Kettering Cancer Center.

Marcia Angell, a physician and former editor of the New England Journal of Medicine who is now a senior lecturer at Harvard Medical School, said journalists can write "overly dramatic" stories for "gullible" readers. "Everyone has an interest in hyping news of medical research -- the researcher, the institution, reporters. Readers should be very skeptical of new findings. Newspapers are in the business of telling you the news, which needs to be startling or counterintuitive or flies in the face of what we knew. By definition these stories are less likely to be accurate."

Don J. Melnick, professor of conservation biology at Columbia University, said that if a story "doesn't sound newsworthy or front page-worthy, it will be buried or not printed at all. That tends to promote people hyping the research. They have to convince their editors to put it in the paper."

Nils Bruzelius, The Post's science editor, said, "I thought the story and Page 1 play were justified because the potential impact was significant, even as I understand the criticisms. There's an inevitable tension between the desire of reporters and editors to get good play for their stories and the need to avoid hype or overstatement, and we feel this very acutely in dealing with scientific or medical stories, because the advances, even those that prove to be part of something very big, usually come in incremental steps. I've long believed that science and medical stories enter this competition at some disadvantage. I certainly don't have data on this but I suspect that most of the top editors who make the front-page decisions tend to be less drawn to these topics than the average reader because, with a few exceptions, they are a naturally self-selected group who got to where they are by dint of their interest and ability in covering such topics as politics, international relations, war and national security -- not science."

· Who sponsored the research and who makes money from its findings? Angell, a critic of drug companies' influence on medical research, said, "The caveats are at the end [of the story]. The pharmaceutical industry is spreading money everywhere and the researchers have their hands out."

That was true of the statin story. In the last six paragraphs, readers learned that the study was financed by AstraZeneca, which makes the statin Crestor, and that the study's author and his hospital will receive royalties on the blood test that was studied. Drugmakers fund many large medical studies. The story said that the company had no influence over the analysis.

Varmus said that without drug industry money, there is neither a funding mechanism nor the motivation to conduct clinical trials. "Obviously, companies have a vested interest in a good outcome and being truthful and getting answers that won't cause them grief later on," he said. Such trials also must follow Food and Drug Administration regulations.

"It's not new that the industry is the primary source of funding clinical research," Angell said. "What is new is the strings attached and the willingness of medical schools and faculty to accept these strings. They have influence over every detail of clinical trials."

Jonathan Weiner, who teaches science writing at the Columbia University Graduate School of Journalism, said, "It's a very messy, complicated problem. With government funding tight, many doctors rely on industry for funding. People in research medicine can't stay current without going to industry-funded conferences that have the quality of junkets." Weiner wrote "The Beak of the Finch," a book about evolutionary biology that won the Pulitzer Prize for general nonfiction in 1995.

For readers, Brown's best advice is this: "In the end, all that counts is evidence."

A longer version of this column appears online. Deborah Howell can be reached at 202-334-7582 or at ombudsman@washpost.com.


http://www.washingtonpost.com/wp-dyn/content/article/2008/12/05/AR2008120502959_pf.html
