Someone recently asked me how I evaluate whether science- or health-related news stories are inaccurate, misleading, or otherwise full of nonsense.
I hadn’t really organized my thoughts on this before. But I had read some pretty good tips from other science writers – including one by Michelle Nijhuis at Slate and one by Emily Willingham at Forbes.com. And I’ve also been reading the news with a more critical eye recently, since I started reviewing health stories for Health News Review.
So, here are some things I think about when evaluating a story about research. Much of this I do unconsciously, I think. But if something sets off my Spidey Sense, I use this to guide my critical assessment process.
Who wrote it? “Staff reports,” etc., often mean it’s a verbatim rendition of a news release – which in many cases is going to put the rosiest possible spin on the relevant research (or, in some cases, misrepresent the work altogether). [Note: As a PIO, I strive to be honest and aboveboard in my news releases, but I know that many folks don’t. Of course, I would say that.]
Similarly, some news outlets run “sponsored content” (i.e., paid ads) or opinion pieces – and then make it difficult to distinguish articles by reporters from paid ads or biased op-eds. Be wary of both sponsored content and op-eds. After all, they are trying to sell you something or have an axe to grind.
Also, some reporters and bloggers have a clear ideological agenda, or have a track record as conspiracy theorists. For example, if someone has a history of arguing that AIDS was manufactured by an industry cabal, you don’t want to treat them as a reliable source for health news.
What’s the news outlet? Some are notoriously unreliable (e.g., Daily Mail). Others frequently have a particular political bias that might lead them to make dubious editorial decisions about science coverage. For example, a news outlet that has a track record of scoffing at climate change is probably not going to offer unbiased news reporting of emerging climate research. The same holds true for many other scientific subjects, from food safety to vaccines to evolution.
What’s the basis of the scientific claims being reported? Does the news story cite scientific research? And, if so, was that research published in a peer-reviewed journal? That’s no guarantee that the research is sound, but it’s better than nothing. Does the news story discuss the quality of the research – addressing its strengths and weaknesses? This is something that Health News Review stresses in its review criteria.
For example, if it’s a health story, does it tell readers whether the results it’s reporting stemmed from an observational study, as opposed to a clinical trial? If so, does it note that an observational study can only show correlation, not causation? How big was the sample size (the n)?
Does the story rely largely on testimonials? If a story talks about a scientific or medical “breakthrough,” and that claim seems to be based solely on enthusiastic quotes by individual researchers or patients, be skeptical. These are anecdotes, not rigorous data. Anecdotes can make for good quotes, but they should be there in addition to robust scientific evidence, not instead of it.
Are there independent sources? Does the story include input from relevant researchers who are not affiliated with the research being covered – and who don’t have a clear conflict of interest? If a story only quotes the researchers who did the work, the reporter didn’t talk to folks who would be more likely to point out the limitations of the work. Similarly, if the reporter talks to third-party experts, but those experts are in a position to profit from the work, readers need to take that into account – especially if the reporter doesn’t note the conflict of interest.
Does the story place the work in context? This is a biggie. Research is never done in a vacuum. The most recent study, in any field, builds on lots of work that has already been done. A good science story will tell you how the new findings build on and differ from previous work. It will also tell you what challenges lie ahead for the work, or what new questions the recent findings raised (new answers always lead to new questions).
If a story is discussing something people can use – a new technology, medical treatment, or other product (e.g., a new type of genetically modified food) – it should tell you how far that product is from the marketplace. For example, if it is a new medical treatment, how far along is it in the clinical trial process? Is it in use now, or is it years away from any potential widespread use? Similarly, if it’s a story about genetically modified organisms, are the findings based purely on lab work and predicted behavior outside the lab? Or are the organisms already being used commercially?
In addition, stories about research related to new products, services, or treatments should make clear how those offerings differ from existing ones. How are they (quantifiably) better? Do they carry different or greater risks? If the story doesn’t tell you, be skeptical.
Are any other news outlets covering the story? This isn’t necessarily a dealbreaker, since news outlets are constantly trying to be the first to report a story. However, if something seems too good to be true (or too bad to be true), check to see if anyone else is covering the story. If they aren’t, be suspicious. And if other outlets are covering it, check to see if the stories are saying the same thing. That’s no guarantee that they’re getting it right (I’ve seen multiple outlets cover the same story and all make the same mistakes), but it makes it more likely that they’re not wildly wrong.
Does the story make far-fetched claims? If a story makes outlandish claims, be very wary. For example, anything that says there is a “conspiracy,” or that “they don’t want you to know” something, usually means that the writers have no evidence for whatever they’re claiming.
Note: I wrote this post before re-reading the pieces by Nijhuis and Willingham. But I did revisit their work after writing this. There is a fair amount of overlap (which is not surprising) – but they each explore areas that I didn’t, and vice versa. So, if this is a subject you’re interested in, it’s still worth reading all three.