“We like fun, poppy science that we can share like gossip, and TV news producers know it.” —John Oliver on Last Week Tonight
Last month, several million cellphone users got a fright when the media reported that radiation from their phones can, indeed, cause cancer, according to a new scientific study.
“Study finds link between cell phones and cancer.” “Study finds cell phone frequency linked to cancer.” “Cell Phone-Cancer Link Seen in Rat Study.”
Maybe you saw those headlines with their implied exclamation points. (The third one might have given you pause.) Or perhaps you saw a less arresting headline, like the one on the site Ars Technica: “Study that found cell phones cause cancer in rats is riddled with red flags”; or the one on Vox.com that nailed the media rather than the scientists: “Seriously, stop with the irresponsible reporting on cellphones and cancer.”
How to Spot an Unreliable Study
Vox took a close look at the study, with help from a professor schooled in research at the Indiana University School of Medicine. Here’s what they found — many of these caveats apply to other studies that are widely trumpeted in the media:
- The research was conducted on rats. Rats are not humans and results don’t always translate.
- This was just one study, and findings from single studies don’t always hold up when replicated.
- The study has not yet been peer reviewed. Once it is, there may be edits.
- Some other studies that have looked at a cancer-cellphone link have found none.
- An important and unexplained finding of the study failed to make it into most of the press reports: Among the group of rats that was exposed to cellphone-like radiation, only the male rats, not the females, showed a higher incidence of brain and heart tumors.
- Another important finding was also left out or given short shrift: The rats that were exposed actually lived longer than the unexposed rats. This could suggest that if the unexposed rats had lived as long as the exposed rats, they, too, would have developed tumors.
- The incidence of brain cancer in the exposed group was within normal range, even if it was somewhat higher than in the unexposed group.
- The likelihood of false positives was high.
- The amount of radiation that the rats were exposed to far exceeds the amount most people get from talking on their phones.
So, the next time you hear about a scary new study (or one that gives you permission to drink lots of champagne so you can ward off dementia), ask yourself: People or rodents — and how many? Peer reviewed? Replicated? Any weird surprises? Any good reason the findings are more reliable than previous, contrary ones? And really, how big a difference was there between the outcomes of the two groups in the study?
Why Researchers Bend Results
The problem, though, runs deeper than the media’s laziness in reporting on scientific studies. For many reasons, the studies themselves may be deeply flawed. As John Oliver pointed out recently on Last Week Tonight, funding and tenure pressures can lead researchers to design studies for publication that will yield striking results; structured differently, the same studies might produce nothing special. In other instances, university marketing departments write press releases about studies conducted at their institutions that amp up the hype factor. Media outlets might read the press release but not the study itself.
We’ll let Oliver fill you in on these and other ways that scientific studies you hear about may fall short. (If you’re not a Last Week Tonight fan, bear with the scatology — it’s worth it.)