Many people are concerned that as the oceans absorb more carbon dioxide, the resulting ocean acidification will harm fish. But Alex Tabarrok reports in Marginal Revolution on a meta-analysis of 91 studies of such harm. The larger and more recent the studies, the smaller the effect of ocean acidification on fish they found; in the latest studies the effect is “negligible.”
The authors of the paper, published in PLOS Biology, write:
“We contend that ocean acidification has a negligible direct impact on fish behavior, and we advocate for improved approaches to minimize the potential for a decline effect in future avenues of research.”
The news is good for fish but not for peer review. The authors suggest that previous studies have been biased:
“Furthermore, the vast majority of studies with large effect sizes in this field tend to be characterized by low sample sizes, yet are published in high-impact journals and have a disproportionate influence on the field in terms of citations.”
And it’s not just in fish biology:
“Similar results have been reported in other areas of ecology and evolution, perhaps most notably in studies regarding terrestrial plant responses to high CO2.”
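To see why the combination of small samples and large reported effects is a warning sign, here is a minimal, purely illustrative simulation of my own (not anything from the paper). It assumes there is no true effect at all and keeps only the studies that happen to reach statistical significance; the smaller the sample, the larger the effect those “publishable” studies report.

```python
# Illustrative sketch (hypothetical numbers, not data from the meta-analysis):
# when only "significant" results get attention, small studies overstate effects.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
TRUE_EFFECT = 0.0      # assume no real effect at all
N_STUDIES = 5000       # number of simulated studies per sample size

def significant_effects(n_per_group):
    """Run two-group studies and keep only those with p < 0.05."""
    kept = []
    for _ in range(N_STUDIES):
        control = rng.normal(0.0, 1.0, n_per_group)
        treated = rng.normal(TRUE_EFFECT, 1.0, n_per_group)
        t_stat, p_value = stats.ttest_ind(treated, control)
        if p_value < 0.05:                       # the "publishable" studies
            kept.append(treated.mean() - control.mean())
    return kept

for n in (10, 50, 500):
    effects = significant_effects(n)
    print(f"n={n:4d}: {len(effects)} 'significant' studies, "
          f"mean |reported effect| = {np.mean(np.abs(effects)):.2f}")
```

With a true effect of zero, roughly five percent of studies come out “significant” at every sample size, but the effect sizes reported by those studies shrink sharply as the samples grow, which is the same pattern of big effects from small studies that the authors flag.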
Another example. As I’ve been working on the third edition of Invasive Plants, I encounter again and again the figure of $138 billion a year in damage from invasive species. That estimate comes from a single researcher, David Pimentel, and is more than 15 years old. It has been thoroughly challenged, yet his article has been cited more than 4,000 times. Some of those citations might be critical, but I have yet to see more than a few that challenge his exaggerated conclusions.
Moral of the story: when you see an alarming figure of any kind widely used to support one point of view, look for the source. It is often a single study, often with a small sample, that became popular because it confirms a particular bias.