A remarkable feature of our current moment in history is an increasing distrust of authority, whether of the church, the government, or the world of science. It is easy to hypothesize reasons for this distrust, from news of malfeasance to the growth of conspiracy theories on the Internet; but distrust leaves us with very little basis for making public policy. Thus the new book Why Trust Science? by geologist and historian of science Naomi Oreskes is both timely and welcome. Our recommendations to improve the NGSS by focusing greater attention on the nature of science are well aligned with Oreskes’ findings.
Perhaps best known for co-authoring the scathing critique of climate denial Merchants of Doubt, Oreskes nevertheless takes her title question seriously. She begins with a historical overview of the philosophy of science. While this essay can be a heavy slog for the non-specialist, it is enlightening to read how thinkers of the past have wrestled with the question of where science’s special authority—and effectiveness—come from. Is it the elevated and disinterested nature of scientists themselves? Does it lie in an internally consistent and universal scientific method? Simple examination of history can demonstrate weaknesses in either formulation.
Partly by examining cases where science has gone right or wrong—the Limited Energy Theory, which held that higher education or a profession would harm a woman’s reproductive faculties; the eugenics movement; the theory of continental drift; resistance to the idea that birth control pills can cause depression; and arguments over the value of flossing our teeth—Oreskes arrives at her own list of five elements, which she calls “pillars,” that, when present, make scientific conclusions something we can rely on. The first is consensus: a fringe idea is less trustworthy than one that has been confirmed and widely endorsed by qualified scientists. The next two, method and evidence, line up with what we expect of science and its vetting. But Oreskes adds two more: diversity and values. A diversity of perspectives from qualified members of the scientific community, she suggests, can help prevent or correct the skewed thinking that has led to faulty and biased “science” in the past. Moreover, Oreskes argues that instead of aspiring to a lofty stance of having no values beyond the pursuit of truth, scientists should be up front about their values—for example, that we have a moral responsibility to leave a habitable earth to our descendants on the one hand, or that the free market admits of no compromise on the other.
The most entertaining part of the book lies in its case examples, listed above, which continue with an argument over the value of sunscreen. In each case, Oreskes shows how the mistakes that arose can be attributed to neglect of one of her five pillars. She then practices what she preaches by opening her argument to response and critique from five different scientists’ voices. These commentaries approach the problem of trust in science from viewpoints ranging from technology as popular evidence that science “works” to the “replication crisis,” which has led to the retraction of published papers and cast doubt on established ideas.
For now, let’s keep an eye on the reasons to trust science that Oreskes has offered. People should trust science when scientific experts on the matter in question, building on evidence and using accepted methods, reach consensus after broad discussion and debate among a diverse group of qualified critics. Conclusions emerging from such science are subject to change—in the same way that Einstein added to and improved Newtonian physics—but it is scientific consensus that provides a firm foundation for useful and effective advances in our understanding of the natural world.
Penny and Andy