It is so easy to accept the first ideas that come along, the first thoughts to which one is exposed. But so often those ideas are wrong. Time does slow down as speed increases; the earth is curved even if it appears flat; and quantum, relativity, evolutionary, and atomic theories are true even if they are counter-intuitive. A real seeker of truth adopts a scientific mindset, which is:
more than a way of thinking. It was a way of being—a weird way of being. You are supposed to have skepticism and imagination, but not too much. You are supposed to suspend judgment, yet exercise it. Ultimately, you hope to observe the world with an open mind, gathering facts and testing your predictions and expectations against them. Then you make up your mind and either affirm or reject the ideas at hand. But you also hope to accept that nothing is ever completely settled, that all knowledge is just probable knowledge. A contradictory piece of evidence can always emerge. Hubble said it best: “The scientist explains the world by successive approximations.”
As a philosopher I would call Hubble an evolutionary epistemologist. The idea is that science typically progresses, not through scientific revolutions as Thomas Kuhn thought, but through a gradual evolution. The successive approximations of science to the truth about the world can be compared to an idea in analytic geometry—an asymptote of a curve is a line such that the distance between the curve and the line approaches zero as they tend to infinity. Science too gets closer and closer to the truth while always remaining provisional, that is, open to future evidence. As Gawande notes:
The scientific orientation has proved immensely powerful. It has allowed us to nearly double our lifespan during the past century, to increase our global abundance, and to deepen our understanding of the nature of the universe. Yet scientific knowledge is not necessarily trusted. Partly, that’s because it is incomplete. But even where the knowledge provided by science is overwhelming, people often resist it—sometimes outright deny it. Many people continue to believe, for instance, despite massive evidence to the contrary, that childhood vaccines cause autism (they do not); that people are safer owning a gun (they are not); that genetically modified crops are harmful (on balance, they have been beneficial); that climate change is not happening (it is).
Nonetheless many people still fear vaccines “despite decades of research showing [such fears] to be unfounded … hundreds of studies have found no link, yet … fears persist. In response, vaccine rates have plunged, leading to outbreaks of measles and mumps that, last year, sickened tens of thousands of children across the U.S., Canada, and Europe, and resulted in deaths.” Part of the reason is that people “don’t see measles or mumps around anymore. [But] they do see children with autism. And they see a mom who says, ‘My child was perfectly fine until he got a vaccine and became autistic.’” How do we dislodge these false beliefs? It is hard.
Now, you can tell them that correlation is not causation. You can say that children get a vaccine every two to three months for the first couple of years of their lives, so the onset of any illness is bound to follow vaccination for many kids. You can say that the science shows no connection. But once an idea has become embedded and widespread, it is very difficult to dig it out of people’s brains—especially when they do not trust scientific authorities. And we are experiencing a significant decline in trust in scientific authorities.
Studies confirm an alarming decline of trust in science. Part of the reason is that many factions present themselves as quasi-scientific authorities. Religious groups challenge biological evolution, certain industries challenge climate science, and others reject the medical establishment altogether. “As varied as these groups are, they are all alike in one way. They all harbor sacred beliefs that they do not consider open to question.” To discriminate between science and pseudoscience, Gawande identifies five hallmarks of pseudoscientists.
They argue that the scientific consensus emerges from a conspiracy to suppress dissenting views. They produce fake experts, who have views contrary to established knowledge but do not actually have a credible scientific track record. They cherry-pick the data and papers that challenge the dominant view as a means of discrediting an entire field. They deploy false analogies and other logical fallacies. And they set impossible expectations of research: when scientists produce one level of certainty, the pseudoscientists insist they achieve another. [And] It’s not that some of these approaches never provide valid arguments. Sometimes an analogy is useful, or higher levels of certainty are required. But when you see several or all of these tactics deployed, you know that you’re not dealing with a scientific claim anymore. Pseudoscience is the form of science without the substance.
How then do we defend science as the best way to explain the world? The problem is that people aren’t swayed by reason and evidence, as science itself has discovered. (A fact I can attest to after 30 years of college teaching. I’ve found that, as the songwriter Paul Simon wrote, “A man hears what he wants to hear and disregards the rest.”)
In 2011, two Australian researchers compiled many of the findings in “The Debunking Handbook.” The results are sobering. The evidence is that rebutting bad science doesn’t work; in fact, it commonly backfires. Describing facts that contradict an unscientific belief actually spreads familiarity with the belief and strengthens the conviction of believers. That’s just the way the brain operates; misinformation sticks, in part because it gets incorporated into a person’s mental model of how the world works. Stripping out the misinformation therefore fails, because it threatens to leave a painful gap in that mental model—or no model at all.
What then do we do? Gawande notes that science itself provides a partial answer. It turns out that providing a narrative of scientific accomplishments is the best way to convince science deniers.
You don’t focus on what’s wrong with the vaccine myths, for instance. Instead, you point out: giving children vaccines has proved far safer than not. How do we know? Because of a massive body of evidence, including the fact that we’ve tried the alternate experiment before. Between 1989 and 1991, vaccination among poor urban children in the U.S. dropped. And the result was fifty-five thousand cases of measles and a hundred and twenty-three deaths.
Gawande also argues that we need “to expose the bad science tactics that are being used to mislead people. Bad science has a pattern, and helping people recognize the pattern arms them to come to more scientific beliefs themselves.” Thus we need to help people become better able to judge which information to trust. (For example, if you want to understand the truth about biological evolution, visit a site like this one from the biology department at UC-Berkeley, rather than the site of a religious group that has a vested interest in misleading you.)
Science is the best method of uncovering truth that we have discovered. It is an organized, systematic, collective, self-correcting project whose errors are slowly eliminated. Look in the cockpit of a jetliner and you see the product of more than a hundred years of science correcting itself—which is why the plane is amazingly safe. Of course science isn’t perfect.
Beautifully organized, however, it is not. Seen up close, the scientific community—with its muddled peer-review process, badly written journal articles, subtly contemptuous letters to the editor, overtly contemptuous subreddit threads, and pompous pronouncements of the academy—looks like a rickety vehicle for getting to truth. Yet the hive mind swarms ever forward. It now advances knowledge in almost every realm of existence—even the humanities, where neuroscience and computerization are shaping understanding of everything from free will to how art and literature have evolved over time.
Gawande also notes that scientific ignorance isn’t the exclusive purview of the uneducated. “The doubting is usually among my most, not least, educated patients. Education may expose people to science, but it has a countervailing effect as well, leading people to be more individualistic and ideological.” Education then doesn’t give anyone special authority on truth, but it does give us an idea of what real truth-seeking is like. “It is the effort not of a single person but of a group of people—the bigger the better—pursuing ideas with curiosity, inquisitiveness, openness, and discipline. As scientists, in other words.”
Gawande concludes by emphasizing the social implications of good thinking. “Even more than what you think, how you think matters. The stakes for understanding this could not be higher than they are today, because we are not just battling for what it means to be scientists. We are battling for what it means to be citizens.” (Consider that nearly half the American population in 2016 is prepared to vote for a Presidential candidate who is an egomaniacal, mentally unstable, proto-fascist, manifestly unfit and unqualified for political office. Yes, many of the candidate’s supporters are racists, bigots, misogynists, and xenophobes, but many simply don’t understand that those who hold political office need qualifications, just as their physicians, attorneys, accountants, dentists, nurses, and professors do. And they don’t understand the threat when unqualified people hold power.)
As we confront climate change, nuclear war, bacteria and viruses, and so many other existential threats, we will survive and flourish only if we become better critical thinkers. This can be accomplished partly by education, but in my view the answer must ultimately involve artificial intelligence and intelligence augmentation. We will not survive unless we direct our own evolution. In the meantime we can only hope that the uninformed and misinformed don’t gain too much political influence.