I recently read Yuval Harari’s extraordinarily astute piece, Why Fiction Trumps Truth, in the New York Times. Harari is an Israeli historian, philosopher and author of Sapiens: A Brief History of Humankind and Homo Deus: A Brief History of Tomorrow. Here is his opening paragraph:
Many people believe that truth conveys power. If some leaders, religions or ideologies misrepresent reality, they will eventually lose to more clearsighted rivals. Hence sticking with the truth is the best strategy for gaining power. Unfortunately, this is just a comforting myth. In fact, truth and power have a far more complicated relationship, because in human society, power means two very different things.
As a professional philosopher dedicated to the search for truth, I found these words disquieting. The truth doesn’t win out? What exactly does Harari mean by this?
Harari begins by distinguishing between power “as the ability to manipulate objective realities: to hunt animals, to construct bridges, to cure diseases, to build atom bombs. This kind of power is closely tied to truth. If you believe a false physical theory, you won’t be able to build an atom bomb.”
However, there is another kind of power that
means having the ability to manipulate human beliefs, thereby getting lots of people to cooperate effectively. Building atom bombs requires not just a good understanding of physics, but also the coordinated labor of millions of humans. Planet Earth was conquered by Homo sapiens rather than by chimpanzees or elephants, because we are the only mammals that can cooperate in very large numbers. And large-scale cooperation depends on believing common stories. But these stories need not be true. You can unite millions of people by making them believe in completely fictional stories about God, about race or about economics.
[There is also a kind of power that has to do with manipulating human beings without any interest in getting them to cooperate. In other words, manipulating them simply to dominate, exploit, or enslave them. This may entail getting them to believe common stories about why they should be dominated, exploited, or enslaved, but it might not. You might simply overpower them.]
For Harari this “dual nature of power and truth results in the curious fact that we humans know many more truths than any other animal, but we also believe in much more nonsense.” This is a superb observation. As he puts it:
We are both the smartest and the most gullible inhabitants of planet Earth. Rabbits don’t know that E = mc², that the universe is about 13.8 billion years old and that DNA is made of cytosine, guanine, adenine and thymine. On the other hand, rabbits don’t believe in the mythological fantasies and ideological absurdities that have mesmerized countless humans for thousands of years. No rabbit would have been willing to crash an airplane into the World Trade Center in the hope of being rewarded with 72 virgin rabbits in the afterlife.
Now according to Harari, fiction has some significant advantages over truth in terms of uniting people. “First, whereas the truth is universal, fictions tend to be local.” Consequently, a story about, for example, how yeast causes bread to rise does little to distinguish our tribe from foreigners, since foreigners might have reached the same conclusion independently. But if you believe that little green gremlins cause bread to rise by their dancing, that is almost certainly an idea foreigners wouldn’t share. This false but unique idea then serves both to unite you with your clan and to identify its members.
The second advantage of fiction over truth has to do with the fact that believing outlandish stories is a reliable signal that one is a member of the group. For example, “If political loyalty is signaled by believing a true story, anyone can fake it. But believing ridiculous and outlandish stories exacts greater cost, and is therefore a better signal of loyalty.” Put differently, anyone can believe a leader who tells the truth but only true devotees will believe nonsensical things.
“Third, and most importantly, the truth is often painful and disturbing. Hence if you stick to unalloyed reality, few people will follow you.” Consider an American presidential candidate who tells the whole truth about America’s sordid history. This may be admirable, but it isn’t a viable election strategy.
Of course, if believing fictional stories becomes habitual, if zealots come to believe only nonsense, the strategy may be self-defeating. But Harari suggests that even fanatics “often compartmentalize their irrationality so that they believe nonsense in some fields while being eminently rational in others.” For example, the Nazis acted on a pseudoscientific racial theory in exterminating millions, but “when it came time to design gas chambers and prepare timetables for the Auschwitz trains, Nazi rationality emerged from its hiding place intact.”
Or consider how
the Scientific Revolution began in the most fanatical culture in the world. Europe in the days of Columbus, Copernicus and Newton had one of the highest concentrations of religious extremists in history, and the lowest level of tolerance … The luminaries of the Scientific Revolution lived in a society that expelled Jews and Muslims, burned heretics wholesale, saw a witch in every cat-loving elderly lady and started a new religious war every full moon.
Harari argues that this
ability to compartmentalize rationality probably has a lot to do with the structure of our brain. Different parts of the brain are responsible for different modes of thinking. Humans can subconsciously deactivate and reactivate those parts of the brain that are crucial for skeptical thinking. Thus Adolf Eichmann could have shut down his prefrontal cortex while listening to Hitler give an impassioned speech, and then reboot it while poring over the Auschwitz train schedule.
Consider scientists in their lab who abhor supernatural explanations but attend church on the weekends. (Although there are far fewer such people than we often imagine.)
Harari also notes that though “we need to pay some price for deactivating our rational faculties, the advantages of increased social cohesion are often so big that fictional stories routinely triumph over the truth in human history.” Thus a choice must often be made between truth and social harmony: should those in power unite people with some fiction, or tell the truth at the cost of societal unity? His conclusion? “Socrates chose the truth and was executed. The most powerful scholarly establishments in history — whether of Christian priests, Confucian mandarins or Communist ideologues — placed unity above truth. That’s why they were so powerful.”
Brief reflections – I’m not sure about this supposed connection between fiction and social cohesion. Science is an enterprise based on truth, yet it is also a cooperative venture, so I’m just not sure that fictional stories—about Adam and Eve, Jesus, Mohammed, alien abductions, faked moon landings, flat-earth theories, etc.—are necessarily better uniters than truthful ones. Still, fictional, irrational stories clearly can unite people, as any number of silly religious and political stories show.
I also think the purpose of the lies told by religious and political leaders is usually the more sinister one: power. Here I think Orwell said it best:
“Power is not a means; it is an end. One does not establish a dictatorship in order to safeguard a revolution; one makes the revolution in order to establish the dictatorship. The object of persecution is persecution. The object of torture is torture. The object of power is power.”