Truth and Power? Commentary on “Why Fiction Trumps Truth,” by Yuval Noah Harari


I recently read Yuval Harari’s extraordinarily astute piece, “Why Fiction Trumps Truth,” in the New York Times. Harari is an Israeli historian, philosopher, and author of Sapiens: A Brief History of Humankind and Homo Deus: A Brief History of Tomorrow. Here is his opening paragraph:

Many people believe that truth conveys power. If some leaders, religions or ideologies misrepresent reality, they will eventually lose to more clearsighted rivals. Hence sticking with the truth is the best strategy for gaining power. Unfortunately, this is just a comforting myth. In fact, truth and power have a far more complicated relationship, because in human society, power means two very different things.

As a professional philosopher dedicated to the search for truth, I found these words disquieting. The truth doesn’t win out? What exactly does Harari mean by this?

Harari begins by distinguishing between power “as the ability to manipulate objective realities: to hunt animals, to construct bridges, to cure diseases, to build atom bombs. This kind of power is closely tied to truth. If you believe a false physical theory, you won’t be able to build an atom bomb.”

However, there is another kind of power that

means having the ability to manipulate human beliefs, thereby getting lots of people to cooperate effectively. Building atom bombs requires not just a good understanding of physics, but also the coordinated labor of millions of humans. Planet Earth was conquered by Homo sapiens rather than by chimpanzees or elephants, because we are the only mammals that can cooperate in very large numbers. And large-scale cooperation depends on believing common stories. But these stories need not be true. You can unite millions of people by making them believe in completely fictional stories about God, about race or about economics.

[There is also another kind of power that has to do with manipulating human beings without any interest in getting them to cooperate. In other words, manipulating them simply to dominate, exploit, or enslave them. This may entail getting them to believe common stories about why they should be dominated, exploited, or enslaved, but it might not. You might simply overpower them.]

For Harari this “dual nature of power and truth results in the curious fact that we humans know many more truths than any other animal, but we also believe in much more nonsense.” This is a superb observation. As he puts it:

We are both the smartest and the most gullible inhabitants of planet Earth. Rabbits don’t know that E = mc², that the universe is about 13.8 billion years old and that DNA is made of cytosine, guanine, adenine and thymine. On the other hand, rabbits don’t believe in the mythological fantasies and ideological absurdities that have mesmerized countless humans for thousands of years. No rabbit would have been willing to crash an airplane into the World Trade Center in the hope of being rewarded with 72 virgin rabbits in the afterlife.

Now according to Harari, fiction has some significant advantages over truth in terms of uniting people. “First, whereas the truth is universal, fictions tend to be local.” Consequently, we don’t distinguish our tribe from foreigners very well with a story about, say, how yeast causes bread to rise, since foreigners might have come to the same conclusion independently. But if you believe that little green gremlins cause bread to rise by their dancing, that is almost certainly an idea that foreigners wouldn’t share. This false but unique idea then serves to unite you with your clan and to identify its members.

The second advantage of fiction over truth has to do with the fact that believing outlandish stories is a reliable signal that one is a member of the group. For example, “If political loyalty is signaled by believing a true story, anyone can fake it. But believing ridiculous and outlandish stories exacts greater cost, and is therefore a better signal of loyalty.” Put differently, anyone can believe a leader who tells the truth but only true devotees will believe nonsensical things.

Third, and most importantly, “the truth is often painful and disturbing. Hence if you stick to unalloyed reality, few people will follow you.” Consider an American presidential candidate who tells the whole truth about America’s sordid history. This may be admirable, but it isn’t a viable election strategy.

Of course, if believing fictional stories becomes habitual, if zealots believe only nonsense, this may be self-defeating. But Harari suggests that even fanatics “often compartmentalize their irrationality so that they believe nonsense in some fields while being eminently rational in others.” For example, the Nazis relied on a pseudoscientific racial theory to justify exterminating millions, yet “when it came time to design gas chambers and prepare timetables for the Auschwitz trains, Nazi rationality emerged from its hiding place intact.”

Or consider how

the Scientific Revolution began in the most fanatical culture in the world. Europe in the days of Columbus, Copernicus and Newton had one of the highest concentrations of religious extremists in history, and the lowest level of tolerance … The luminaries of the Scientific Revolution lived in a society that expelled Jews and Muslims, burned heretics wholesale, saw a witch in every cat-loving elderly lady and started a new religious war every full moon.

Harari argues that this

ability to compartmentalize rationality probably has a lot to do with the structure of our brain. Different parts of the brain are responsible for different modes of thinking. Humans can subconsciously deactivate and reactivate those parts of the brain that are crucial for skeptical thinking. Thus Adolf Eichmann could have shut down his prefrontal cortex while listening to Hitler give an impassioned speech, and then reboot it while poring over the Auschwitz train schedule.

Consider scientists in their lab who abhor supernatural explanations but attend church on the weekends. (Although there are far fewer such people than we often imagine.)

Harari also notes that though “we need to pay some price for deactivating our rational faculties, the advantages of increased social cohesion are often so big that fictional stories routinely triumph over the truth in human history.” Thus a choice must often be made between truth and social harmony. Should those in power unite people with some fiction, or tell the truth at the cost of societal unity? His conclusion? “Socrates chose the truth and was executed. The most powerful scholarly establishments in history — whether of Christian priests, Confucian mandarins or Communist ideologues — placed unity above truth. That’s why they were so powerful.”

Brief reflections – I’m not sure about this supposed connection between fiction and social cohesion. Science, after all, is an enterprise based on truth, and it is a cooperative venture. So I’m just not sure that fictional stories—about Adam and Eve, Jesus, Mohammed, alien abductions, faked moon landings, flat-earth theories, etc.—are necessarily better at uniting people than truthful ones. Still, fictional, irrational stories clearly do unite people, as silly religious and political stories show.

I also think the purpose of the lies told by religious and political leaders is usually a more sinister one: power. Here I think Orwell said it best:

“Power is not a means; it is an end. One does not establish a dictatorship in order to safeguard a revolution; one makes the revolution in order to establish the dictatorship. The object of persecution is persecution. The object of torture is torture. The object of power is power.”


8 thoughts on “Truth and Power? Commentary on ‘Why Fiction Trumps Truth,’ by Yuval Noah Harari”

  1. Hi – in your Brief Reflection when you say that you’re not convinced of the “supposed connection between fiction and social cohesion” – it would be nice if it weren’t true, but the crux of the argument which shows that fiction IS better than truth is here: “anyone can believe a leader who tells the truth but only true devotees will believe nonsensical things.”

    Tribalism is about showing your colours (which may well have something to do with the feeling of safety that comes from knowing you are part of the group) – there is a sense of crossing over, of there being some risk or cost involved in showing loyalty, by speaking a lie – especially perhaps the first time you do it – that does not happen in the normal declarative truth narrative. When we lie, when we follow the leader and say climate change is a Chinese hoax, or French people eat frogs, we’re not expected to be graded according to the truth value of the statement, we’re waving our tribalist banner high for everyone to see. We’ve paid the price and from then on expect to enjoy the rewards of being part of the gang.

  2. “The second advantage of fiction over truth has to do with the fact that believing outlandish stories is a reliable signal that one is a member of the group. For example, ‘If political loyalty is signaled by believing a true story, anyone can fake it. But believing ridiculous and outlandish stories exacts greater cost, and is therefore a better signal of loyalty.’ Put differently, anyone can believe a leader who tells the truth but only true devotees will believe nonsensical things.”

    Plus: the bigger the lie, the more plausible it can be – its size makes it more readily visualized and comprehended by susceptible masses.

  3. “[…] in the big lie there is always a certain force of credibility; because the broad masses of a nation are always more easily corrupted in the deeper strata of their emotional nature than consciously or voluntarily; and thus in the primitive simplicity of their minds they more readily fall victims to the big lie than the small lie, since they themselves often tell small lies in little matters but would be ashamed to resort to large-scale falsehoods.
    It would never come into their heads to fabricate colossal untruths, and they would not believe that others could have the impudence to distort the truth so infamously. Even though the facts which prove this to be so may be brought clearly to their minds, they will still doubt and waver and will continue to think that there may be some other explanation. For the grossly impudent lie always leaves traces behind it, even after it has been nailed down, a fact which is known to all expert liars in this world and to all who conspire together in the art of lying.”

    You can easily figure out who wrote it.

  4. The way I see it (no new big ideas here):
    Look at our closest relatives — chimps and gorillas. The health of the individual is less important than the health of the clan. Evolution fixates on maximizing copies of genes, not maximizing individual agency. In the past few million years, a branch of the great apes has stumbled upon the efficiency of the altruistic clan, supreme collector of resources and protector of territory.
    The “god-shaped hole in our hearts” is the pro-social salve that our neurology evolved to convince us there is a “correct” way to live, and that the universe will look down approvingly on us for staying true to that path. Without it, we are prone to anti-social behaviors based on self-interest and existential despair, endangering the cohesion of the clan.
    But our expanding rational minds — initially a useful additional tool improving the effectiveness of the clan — have begun to drop the scales from our eyes. We are in a race with knowledge (now augmented with improving AI): in search of a rational argument for altruism while our greater understanding is revealing a cold universe indifferent to our actions.
    Sociopaths (and near-future AI?) are an antisocial subset of humanity that has weaponized the pro-social tendencies of most others. They manipulate society’s desire to believe in good lies to achieve selfish ends that cannot be disputed rationally.
    Given our predilection for categorizing “us” vs “them”, the threat of a planet-killing asteroid might be necessary to reset our priorities.

  5. “Given our predilection for categorizing ‘us’ vs ‘them’, the threat of a planet-killing asteroid might be necessary to reset our priorities.”

    It just might take an asteroid. Or a big war in the Mideast – or both.
