“Nothing is so firmly believed as what we least know.” ~ Montaigne
It is hard for people to reason well because they are largely non-rational, emotional beings.1 Here are some psychological impediments to good reasoning. (Remember, my pay as a philosophy professor doesn’t increase when I teach cogent reasoning; there is more money to be made peddling nonsense. My goal is to educate.)
1. Loyalty, the herd instinct, and provincialism impede good reasoning
Loyalty – Our chances of surviving and flourishing increase if we are loyal to our in-group.
Herd Instinct – Keeps our beliefs and actions within boundaries set by the group. Wanting status, we pontificate about things we know nothing about, and we don’t tend to change our minds or admit mistakes. We’d rather keep being wrong than admit we were wrong.
Provincialism – We identify with the ideas and behaviors of the in-group. We see things from our in-group’s point of view and reject unpleasant truths about our groups.
2. Prejudice, stereotypes, scapegoats, and partisan mind-sets
Prejudice – Loyalty and provincialism lead to prejudice against out-groups, and to thinking in terms of …
Stereotypes – Unfavorable opinions about individuals or groups that aren’t justified by the evidence. Stereotyping promotes intolerance of out-groups and tolerance of in-group foibles.
Scapegoats – People or groups whom we blame for our problems—Jews, Latinos, atheists, and today, especially, immigrants. Faced with complex problems, people accept scapegoating and then feel no need to be self-reflective. All of this leads to a
Partisan Mind-Set – Perceiving evidence and judging arguments from our side only. Good thinkers have an open mind about truth; they know a counter-argument may be good.
3. Superstitious Beliefs – There are coincidences—bad things happen after mirrors break, and astrology columns are occasionally correct, just as a broken clock is right twice a day. But sensible beliefs are based on sufficient evidence, not on small or biased samples. Bad things happen on Friday the 13th, just as they do on any other day.
4. Wishful Thinking & Self-Deception – In general, we believe things we want to be true and deny things we find distasteful.
Wishful Thinking – Believing things we want to be true but which are almost certainly not.
Self-deception – Believing something that at a deeper level we know isn’t true. (This can have tragic consequences. For example, think about colonialism.)
5. Rationalization and Procrastination
Rationalization – A kind of self-deception in which we ignore or deny unpleasant evidence so that we feel justified in doing or believing what we want. This often leads to
Procrastination – Favoring immediate gratification over long-term goals. We are biased toward short-term gains over long-term ones.
6. Unconscious psychological strategies, or defense mechanisms
Suppression – We avoid thinking about stressful thoughts and thus avoid anxiety.
Denial – We deny our situation by reinterpreting it as less stressful than it is.
7. Benefits of Self-Deception, Wishful Thinking, & Denial
Self-deception reduces stress and anxiety. Rejecting doubt and being credulous may do the same. The placebo effect and fondly recalling a bad past are also examples of the value of self-deception. But remember that while self-deception motivates young people to fight wars (they think they’re invincible), old men use that same self-deception to send young people to their deaths.
8. Pseudoscience & the Paranormal
Scientists generally know what they’re talking about. That’s why cell phones, computers, cars, lights, airplanes, ships, GPS, antibiotics, vaccines, furnaces, air conditioners, TVs, dentistry, raincoats, and hiking boots work. It’s why newborns no longer die in the first year of life, as about half of them did for most of human history. That’s why you have clean water from your tap! Yet people still believe in …
Pseudoscience – About a third or more of Americans—in the 21st century!—believe in each of the following: ghosts, witches, angels, UFOs, alien abductions, astrology, a geocentric solar system, and more. Why are people so gullible?
One reason is that although real science produces results, it is esoteric and often tells us what we don’t want to know. It tells us the universe probably arose spontaneously out of nothing, that it is unimaginably large, that it will eventually die, that we are modified monkeys, and that we have cognitive biases against believing such things. But pseudoscience usually tells us positive things. The astrologer and fortune-teller predict good things and the medium tells us we can talk to our dead relatives.
Consider astrology. Why do people believe in it when it has been shown to be worthless? (If your destiny is determined by the star under which you are born, then all born under that star should have the same destiny. But they don’t.) Part of the reason people still accept it is that charlatans have found ways to make such nonsense seem plausible. They make horoscopes so vague that anyone can see themselves in the description.
Or consider paranormal phenomena like ESP in all its varieties. Summarizing over a century of research, the National Research Council has concluded that there is no scientific justification whatsoever for believing in such things—not a single shred of evidence. Why then do people persist in believing? Research has shown that personal experience plays a large role in such beliefs. For example, a gambler attributes a winning streak to some special power rather than to statistical fluctuation (or what most call luck).
Premonitions are similar. They are just a coincidence between thoughts and events. If you have enough thoughts, sometimes events will correspond to them. People remember when premonitions come true and forget when they don’t. (This is called confirmation bias.) While pseudoscience remembers successes and ignores failures, science puts its hypotheses to severe tests, with independent verification and repetition of observations and experiments. Real science doesn’t rely on coincidence or anecdote.
9. Lack of a Good Sense of Proportion
The basic reason we have so many cognitive bugs has to do with our evolutionary history. Behavior guided by the intellect is a late evolutionary development compared to responses motivated by desires and emotions, which arise from the more primitive parts of the brain. (So you may fear flying rather than driving even though the former is much safer. Or you may fear dying in a foreign terrorist attack even though you are far more likely to be killed by a fellow citizen.) Our primitive brains evolved in biological eras when almost nothing was known about cause and effect. Modern science, the only cognitive authority in the world today, is very recent.
1. This is a summary of Chapter 6 of Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life, a great college textbook from which I have taught many times.
6 thoughts on “Psychological Impediments to Good Thinking”
I’m inclined to think that most everything illogical about “normal” human behavior was vital to our clan cohesion (and therefore prosperity) during hundreds of thousands of years in hunter-gatherer conditions.
We’ve only spent a few thousand years experimenting with larger organizing states, and we are far from optimized to this new order.
You are correct, Len. To paraphrase E. O. Wilson, our basic problem is that we have reptilian brains, medieval institutions, and 21st-century weapons.
One small thought to add to your many excellent points: the problems you cite are all manifestations of human cognition as it evolved in the Pleistocene. The rationalism that you seek to protect is a civilized artifice, much superior in its ability to deal with the problems of civilization. But the central problem is the clash between our “natural” cognition and our “artificial” cognition.
My suggestion is that we can lessen this disjunction by somehow hitching the artificial rationalism to an element of the older cognition. In my case, this is accomplished through pride. I hitch rationalism to pride by thinking in terms of my personal integrity, which requires much more than “not lying to other people”. For me, integrity is “not lying to myself”. Thus, a purely emotional element of my personality (pride) becomes the mainstay of my rationalism.
The intensity of my dedication to rationalism is merely a reflection of the magnitude of my pride.
You are right about the clash of reptilian brains and the cerebral cortex. Perhaps we can overcome other elements of our primitive brains—lust, greed, aggression, territoriality, etc.—by hitching them to integrity or ethics in general. What I think you are getting at, though, is the connection between evolution and ethics, and the dispute between T. H. Huxley and Julian Huxley about the compatibility of the two.
Well reasoned. Well written.
“Loyalty – Our chances of surviving and flourishing increase if we are loyal to our in-group.”
I immediately thought of the tragic examples of Socrates and Spinoza. Thanks for your article!