Category Archives: Critical Thinking

Psychological Impediments to Good Thinking

“Nothing is so firmly believed as what we least know.” ~ Montaigne

Good reasoning is hard because we are largely non-rational and emotional beings.1 Here are some psychological impediments to good reasoning. Remember that I am not getting paid to share this information about cogent reasoning. I could make more money going on late night TV and peddling nonsense like so many others do. My goal is to educate.

1. Loyalty, the Herd Instinct, and Provincialism Impede Good Reasoning

Loyalty – Our chances of surviving and flourishing increase if we are loyal to our in-group.

Herd Instinct – Keeps our beliefs and actions within the boundaries set by our group. Wanting status, we pontificate about things we know nothing about, and we tend not to change our minds or admit mistakes. We’d rather keep being wrong than admit we were wrong.

Provincialism – We identify with the ideas and behaviors of the in-group. We see things from our in-group’s point of view and reject unpleasant truths about our groups.

2. Prejudice, Stereotypes, Scapegoats, and Partisan Mind-Sets

Prejudice – Loyalty and provincialism lead to prejudice against out-groups, and to thinking in terms of,

Stereotypes – Unfavorable opinions about individuals or groups that aren’t justified by the evidence. Stereotyping promotes intolerance of out-groups and tolerance of in-group foibles.

Scapegoats – People or groups whom we blame for our problems: Jews, Latinos, atheists, and, today especially, immigrants. Faced with complex problems, people accept scapegoating and then don’t have to be self-reflective. All of this leads to a

Partisan Mind-Set – Perceiving evidence and judging arguments from our side only. Good thinkers keep an open mind about truth; they know a counterargument may be a good one.

3. Superstitious Beliefs – There are coincidences: bad things happen after mirrors break, and astrology columns are occasionally correct, just as broken clocks are right twice a day. But sensible beliefs are based on sufficient evidence, not on small or biased samples. Bad things happen on Friday the 13th, just as they do on other days.

4. Wishful Thinking & Self-Deception – In general we believe things we want to be true and deny things we find distasteful.

Wishful Thinking –  Believing things we want to be true but which are almost certainly not.

Self-deception – Believing something that at a deeper level we know isn’t true. (This can have tragic consequences; think, for example, of colonialism.)

5. Rationalization and Procrastination 

Rationalization – A kind of self-deception in which we ignore or deny unpleasant evidence so we can feel justified in doing or believing what we want to do or believe. This often leads to

Procrastination – The favoring of immediate gratification over healthy long-term goals. We have a bias for short-term gains over long-term ones.

6. Unconscious Psychological Strategies, or Defense Mechanisms

Suppression – We avoid thinking about stressful thoughts and thus avoid anxiety.

Denial – We deny our situation by interpreting it as less stressful than it is.

7. Benefits of Self-Deception, Wishful Thinking, & Denial

Self-deception reduces stress and anxiety. Rejecting doubt for belief may do the same. The placebo effect, or fondly recalling a bad past, may also show the value of self-deception. But the benefit cuts both ways: while self-deception helps young people fight wars, it also helps leaders send young people to their deaths.

8. Pseudoscience & the Paranormal 

Scientists generally know what they’re talking about. That’s why cell phones, computers, cars, lights, airplanes, ships, GPS, antibiotics, vaccines, furnaces, air conditioners, TVs, dentistry, rain coats, hiking boots, etc. work. That’s why about half of all newborns no longer die before age two, as they did for almost all of human history. That’s why you have clean water from your tap! Yet people still believe in

Pseudoscience – About a third or more of Americans—in the 21st century!—believe in each of the following: ghosts, witches, alien UFOs, alien abductions, astrology, and more. Why are people so gullible?

One reason is that although real science produces results, it is esoteric and often tells us what we don’t want to know. It tells us the universe probably arose spontaneously out of nothing, that it is unimaginably large, that it will eventually die, that we are modified monkeys, and that we have cognitive biases against believing such things. But pseudoscience usually tells us positive things. The astrologer and fortune-teller predict good things and the medium tells us we can talk to our dead relatives.

Consider astrology. Why do people believe in it when it has been shown to be worthless? (If your destiny is determined by the star under which you are born, then all born under that star should have the same destiny. But they don’t.) Part of the reason people still accept it is that charlatans have found ways to make such nonsense seem plausible. They make horoscopes so vague that anyone can see themselves in their description.

Or consider paranormal phenomena like ESP in all its varieties. Summarizing over a century of research, the National Research Council has concluded that there is no scientific justification whatsoever for believing in such things—not a single shred of evidence. Why then do people persist in believing? Research has shown that personal experience plays a large role in such beliefs. For example, a gambler attributes a winning streak to some special power rather than to the random fluctuation we call luck.
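
A quick way to see how ordinary such streaks are is to simulate them. This is my own minimal sketch, not from the research it summarizes; the 200-flip session and the fair odds are invented purely for illustration:

```python
import random
random.seed(1)  # reproducible illustration

def longest_streak(outcomes):
    """Return the length of the longest run of identical outcomes."""
    best = run = 1
    for prev, cur in zip(outcomes, outcomes[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# 200 fair even-odds bets -- a modest night of gambling.
flips = [random.random() < 0.5 for _ in range(200)]
print(longest_streak(flips))  # typically around 7: streaks need no special power
```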

Premonitions are similar: they are just coincidences between thoughts and events. If you have enough thoughts, sometimes events will correspond to them. People remember when premonitions come true and forget when they don’t. (This is called confirmation bias.) While pseudoscience remembers successes and ignores failures, science puts its hypotheses to severe tests, with independent verification and repetition of observations and experiments. Real science doesn’t rely on coincidence or anecdote.
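
The arithmetic behind this is easy to check. In the sketch below (all numbers invented for illustration), even a tiny chance of a match per idle thought yields a steady supply of "prophetic" hits:

```python
import random
random.seed(2)

thoughts_per_year = 1000  # idle premonitions per year, an invented figure
p_match = 0.01            # chance any one of them matches some later event

hits = sum(random.random() < p_match for _ in range(thoughts_per_year))
misses = thoughts_per_year - hits
print(hits, misses)  # roughly 10 uncanny "hits" -- and ~990 forgotten misses
```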

9. Lack of a Good Sense of Proportion 

The basic reason we have so many cognitive bugs has to do with our evolutionary history. Behavior guided by the intellect is a late evolutionary development compared to responses motivated by desires and emotions, which arise in the more primitive parts of the brain. (So you may fear flying rather than driving even though the former is much safer. Or you may fear dying in a foreign terrorist attack even though you are far more likely to be killed by a fellow citizen.) Our primitive brains evolved in biological eras when almost nothing was known about cause and effect. Modern science is very recent.

_______________________________________________________________________

1. This is a summary of Chapter 6 of Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life, a great college textbook from which I have taught many times.

Do You Have A Right To Your Opinion? Trump & Millions of Illegal Votes

(This article was reprinted in the online magazine of the Institute for Ethics & Emerging Technologies, December 13, 2016.)

Trump: “I believe that cows can jump over the moon.”
Question: “Is that really true?”
Pence: “He has a right to his opinion.”
Conway: “That’s just an alternative fact.”

Donald Trump recently tweeted: “In addition to winning the electoral college in a landslide, I won the popular vote if you deduct the millions of people who voted illegally.” Mike Pence defended this false statement by saying: “He’s entitled to express his opinion on that.” (Here is the video.) As someone who has devoted his life to a search for truth, I find such lying, obfuscation, and bad thinking painful to watch. I honestly believe that lying is the original source of most human suffering.

This exchange and the recent piece, “A philosophy professor explains why you’re not entitled to your opinion,” reminded me that about two years ago I wrote a five-part series on this blog about critical thinking. The first part was titled “The Basics of Critical Thinking Part 1: You Don’t Always Have A Right To Your Opinion.” Given the new post-truth world we inhabit, I thought it might be wise to post an excerpt from that post. Here it is.

Let’s begin by asking: Are you always entitled to your own opinion? Consider, for example, that you claim evolution is “just” a theory. I point out that the word theory has a very special meaning in science—it means what normal people mean by “true beyond any reasonable doubt.” I explain to you that the theories of gravity, relativity, and the atom are theories in the scientific sense, and that they bring together millions of observations. I then explain that multiple branches of science converge on evolution—zoology, botany, genetics, molecular biology, geology, chemistry, anthropology, etc. I show you that virtually no legitimate biologist denies evolution. Now suppose you respond, “Well, I disagree, and I have a right to my opinion.” Is that relevant? No, it isn’t! I wasn’t claiming that you didn’t have a right to an opinion; I was showing you that your opinion is wrong. Being entitled to your opinion doesn’t show that your opinion corresponds to the facts; it just shows that you believe something.

… Now you do have a right to believe anything you want, no matter how groundless, if by entitled you mean the political or legal interpretation of rights. Free speech allows you to ignorantly profess “the earth is flat” or “the moon is made of green cheese.” But you don’t have a right to believe anything if by entitled you mean an epistemic right—one concerned with knowledge and truth. In that sense you are entitled to believe something only if you have good evidence, sound arguments, and so on. This is the distinction that causes difficulty. As a result, many people believe that their opinions are sacred and that others must handle them with care. And, when confronted with counterarguments, they don’t consider that they might be wrong; instead, they take offense.

To understand why you don’t have an epistemic right to your opinion, ask what duty I have that corresponds to your right to hold some opinion. (Having a right implies that others have a duty to respect it. If you have a right to free speech, I have the duty to let you speak.) Do I have the obligation to agree with you? Surely not, since supposedly I have a right to my opinion, which may differ from yours. Do I have the obligation to listen to you? No, since I can’t listen to everyone, and some people, for example, are just scientifically illiterate. I don’t consult them about physics. Do I have the obligation to let you keep your opinion? Not always. If you don’t see an oncoming car as you start to cross the street, then I ought to try to change your mind about crossing that street, assuming that you don’t want to be hit by a car. Or, if you don’t see the rise of an authoritarian regime, I ought to try to change your mind about supporting it. And if someone is really interested in what’s true, they won’t take the presentation of counterevidence as an injury.

Of course many persons aren’t interested in what’s true; they just like believing certain things. When pressed about their opinions, they get annoyed and say: “I have a right to my opinions.” There are many reasons for this. Their false belief may be part of their group identity, they may find it painful to change their minds, they may be ignorant of other opinions, or they may profit from holding their opinion.

But if someone continues to defend themselves with “I have a right to my opinion,” you can be assured of one thing—they aren’t interested in whether their opinion is true or not.

The Basics of Critical Thinking

Critical thinking is “careful, deliberate determination of whether one should accept, reject, or suspend judgment about a claim and the degree of confidence with which one accepts or rejects it.” (Moore & Parker, Critical Thinking)

The problem is that much of our thinking is biased, distorted, partial, uninformed, or prejudiced. Yet the quality of our lives depends on the quality of our thoughts. Bad thinking costs us time, money, and possibly our lives; good thinking may profit us and save both. But good thinking is hard and takes practice.

Cogent (good) reasoning consists of: 1) believable premises; 2) consideration of all relevant information; and 3) valid conclusions drawn from those premises.

Believable premises – This assumes we have some well-informed background beliefs about the world, so we can determine whether a premise is believable.

No relevant info passed over – We need to avoid the temptation to disregard contrary evidence.

Valid reasoning – When the premises support the conclusion—in other words, when the conclusion follows from the premises—the reasoning is valid. When the reasoning is valid and the premises are also true, we have a sound argument.
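
Because validity is a purely formal matter, it can even be checked mechanically for simple argument forms. Here is a minimal sketch of my own (not from the textbook) that brute-forces the truth table for two-variable arguments:

```python
from itertools import product

def is_valid(premises, conclusion):
    """Valid = no row of the truth table makes every premise true
    while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            return False  # counterexample found
    return True

implies = lambda a, b: (not a) or b

# Modus ponens (valid): if P then Q; P; therefore Q.
print(is_valid([lambda p, q: implies(p, q), lambda p, q: p],
               lambda p, q: q))   # True

# Affirming the consequent (invalid): if P then Q; Q; therefore P.
print(is_valid([lambda p, q: implies(p, q), lambda p, q: q],
               lambda p, q: p))   # False
```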

Some wrong ideas about cogent reasoning – Good reasoning is not relative to people, cultures, religions, etc. (There is no male or female, black or white logic.) When you violate deductive reasoning you contradict yourself; when you violate inductive reasoning you deny evidence and experience. The way the world works is not relative to people, cultures, religions, etc. Still, self-interest, prejudice, and narrow-mindedness lead people to reason poorly.

Background Beliefs – Background beliefs are crucial to determining whether premises are believable and whether no relevant info has been omitted. “That is why bringing one’s background beliefs to bear often is the most important task in evaluating an argument for cogency… ignorance is not bliss. It just renders us incapable of intelligently evaluating claims, premises, arguments, and other sorts of rhetoric we all are subject to every day.”

Kinds of Background Beliefs – We have beliefs about both facts [whether the St. Louis Cardinals won the 1964 baseball World Series] and values [whether it is a good thing that people play baseball]. Beliefs can also be true or false. We need to constantly examine our background beliefs to weed out false ones. Education [as opposed to indoctrination] helps us acquire true beliefs and rids us of false ones. Beliefs also differ in how firmly they should be believed. “The trick is to believe firmly what should be believed, given the evidence, and believe less firmly, or not at all, what is less well supported by the evidence.”
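
Believing “firmly what should be believed, given the evidence” has a standard formal model: Bayes’ theorem. A minimal sketch, with likelihoods I have invented purely for illustration:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) given the prior P(H) and the two likelihoods."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start agnostic (0.5), then observe three independent pieces of evidence,
# each four times likelier if the hypothesis is true than if it is false.
belief = 0.5
for _ in range(3):
    belief = bayes_update(belief, 0.8, 0.2)
    print(round(belief, 3))  # 0.8, 0.941, 0.985 -- firmness tracks the evidence
```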

Worldviews or Philosophies – Children tend to believe what they are told; thus most of us believe, even as adults, what we were told as children. [For example, an almost perfect predictor of a person’s religious beliefs is the beliefs of their parents.] These basic beliefs we might call our worldviews or philosophies. “They tend to be the most deeply ingrained and most resistant to amendment of all our background beliefs.” We work very hard to keep them [so as not to create cognitive dissonance]. It is crucial that our worldviews, if they are to consist of true background beliefs, “contain at least a few modestly well-founded beliefs about important scientific theories.”

Insufficiently Grounded Beliefs – Most people have strongly held beliefs about things about which they know almost nothing. In order to think well then, we must weed out poorly grounded [false] beliefs. It is crucial—if we are to think well—that we have well-founded [true] beliefs to support our worldview since “…worldviews are like lenses that cause us to see the world in a particular way or filters through which we process all new ideas and information. Reasoning based on a grossly inaccurate or shallow worldview tends to yield grossly inaccurate, inappropriate, or self-defeating conclusions…”

Two Vital Kinds of Background Beliefs – Beliefs about human nature, and beliefs about the reliability of information sources.

Science to the Rescue

the most accurate information comes from the well-established sciences of physics, chemistry, biology, … the scientific enterprise is an organized, ongoing, worldwide activity that builds and corrects from generation to generation…Absolutely no one, starting from scratch, could hope to obtain in one lifetime anything remotely resembling the sophisticated and accurate conclusions of any of the sciences …

Summary of critical thinking – Critical thinking is higher-order thinking, as opposed to lower-order thinking. Lower-order thinking 1) is unreflective, 2) relies on gut intuition, and 3) is largely self-serving. Higher-order thinking 1) is reflective, 2) uses logic and reason to analyze and assess ideas, and 3) is consistently fair.

More specifically, critical thinking overcomes the most common tendencies of poor thinking: egocentric and sociocentric thinking.

Egocentric thinking is characterized by ideas like: it’s true because a) I believe it; b) I want to believe it; c) I’ve always believed it; d) it is in my interest to believe it; etc.

Sociocentric thinking refers to the extent persons internalize the prejudices of their society/culture. Such persons: a) uncritically accept that their culture is best; b) internalize group norms without questioning; c) blindly conform to group restrictions; d) ignore the insights of other cultures; e) fail to realize that mass media shapes the news from the point of view of their culture; f) ignore their culture’s history, etc.

In contrast to unreflective thinking, critical thinking is fair-minded and open-minded—to think critically is to reason well. It is the kind of reasoning that is the essential ingredient in solving life’s problems. I have written elsewhere in this blog about good thinking, especially in my recent column “We Fear Thought.” But I would summarize my thoughts on the topic, as I did for generations of university students, by saying—good thinking is an essential ingredient in living well.

[All quotes are from the first chapter of Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life.]

The Basics of Critical Thinking Part 5: More Crimes Against Logic

Finishing our discussion of Crimes Against Logic…

Begging the Question – Begging the question is assuming (usually in the form of a premise) the conclusion we intend to prove. Here are some examples: “Freedom is good for society because it is conducive to the good of the community.” “Chloroform renders people unconscious because it’s soporific.” “The reason that there is a big demand for a Harvard education is because everyone wants to get into the school.” Or consider this argument:

  • Abortion is unjustified killing;
  • Unjustified killing is murder;
  • Thus, abortion is murder.

Abortion may be murder, but this argument doesn’t show it because it begs the question: since “unjustified killing” is just another name for murder, the premise assumes the very conclusion it is supposed to prove. Or try this:

  • The Bible says that Yahweh is the one true god.
  • The Bible cannot be mistaken because it is the word of Yahweh.
  • Thus, Yahweh is the one true god.

Yahweh may be the one true god, but this argument doesn’t show it because it begs the question: the Bible’s reliability is established by assuming the existence of the very god whose existence is at issue.

Coincidence – Coincidence, regarding events, refers to the appearance of a meaningful connection where there is none. Humans often see patterns in what are just random fluctuations—just listen to post-game sports analysis. And people often assume that if one thing follows another, the first caused the second—that correlation equals causation. (I wore blue jeans and then it rained; thus my jeans caused the rain.) This fallacy is called “post hoc ergo propter hoc.”

The only good way to test the causal connection between, for example, taking a drug and getting better is a scientifically controlled experiment. With two similar groups, we test how quickly people recover with the drug (test group) and without it (control group). Unless carefully conducted experiments show something works, there is absolutely no reason to believe it does.
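
Here is a minimal sketch of why the control group matters; all the rates are invented. If 30% of patients recover on their own, a trial without a control group will credit that spontaneous recovery to the drug:

```python
import random
random.seed(3)

def trial(n=1000, base_rate=0.30, drug_effect=0.15):
    """Simulate recovery rates in a treated group and a control group."""
    treated = sum(random.random() < base_rate + drug_effect for _ in range(n))
    control = sum(random.random() < base_rate for _ in range(n))
    return treated / n, control / n

treated_rate, control_rate = trial()
print(treated_rate, control_rate)
# Roughly 0.45 vs 0.30: only the gap between groups is evidence the drug works.
```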

In fact, our very existence is coincidental, but most of us think we result from a cosmic plan. We might even conclude that gods exist because human life was so improbable. But this is like saying that all lotteries are fixed. Yes, it is extremely unlikely that any particular person wins the lottery, but that doesn’t mean it’s fixed when someone does. It’s extremely unlikely that you’ll be dealt four aces playing poker, but that doesn’t mean you cheated when you get them.
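
The lottery arithmetic is worth making explicit (the figures are invented): each individual ticket is a hopeless bet, yet a winner is almost guaranteed.

```python
p_win = 1 / 10_000_000      # chance any given ticket wins
tickets_sold = 30_000_000   # tickets in play

p_no_winner = (1 - p_win) ** tickets_sold
print(f"P(this ticket wins):    {p_win:.0e}")            # 1e-07
print(f"P(at least one winner): {1 - p_no_winner:.2f}")  # about 0.95
```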

Humans are well known to have cognitive biases; the list on Wikipedia names nearly 100 of them. For instance, Thomas Gilovich found that most people thought the sequence “OXXXOXXXOXXOOOXOOXXOO” looked non-random when, in fact, it has several characteristics maximally probable for a random stream.
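
You can check Gilovich’s point directly. In a truly random two-symbol stream, adjacent symbols should differ about half the time; people intuitively expect more alternation than that. A few lines of Python confirm the sequence above sits exactly at chance:

```python
seq = "OXXXOXXXOXXOOOXOOXXOO"
alternations = sum(a != b for a, b in zip(seq, seq[1:]))
print(alternations / (len(seq) - 1))  # 0.5 -- exactly the rate chance predicts
```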

Statistics – Something may be statistically true, but the conclusion you draw from those stats is debatable. Do cancer rates go up because of air pollution, chemicals in food, people living longer, some combination of these, or something else altogether? Only scientific experiments can sort this out. Moreover, the statistics you hear are often mistaken. For example, you may have heard that people use only 10% of their brains, yet this is false. Consider the following:

  • The claim that 35% of British children live in poverty vastly overstates the case, since most of what they need—education, health care, and housing—is provided to all.
  • Even if the stats accurately report what people say they do, you can’t be sure they would actually do what they say they would do.
  • You need to know the source of the stats so as to avoid sample bias (see the sketch after this list). If we ask members of the US Table Tennis Association how many of them enjoy table tennis, we would probably get a figure close to 100%. If we asked starving children in Africa the same question, we would probably get a figure close to 0%.
  • Stats are often just plain wrong, and nobody bothers to check them.
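
As promised, here is a sketch of how the sampling frame alone can manufacture a statistic. The population and the 10% figure are invented:

```python
import random
random.seed(4)

# A population in which 10% genuinely enjoy table tennis.
population = [random.random() < 0.10 for _ in range(100_000)]

unbiased_sample = random.sample(population, 1000)
fans_only = [person for person in population if person]  # the "club members"
biased_sample = random.sample(fans_only, 1000)

print(sum(unbiased_sample) / 1000)  # about 0.10, near the true rate
print(sum(biased_sample) / 1000)    # 1.0 -- the frame, not the world, answered
```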

Morality Fever – Moral fervor isn’t a refutation of a position.

  • What’s Wicked is False – Just because it’s bad to believe something doesn’t make that belief false. It may be bad to not believe in the gods, but the gods may still not exist.
  • What’s Beneficial is True – Just because it’s beneficial to believe something doesn’t make the belief true. It may be good to believe in the gods, but the gods may still not exist.
  • The Meek Shall Inherit the Earth – Just because someone is a victim of injustice doesn’t mean their opinions are correct. And just because you feel guilty about something you did doesn’t mean the victims of your actions are virtuous.

Conclusion – “If the matter at hand is something you genuinely care about, then you should seek more than ever to believe the truth about it. And rationality is merely that way of thinking that gives your beliefs the greatest chance of being true.”(156)

The Basics of Critical Thinking Part 4: Deceptive Language

Continuing our discussion of Crimes Against Logic…

Empty Words – Language is often empty, vague, and obscure. Still, precise terminology (or jargon) is sometimes needed, as in the sciences. At other times, though, jargon disguises simple ideas under a barrage of verbiage, often just to sound impressive.

One way language misleads is with weasel words—words that appear to make little or no change to the content of a statement but actually drain all or most of the content from it. Typical weasel words are may, can, could, might, arguably, etc. Other devices for deception are hooray words—justice, life, freedom—and boo words—murder, taxes, Hitler. Politicians love such words because listeners then believe the politician shares their concerns. The use of scare quotes—to signal that what a word names is only alleged—likewise leaves you unsure of the author’s meaning.

Deceptive Language – Language is often used to persuade and confuse people. To see this, consider that words have both cognitive meaning and emotive meaning. For example, the terms bureaucrat, government official, and public servant may not differ much cognitively, but they elicit different emotional reactions. Con artists, advertisers, and politicians manipulate our desires and beliefs by appealing to these emotions.

Recent examples are endless. Why was the name of the US “War Department” changed to “Department of Defense”? Do you think that was accidental? If you want to get rid of the “Clean Air Act,” don’t call it the “Dirty Air Act,” call it the “Clear Skies Initiative.” If you want to get rid of health care, don’t call it the Affordable Care Act, call it Obamacare. And don’t say torture, say enhanced interrogation; don’t say insanity, say battle fatigue; don’t say we attacked first, say preemptive action; don’t say occupation forces, say coalition forces; don’t say terrorists, say freedom fighters; don’t say freedom fighters, say terrorists (this is not a typo); don’t say war, say “Operation Desert Shield” or “Operation Iraqi Freedom” or “Operation Awesome!” And in addition to political and military doublespeak, there is legal, bureaucratic, and governmental doublespeak—language that deliberately obscures. (To understand this better, read George Orwell.)

Inconsistency – I can’t say “every adult in France drinks wine” and then say “every adult in France doesn’t drink wine” without contradicting myself. Both statements can’t be true. Contradictions may be easy to spot if you state them explicitly like this, but often the inconsistency is harder to spot. Suppose I make the following argument:

  • Everything denounced in the Bible should be illegal
  • Abortion is denounced in the Bible, thus
  • Abortion should be illegal

To be consistent, I must denounce and praise everything the Bible denounces and praises. Independent of the fact that there is no clear Biblical prohibition of abortion, if one consistently followed the Bible on moral matters one would have to condemn, often under penalty of death: working on the sabbath, eating shellfish, approaching an altar with poor eyesight, getting haircuts, touching the skin of dead pigs, planting two different crops in the same field, contacting women during menstruation, cursing, rebelling against parents, and more. Thus, to be consistent, you can’t pick and choose to suit your prejudices.

Equivocation – In logic an expression is used equivocally in an argument when it has two different meanings—it is used in one place one way and another way in another place. For example, if I say that clubs don’t hurt because I joined one and I’m fine, whereas you say you were hit by one and they do hurt, then we are equivocating on the use of the word club.

Similarly, the words Mormon, Republican, and Marxist have many different meanings. For example, suppose I say that being a Mormon makes you a moral person. Suppose you respond that Mormons killed a number of people traveling through Utah in the mid-19th century. I might then say, “But those weren’t real Mormons!” The problem here may be that we are equivocating on the term Mormon; we are using the term differently.

One of us might be referring to the acceptance of Mormon doctrines—Joseph Smith was led by an angel to dig up and interpret gold plates with the use of a magic hat, etc.—whereas the other might mean “not being a murderer.” To defend my claim that the killers weren’t real Mormons, I would have to show that being Mormon isn’t just accepting the stories in the Book of Mormon; it also involves not murdering. But then I have changed the definition of Mormon: now it means accepting the story of Smith and not murdering. Of course, on this definition, all it means to say that being a Mormon leads you to do good things is to say that being a Mormon leads you to do good things. Needless to say, the statement has now been emptied of its significance.

In the above case the definition of Mormon has been changed and emptied of all meaning. If you do this continually, you can never be refuted. For example, a government that wanted to torture people could simply change the definition of torture to mean something it doesn’t do. If government critics say, “You torture by the standards set out in the Geneva Conventions,” it can reply, “We don’t torture,” because by torture it now means only what violates its own standards—standards laxer than those conventions. (Unfortunately, such equivocation has awful real-world consequences.)

Equivocation is used to deceive people, to make them draw unjustified conclusions. We could use any word—wealthy, criminal, democratic, free, great—to describe a person or a country and mean many different things.