Category Archives: Critical Thinking

Fox News: An Information Ecosystem For Fallacious Reasoning

© Darrell Arnold, Ph.D. – (Reprinted with Permission) http://darrellarnold.com/2018/07/16/fox-news-an-information-ecosystem-for-fallacious-reasoning/

Fox News has long functioned in the service of America’s conservative movement. What is particularly concerning is its willingness to do so even when that undermines truth and sound reasoning. One way it regularly undermines truth and clear reasoning is through subtext in the framing of its stories. Another is through editorial decisions about which stories to feature — not the framing of individual stories but the framing of its news environment. This creates a kind of informational ecosystem that facilitates fallacious reasoning for ideological purposes. As an example of the latter, on the day of the Helsinki Summit, Fox’s webpage finally featured, after many hours, a story about the “bipartisan backlash.” Yet another of its featured stories that day was about — you guessed it — Hillary and Bill Clinton. The day’s lead story thus dealt with the issue of the day. The other didn’t, but it served to remind the Fox viewer of all the “Clinton scandals.”

Fox’s decisions in this instance, as in so many others, border on propaganda, and the informational ecosystem they create facilitates fallacious reasoning in service to an ideological agenda. Through selective editorial placement, a modern media company like Fox does not exactly commit the Tu Quoque fallacy, sometimes known as Whataboutism (though it does this often enough in its commentary) — but it can certainly facilitate that fallacy among its audience, or create an environment for it.

Whataboutism is a version of the Tu Quoque fallacy. Tu Quoque means “you too,” as in the retort “look who’s talking?” The fallacy in its various forms functions as a diversion, directing people’s attention away from the issue under discussion by focusing on how those making an accusation are themselves guilty of the same kind of thing they accuse others of. In the context of a political scandal, Whataboutism often draws attention away from the scandal by pointing out that an opposing political party has committed scandals that are just as bad or worse.

In our given example, on the one hand, we have an extraordinarily serious scandal: a sitting U.S. president undermining his own intelligence agencies and the preceding U.S. administration, for no apparent reason other than that Vladimir Putin strongly protested the charges. Indeed, the event was serious enough that Senator John McCain called it “one of the most disgraceful performances by an American president in memory.” John Brennan, former director of the Central Intelligence Agency, said “Donald Trump’s press conference … rises to and exceeds the threshold of ‘high crimes and misdemeanors.’” Various Democrats also clearly condemned the president’s actions at the summit.

The bipartisan reaction to Trump’s performance is highlighted in the day’s main story. On the other hand, we have the “scandals” of the Clintons, which have been thoroughly investigated and dismissed. In the case of the Hillary Clinton “scandals,” for example, the Benghazi affair was the subject of Congress’s longest investigation, costing over $7 million. The email controversy was investigated by various bodies over several years. In both cases, no criminal wrongdoing was found. Yet a new story about the Clintons is placed a story or two beneath the day’s lead. So the Trump issue is presented, but the Fox audience is at the same time presented with the question: What about Hillary? What about the Clintons?

Fox is facilitating the Whataboutism fallacy by subtly suggesting a false equivalency between a very real scandal of the day and a trumped-up one. This is something the network has done regularly over the years and continues to this day, bringing up both of Hillary Clinton’s “scandals” on a regular basis. Through editorial decisions about which stories to feature and highlight (and to run regardless of credibility), and then about where to place those stories in proximity to others, media companies and information providers like Fox can facilitate fallacious reasoning, and in Fox’s case do so for reasons of ideology.

Though the observation here focuses on one story, an observer of Fox will easily find that this goes on regularly. Add to it the commentary of certain media figures and the regularity with which the network runs stories with little or no credibility, and we can see that Fox is essentially an information ecosystem that continually nudges its audience toward fallacious reasoning for ideological purposes. It has long done this in the service of Republican efforts to undermine democratic norms. Now it is apparently pleased to assume that role in support of an administration with clear authoritarian characteristics.

Cognitive Bias


I previously linked to a graph of all known cognitive biases, and I recently encountered a short BBC video that nicely captures four of them. It can be found at:
https://www.bbc.com/ideas/videos/is-your-brain-your-own-worst-enemy/p068vb09

What I found of particular interest was the G.I. Joe fallacy, which refers roughly to the mistaken idea that knowing is half the battle. In fact, knowing about our cognitive biases, for instance, doesn’t help much in overcoming them. Thus some scholars have nominated the idea that knowing is half the battle as one that must be retired.

As a philosopher, I would say that the G.I. Joe fallacy shows, among other things, that Socrates and Plato were mistaken in believing that knowledge is sufficient for virtue. As Aristotle knew, there is a large gap between knowing the right thing and doing it. In a sense this is depressing. Even if we know that our brains bias us in multiple ways, that knowledge seems to help little in overcoming those biases. The gap between knowing the good, true, and beautiful and doing the good, seeking the truth, and creating beauty is huge. Still, it seems to me better to know than not. Knowing may not be half the battle, but perhaps it is a tenth of the battle. And even that little bit is worth something. In the meantime, we should proceed full speed ahead with rewiring our brains.

A Graph of Cognitive Biases

Every Single Cognitive Bias in One Infographic

Courtesy of: Visual Capitalist – (To view, click on the link above.)

This is the most complete graph of cognitive biases I’ve ever seen. It is especially timely as the mechanisms of social and political control grow ever more sophisticated at manipulating human behavior based on an understanding of how poorly our brains work. Hopefully, an increased awareness of our many brain bugs will help us differentiate truth from falsity.

In fact, our very survival probably depends on combatting the influence of our reptilian brains and the medieval institutions they created in a world of increasing technological power. Unless we can find a way to enhance our moral and intellectual faculties, our extinction is likely if not inevitable.

Let us face up to what we really are, and transform ourselves.

Psychological Impediments to Good Thinking

(This article was reprinted in the online magazine of the Institute for Ethics & Emerging Technologies, April 16, 2017.)

“Nothing is so firmly believed as what we least know.” ~ Montaigne

Good reasoning is hard because we are largely non-rational and emotional beings.1 Here are some psychological impediments to good reasoning. Remember that I am not getting paid to share this information about cogent reasoning; I could make more money going on late-night TV and peddling nonsense, as so many others do. My goal is to educate.

1. Loyalty, the herd instinct, and provincialism impede good reasoning

Loyalty – Our chances of surviving and flourishing increase if we are loyal to our in-group.

Herd Instinct – Keeps our beliefs and actions within boundaries set by the group. Wanting status, we pontificate about things we know nothing about, and we don’t tend to change our minds or admit mistakes. We’d rather keep being wrong than admit we were wrong.

Provincialism – We identify with the ideas and behaviors of the in-group. We see things from our in-group’s point of view and reject unpleasant truths about our groups.

2. Prejudice, stereotypes, scapegoats, and partisan mind-sets 

Prejudice – Loyalty and provincialism lead to prejudice against out-groups, and to thinking in terms of

Stereotypes – Negative opinions about individuals or groups that aren’t justified by the evidence. This promotes intolerance of out-groups and tolerance of in-group foibles.

Scapegoats – People or groups whom we blame for our problems—Jews, Latinos, atheists, and today, especially, immigrants. Faced with complex problems, people accept scapegoating and then don’t have to be self-reflective. All of this leads to a

Partisan Mind-Set – Perceiving evidence and judging arguments from our side only. Good thinkers have an open mind about truth; they know a counterargument may be good.

3. Superstitious Beliefs – There are coincidences—bad things happen after mirrors break, and astrology columns are occasionally correct, just as broken clocks are right twice a day. But sensible beliefs are based on sufficient evidence, not on small or biased samples. Bad things happen on Friday the 13th just as they do on other days.
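To see why that last claim is exactly what chance predicts, here is a minimal simulation sketch in Python. The mishap probability and the simplified 30-day calendar are my own assumptions, chosen purely for illustration:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

DAYS = 365 * 100       # a century of days (hypothetical sample size)
P_MISHAP = 0.05        # assumed daily chance of some minor mishap

mishaps_on_13th = 0
mishaps_on_other_days = 0

for day in range(DAYS):
    day_of_month = (day % 30) + 1   # crude 30-day calendar; fine for the point
    if random.random() < P_MISHAP:
        if day_of_month == 13:
            mishaps_on_13th += 1
        else:
            mishaps_on_other_days += 1

# Compare rates, not raw counts: the 13th is only 1 of every 30 days.
rate_13th = mishaps_on_13th / (DAYS / 30)
rate_other = mishaps_on_other_days / (DAYS * 29 / 30)
print(f"mishap rate on the 13th:   {rate_13th:.3f}")
print(f"mishap rate on other days: {rate_other:.3f}")
# Both rates hover around 0.05; selective memory, not the calendar,
# makes the 13th seem unlucky.
```

Run it with different seeds and whichever day happens to look “unlucky” shifts around randomly, which is the point: small, biased samples of our own experience are no basis for belief.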

4. Wishful Thinking & Self-Deception – In general, we believe things we want to be true and deny things we find distasteful.

Wishful Thinking –  Believing things we want to be true but which are almost certainly not.

Self-deception – Believing something that at a deeper level we know isn’t true. (This can have tragic consequences. For example, think about colonialism.)

5. Rationalization and Procrastination 

Rationalization – A kind of self-deception in which we ignore or deny unpleasant evidence so we can feel justified in doing or believing what we want to do or believe. This often leads to

Procrastination – The favoring of immediate gratification over long-term goals. We have a bias for short-term gains over long-term ones.

6. Unconscious psychological strategies, or defense mechanisms

Suppression – We avoid thinking about stressful thoughts and thus avoid anxiety.

Denial – We cope with our situation by interpreting it as less stressful than it is.

7. Benefits of Self-Deception, Wishful Thinking, & Denial

Self-deception reduces stress and anxiety. Rejecting doubt in favor of belief may do the same. The placebo effect, or fondly recalling a bad past, may also show the value of self-deception. But while self-deception helps young people fight wars, it also helps leaders send young people to their deaths.

8. Pseudoscience & the Paranormal 

Scientists generally know what they’re talking about. That’s why cell phones, computers, cars, lights, airplanes, ships, GPS, antibiotics, vaccines, furnaces, air conditioners, TVs, dentistry, raincoats, hiking boots, etc. work. That’s why roughly half of all newborns no longer die before age two, as they did for almost all of human history. That’s why you have clean water from your tap! Yet people still believe in

Pseudoscience – About a third or more of Americans—in the 21st century!—believe in each of the following: ghosts, witches, alien UFOs, alien abductions, astrology, and more. Why are people so gullible?

One reason is that although real science produces results, it is esoteric and often tells us what we don’t want to know. It tells us the universe probably arose spontaneously out of nothing, that it is unimaginably large, that it will eventually die, that we are modified monkeys, and that we have cognitive biases against believing such things. But pseudoscience usually tells us positive things. The astrologer and fortune-teller predict good things and the medium tells us we can talk to our dead relatives.

Consider astrology. Why do people believe in it when it has been shown to be worthless? (If your destiny is determined by the star under which you were born, then all born under that star should have the same destiny. But they don’t.) Part of the reason people still accept it is that charlatans have found ways to make such nonsense seem plausible. They make horoscopes so vague that anyone can see themselves in the descriptions.

Or consider paranormal phenomena like ESP in all its varieties. Summarizing over a century of research, the National Research Council has concluded that there is no scientific justification whatsoever for believing in such things—not a single shred of evidence. Why then do people persist in believing? Research has shown that personal experience plays a large role in such beliefs. For example, a gambler attributes a winning streak to some special power rather than to the random fluctuation we call luck.

Premonitions are similar. They are just coincidences between thoughts and events. If you have enough thoughts, sometimes events will correspond to them. People remember when premonitions come true and forget when they don’t. (This is called confirmation bias.) While pseudoscience remembers successes and ignores failures, science puts its hypotheses to severe tests, with independent verification and repetition of observations and experiments. Real science doesn’t rely on coincidence or anecdote.
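To make the arithmetic behind premonitions concrete, here is a small, hypothetical Python sketch; the number of premonitions and the chance of an accidental match are assumptions chosen only for illustration:

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

N_PREMONITIONS = 1000  # assumed number of idle premonitions over the years
P_CHANCE_MATCH = 0.01  # assumed chance that any one matches a later event

hits = sum(random.random() < P_CHANCE_MATCH for _ in range(N_PREMONITIONS))
misses = N_PREMONITIONS - hits

print(f"premonitions that 'came true': {hits}")
print(f"premonitions quietly forgotten: {misses}")

# Confirmation bias reports only the hits: "ten of my premonitions came
# true!" sounds uncanny, yet it is exactly what chance predicts from a
# thousand guesses. Science counts the misses too.
```

Counting both columns is the “severe test” described above, and the gambler’s winning streak yields to the same analysis: enough random trials guarantee some streaks and some hits.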

9. Lack of a Good Sense of Proportion 

The basic reason we have so many cognitive bugs has to do with our evolutionary history. Behavior guided by the intellect is a late evolutionary development compared to responses motivated by desires and emotions, which arise in the more primitive parts of the brain. (So you may fear flying rather than driving even though flying is much safer. Or you may fear dying in a foreign terrorist attack even though you are far more likely to be killed by a fellow citizen.) Our primitive brains evolved in biological eras when almost nothing was known about cause and effect. Modern science is very recent.

_______________________________________________________________________

1. This is a summary of Chapter 6 of Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life, a great college textbook from which I have taught many times.

Do You Have A Right To Your Opinion? Trump & Millions of Illegal Votes

(This article was reprinted in the online magazine of the Institute for Ethics & Emerging Technologies, December 13, 2016.)

Trump: “I believe that cows can jump over the moon.”
Question: “Is that really true?”
Pence: “He has a right to his opinion.”
Conway: “He is presenting an alternative fact.”

Donald Trump recently tweeted: “In addition to winning the electoral college in a landslide, I won the popular vote if you deduct the millions of people who voted illegally.” Mike Pence defended this false statement by saying: “He’s entitled to express his opinion on that.” (Here is the video.) As someone who has devoted his life to the search for truth, I find such lying, obfuscation, and bad thinking painful to watch. I honestly believe that lying is the original source of most human suffering.

This exchange, and the recent piece “A philosophy professor explains why you’re not entitled to your opinion,” reminded me that about two years ago I wrote a five-part series on this blog about critical thinking. The first part was titled “The Basics of Critical Thinking Part 1: You Don’t Always Have A Right To Your Opinion.” Given the new post-truth world we inhabit, I thought it might be wise to post an excerpt from that post. Here it is.

Let’s begin by asking: are you always entitled to your own opinion? Consider, for example, that you claim evolution is “just” a theory. I point out that the word theory has a very special meaning in science—it means what ordinary people mean by “true beyond any reasonable doubt.” I explain that the theories of gravity, relativity, and the atom are theories in this scientific sense, and that they bring together millions of observations. I then explain that multiple branches of science converge on evolution—zoology, botany, genetics, molecular biology, geology, chemistry, anthropology, etc. I show you that virtually no legitimate biologist denies evolution. Now suppose you respond, “Well, I disagree, and I have a right to my opinion.” Is that relevant? No, it isn’t! I wasn’t claiming that you didn’t have a right to an opinion; I was showing you that your opinion is wrong. Being entitled to your opinion doesn’t show that your opinion corresponds to the facts; it just shows that you believe something.

… Now you do have a right to believe anything you want, no matter how groundless, if by entitled you mean the political or legal sense of rights. Free speech allows you to ignorantly profess that “the earth is flat” or “the moon is made of green cheese.” But you don’t have a right to believe just anything if by entitled you mean an epistemic right (one concerned with knowledge and truth). In that sense you are entitled to believe something only if you have good evidence, sound arguments, and so on. This is the distinction that causes difficulty. As a result, many people believe that their opinions are sacred and that others must handle them with care. And, when confronted with counterarguments, they don’t consider that they might be wrong; instead, they take offense.

To understand why you don’t have an epistemic right to your opinion, ask what duty I have that corresponds to your right to hold some opinion. (Having a right implies that others have a duty to respect it. If you have a right to free speech, I have a duty to let you speak.) Do I have an obligation to agree with you? Surely not, since supposedly I have a right to my own opinion, which may differ from yours. Do I have an obligation to listen to you? No, since I can’t listen to everyone, and some people, for example, are simply scientifically illiterate. I don’t consult them about physics. Do I have an obligation to let you keep your opinion? Not always. If you don’t see an oncoming car as you start to cross the street, then I ought to try to change your mind about crossing that street, assuming you don’t want to be hit by a car. Or, if you don’t see the rise of an authoritarian regime, I ought to try to change your mind about supporting it. And if someone is really interested in what’s true, they won’t take the presentation of counterevidence as an injury.

Of course, many people aren’t interested in what’s true; they just like believing certain things. If pressed about their opinions, they find it annoying and say: “I have a right to my opinions.” There are many reasons for this. Their false beliefs may be part of their group identity, they may find it painful to change their minds, they may be ignorant of other opinions, or they may profit from holding their opinions, etc.

But if someone continues to defend themselves with “I have a right to my opinion,” you can be assured of one thing—they aren’t interested in whether their opinion is true or not.