Category Archives: Critical Thinking

The Will to Doubt: Summary of Bertrand Russell’s “Free Thought and Official Propaganda”


Conway Hall, 25 Red Lion Square, London, WC1R 4RL

What is wanted is not the will-to-believe, but the wish to find out, which is its exact opposite. ~ Bertrand Russell

In 1922 Bertrand Russell delivered his Conway Memorial Lecture, “Free Thought and Official Propaganda,” to the South Place Ethical Society, the oldest surviving freethought
organization in the world and the only remaining ethical society in the United Kingdom. (It is now called the Conway Hall Ethical Society.) The lecture was later included in his anthology The Will to Doubt.

In the lecture Russell aims to: 1) advocate for freedom of expression; 2) champion the will to doubt; 3) explain the origins of dogmatism; and 4) promote critical thinking.

Free Expression

Russell begins by noting his agreement with the common definition of “free thought” as the rejection of popular religious beliefs.

I am myself a dissenter from all known religions, and I hope that every kind of religious belief will die out. I do not believe that, on the balance, religious belief has been a force for good. Although I am prepared to admit that in certain times and places it has had some good effects, I regard it as belonging to the infancy of human reason …

However, Russell argues that the term should also refer more broadly to having, and being allowed to express, any opinion without penalty. Yet many ideas—for example, anarchism or polygamy—are considered so immoral that we don't tolerate them. But suppressing unpopular ideas is exactly the attitude that permitted torture during the Inquisition.

Russell then describes incidents in his own life to illustrate the lack of freedom of thought.

  1. He was raised Christian against the wishes of his dying father.
  2. He lost the Liberal Party nomination for Parliament because he was an agnostic.
  3. He was denied a Fellowship at Trinity College because he was considered too “anti-clerical.” And when he later expressed opposition to World War I, he was fired.

Russell concludes this section by advocating total freedom of expression.

The Will to Doubt

Next, Russell turns to the importance of the will to doubt. He was responding to William James's notion of the will to believe. James had claimed that even without evidence (or with conflicting evidence), one might be justified in choosing to believe in something—Christianity, for example—simply because the belief may have beneficial outcomes. But this "will to believe" binds one to many untruths and halts the search for further truths.

Russell contrasts such an attitude with what he calls “the will to doubt,” which is choosing to remain skeptical as a means of eventually understanding more truth.

William James used to preach the “will-to-believe.” For my part, I should wish to preach the “will-to-doubt.” None of our beliefs are quite true; all have at least a penumbra of vagueness and error. The methods of increasing the degree of truth in our beliefs are well known; they consist in hearing all sides, trying to ascertain all the relevant facts, controlling our own bias by discussion with people who have the opposite bias, and cultivating a readiness to discard any hypothesis which has proved inadequate. These methods are practiced in science, and have built up the body of scientific knowledge … In science, where alone something approximating to genuine knowledge is to be found, [its] attitude is tentative and full of doubt.

In religion and politics on the contrary, though there is as yet nothing approaching scientific knowledge, everybody considers it de rigueur to have a dogmatic opinion, to be backed up by inflicting starvation, prison, and war, and to be carefully guarded from argumentative competition with any different opinion. If only men could be brought into a tentatively agnostic frame of mind about these matters, nine-tenths of the evils of the modern world would be cured. War would become impossible, because each side would realize that both sides must be in the wrong. Persecution would cease. Education would aim at expanding the mind, not at narrowing it. [People] would be chosen for jobs on account of fitness to do the work, not because they flattered the irrational dogmas of those in power.

As an example of the benefits of this kind of skepticism, Russell describes Albert Einstein's overturning of the conventional wisdom of physics and Darwin's contradicting of the Biblical literalists. As soon as there was convincing evidence for these truths, scientists provisionally accepted them. But they didn't dogmatically regard them as the final word, incapable of further refinement.

Russell states the conclusion of this section in a single, concise sentence: “What is wanted is not the will-to-believe, but the wish to find out, which is its exact opposite.”

Dogmatism

Yet despite the fact that rational doubt, or fallibilism, is so important, individuals and cultures often adopt an irrational certainty about complicated issues. Why? Russell believes this results partly from what he calls “the inherent irrationality and credulity of average human nature.” But three other agencies exacerbate these natural tendencies:

1 – Education — Public education doesn't teach children healthy attitudes toward learning; instead, it often indoctrinates them with patently false dogma. As he puts it:

Education should have two objects: first, to give definite knowledge—reading and writing, language and mathematics, and so on; secondly, to create those mental habits which will enable people to acquire knowledge and form sound judgments for themselves. The first of these we may call information, the second intelligence.

2 – Propaganda — People aren’t taught to weigh the evidence and form original opinions, so they have little protection against dubious or false claims. As Russell states: “The objection to propaganda is not only its appeal to unreason, but still more the unfair advantage which it gives to the rich and powerful.”

3 – Economic pressure — The State and the political class use their control of finance and the economy to impose their ideas, restricting the choices of those who disagree. They want conformity. In Russell's words:

There are two simple principles which, if they were adopted, would solve almost all social problems. The first is that education should have for one of its aims to teach people only to believe propositions when there is some reason to think that they are true. The second is that jobs should be given solely for fitness to do the work.

This second point led Russell to emphasize tolerance: “The protection of minorities is vitally important; and even the most orthodox of us may find himself in a minority some day, so that we all have an interest in restraining the tyranny of majorities.”

Critical Thinking

And tolerance for Russell connects with the will to doubt: “If there is to be toleration in the world, one of the things taught in schools must be the habit of weighing evidence, and the practice of not giving full assent to propositions which there is no reason to believe true.” While Russell doubts that our moral defects can be easily improved, he argues that we can improve our intellectual virtue. Note the prescience of his ideas regarding disinformation:

Therefore, until some method of teaching virtue has been discovered, progress will have to be sought by improvement of intelligence rather than of morals. One of the chief obstacles to intelligence is credulity, and credulity could be enormously diminished by instructions as to the prevalent forms of mendacity. Credulity is a greater evil in the present day than it ever was before, because, owing to the growth of education, it is much easier than it used to be to spread misinformation, and, owing to democracy, the spread of misinformation is more important than in former times to the holders of power.

Russell concludes by asking how we might nurture a world where critical thinking reigns.

If I am asked how the world is to be induced to adopt these two maxims — namely: (1) that jobs should be given to people on account of their fitness to perform them; (2) that one aim of education should be to cure people of the habit of believing propositions for which there is no evidence—I can only say that it must be done by generating an enlightened public opinion. And an enlightened public opinion can only be generated by the efforts of those who desire that it should exist.

My brief thoughts

If we were educated to think freely and critically, which itself encourages the will to doubt, the human condition would improve. Only if we emphasize the truth, rather than lies and propaganda, can we create a world in which we all can survive and flourish. After a lifetime of pursuing truth, I have concluded that lying may be the greatest sin of all.

Fox News: An Information Ecosystem For Fallacious Reasoning

© Darrell Arnold, Ph.D. – (Reprinted with Permission)  http://darrellarnold.com/2018/07/16/fox-news-an-information-ecosystem-for-fallacious-reasoning/

Fox News has long functioned in the service of America’s conservative movement. What is particularly concerning is its willingness to do so even when this undermines truth and sound reasoning. One way it regularly undermines truth and clear reasoning is through subtext in the framing of its stories. Another is through editorial decisions about which stories to feature — not the framing of individual stories but the framing of the news environment itself. This creates a kind of informational ecosystem that facilitates fallacious reasoning for ideological purposes. As an example of the latter, on the day of the Helsinki Summit, Fox’s webpage finally, after many hours, featured a story about “bipartisan backlash.” Yet another of the day’s featured stories was about — you guessed it — Hillary and Bill Clinton. The lead story thus dealt with the issue of the day; the other didn’t, but served to remind the Fox viewer of all the “Clinton scandals.”

Fox’s decisions in this instance, as in so many others, border on propaganda, and the informational ecosystem they create facilitates fallacious reasoning in service to an ideological agenda. Through selective editorial placement, a modern media company like Fox does not exactly commit the Tu Quoque fallacy, sometimes known as Whataboutism (though it does this often enough in its commentary), but it can certainly facilitate the fallacy among its audience, or create an environment for it.

Whataboutism is a version of the Tu Quoque fallacy. Tu Quoque means “you too” or, roughly, “look who’s talking.” The fallacy in its various forms functions as a diversion, directing people’s attention away from the issue under discussion by focusing on how the accusers are themselves guilty of the same kind of thing they accuse others of. In the context of a political scandal, Whataboutism draws attention away from the scandal by pointing out that an opposing political party has also committed scandals just as bad or worse.

In our given example, on the one hand, we have an extraordinarily serious scandal of a sitting U.S. president undermining his own intelligence agencies and the preceding U.S. administration, with no apparent reason other than the fact that Vladimir Putin very strongly protested the charges. Indeed, the event was serious enough that Senator John McCain called it “one of the most disgraceful performances of an American president in memory.” John Brennan, former director of the Central Intelligence Agency, said “Donald Trump’s press conference … rises to and exceeds the threshold of ‘high crimes and misdemeanors.’” Various Democrats also clearly condemned the president’s actions at the summit.

The bipartisan reaction to Trump’s performance was highlighted in the main news story of the day. On the other hand, we have the “scandals” about the Clintons, which have been thoroughly investigated and dismissed. In the case of the Hillary “scandals,” for example, the Benghazi investigation was Congress’s longest, costing over 7 million dollars. The email controversy was investigated by various bodies over several years. In neither case was criminal wrongdoing found. Yet a new story about the Clintons is placed a story or two under the lead story of the day. So the Trump issue is presented, but the Fox audience is simultaneously presented with the question: What about Hillary? What about the Clintons?

Fox facilitates the Whataboutism fallacy as it subtly suggests a false equivalence between a very real scandal of the day and a trumped-up one. This is something it has done regularly over the years and continues to this day, bringing up Hillary’s controversies on a regular basis. Through editorial decisions about which stories to feature and highlight (and to run regardless of credibility), and then by decisions about how to place those stories in proximity to others, media companies and information providers like Fox can facilitate fallacious reasoning, and in Fox’s case do so for reasons of ideology.

Though the observation here focuses on one story, an observer of Fox will easily find that this goes on regularly. Add the regular commentary of certain media figures and the regularity with which Fox runs stories with little credibility at all, and we can see that it is essentially an information ecosystem that continually nudges its audience toward fallacious reasoning for ideological purposes. It has long done this in the service of Republican moves to undermine democratic norms. Now it is apparently pleased to assume that role in support of an administration with clear authoritarian characteristics.

Cognitive Bias


I previously linked to a graph of all known cognitive biases, and I recently encountered a short BBC video which nicely captures four of them. It can be found at:
https://www.bbc.com/ideas/videos/is-your-brain-your-own-worst-enemy/p068vb09

What I found of particular interest was the G.I. Joe fallacy, which refers roughly to the idea that knowing is half the battle. In fact, knowing about our cognitive biases, for instance, doesn’t help much in overcoming those biases. Thus some scholars have nominated “knowing is half the battle” as an idea that must be retired.

As a philosopher, I would say that the G.I. Joe fallacy shows, among other things, that Socrates and Plato were mistaken in believing that knowledge is sufficient for virtue. As Aristotle knew, there is a large gap between knowing the right thing and doing it. In a sense this is depressing: even if we know that our brains bias us in multiple ways, that knowledge seems to help little in overcoming those biases. The gap between knowing the good, true, and beautiful and doing the good, seeking the truth, and creating beauty is huge. Still, it seems to me better to know than not. Knowing may not be half the battle, but perhaps it is a tenth of the battle. And even that little bit is worth something. In the meantime, we should proceed full speed ahead with rewiring our brains.

A Graph of Cognitive Biases

Every Single Cognitive Bias in One Infographic

Courtesy of: Visual Capitalist – (To view, click on the link above.)

This is the most complete graph I’ve ever seen of cognitive biases. It is especially timely as the mechanisms for social and political control grow ever more sophisticated at manipulating human behavior based on an understanding of how poorly our brains work. Hopefully, an increased awareness of our many brain bugs will help us to differentiate between truth and falsity.

In fact, our very survival probably depends on combatting the influence of our reptilian brains and the medieval institutions they created in a world of increasing technological power. Unless we can find a way to enhance our moral and intellectual faculties, our extinction is likely if not inevitable.

Let us face up to what we really are, and transform ourselves.

Psychological Impediments to Good Thinking

(This article was reprinted in the online magazine of the Institute for Ethics & Emerging Technologies, April 16, 2017.)

“Nothing is so firmly believed as what we least know.” ~ Montaigne

Good reasoning is hard because we are largely non-rational and emotional beings.1 Here are some psychological impediments to good reasoning. Remember that I am not getting paid to share this information about cogent reasoning; I could make more money going on late-night TV and peddling nonsense, as so many others do. My goal is to educate.

1. Loyalty, the herd instinct, and provincialism impede good reasoning

Loyalty – Our chances of surviving and flourishing increase if we are loyal to our in-group.

Herd Instinct – Keeps our beliefs and actions within the boundaries set by the group. Wanting status, we pontificate about things we know nothing about, and we tend not to change our minds or admit mistakes. We’d rather keep being wrong than admit we were wrong.

Provincialism – We identify with the ideas and behaviors of the in-group. We see things from our in-group’s point of view and reject unpleasant truths about our groups.

2. Prejudice, stereotypes, scapegoats, and partisan mind-sets 

Prejudice – Loyalty and provincialism lead to prejudice against out-groups, and to thinking in terms of,

Stereotypes – Unfavorable opinions about individuals or groups that aren’t justified by the evidence. This promotes intolerance of out-groups and tolerance of in-group foibles.

Scapegoats – People or groups whom we blame for our problems—Jews, Latinos, atheists, and today, especially, immigrants. Faced with complex problems, people accept scapegoating and then don’t have to be self-reflective. All of this leads to a

Partisan Mind-Set – Perceiving evidence and judging arguments from our side only. Good thinkers have an open mind about truth; they know a counter argument may be good.

3. Superstitious Beliefs – There are coincidences—bad things happen after mirrors break, and astrology columns are occasionally correct, just as broken clocks are right twice a day. But sensible beliefs are based on sufficient evidence, not on small or biased samples. Bad things happen on Friday the 13th, just as they do on other days.

4. Wishful Thinking & Self-Deception – In general, we believe things we want to be true and deny things we find distasteful.

Wishful Thinking –  Believing things we want to be true but which are almost certainly not.

Self-deception – Believing something that at a deeper level we know isn’t true. (This can have tragic consequences. For example, think about colonialism.)

5. Rationalization and Procrastination 

Rationalization – A kind of self-deception in which we ignore or deny unpleasant evidence so we can feel justified in doing or believing what we want. This often leads to

Procrastination – Favoring immediate gratification over long-term goals. We have a bias for short-term gains over long-term ones.

6. Unconscious psychological strategies, or defense mechanisms

Suppression – We avoid thinking about stressful thoughts and thus avoid anxiety.

Denial – We deny our situation by interpreting it as less stressful.

7. Benefits of Self-Deception, Wishful Thinking, & Denial

Self-deception reduces stress and anxiety, and rejecting doubt for belief may do the same. The placebo effect, or fondly recalling a bad past, may also illustrate the value of self-deception. But while self-deception helps young people fight wars, it also helps leaders send young people to their deaths.

8. Pseudoscience & the Paranormal 

Scientists generally know what they’re talking about. That’s why cell phones, computers, cars, lights, airplanes, ships, GPS, antibiotics, vaccines, furnaces, air conditioners, TVs, dentistry, raincoats, hiking boots, etc. work. That’s why about half of all newborns no longer die before age two, as they did for almost all of human history. That’s why you have clean water from your tap! Yet people still believe in

Pseudoscience – About a third or more of Americans—in the 21st century!—believe in each of the following: ghosts, witches, alien UFOs, alien abductions, astrology, and more. Why are people so gullible?

One reason is that although real science produces results, it is esoteric and often tells us what we don’t want to know. It tells us the universe probably arose spontaneously out of nothing, that it is unimaginably large, that it will eventually die, that we are modified monkeys, and that we have cognitive biases against believing such things. But pseudoscience usually tells us positive things. The astrologer and fortune-teller predict good things and the medium tells us we can talk to our dead relatives.

Consider astrology. Why do people believe in it when it has been shown to be worthless? (If your destiny is determined by the star under which you are born, then all born under that star should have the same destiny. But they don’t.) Part of the reason people still accept it is that charlatans have found ways to make such nonsense seem plausible. They make horoscopes so vague that anyone can see themselves in their description.

Or consider paranormal phenomena like ESP in all its varieties. Summarizing over a century of research, the National Research Council has concluded that there is no scientific justification whatsoever for believing in such things—not a single shred of evidence. Why then do people persist in believing? Research has shown that personal experience plays a large role in such beliefs. For example, a gambler attributes a winning streak to some special power rather than to the random fluctuation we call luck.

Premonitions are similar. They are just coincidences between thoughts and events. If you have enough thoughts, sometimes events will correspond to them. People remember when premonitions come true and forget when they don’t. (This is called confirmation bias.) While pseudoscience remembers successes and ignores failures, science puts its hypotheses to severe tests, with independent verification and repetition of observations and experiments. Real science doesn’t rely on coincidence or anecdote.
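The arithmetic behind this point is easy to check for yourself. Here is a minimal sketch in Python (the numbers are illustrative assumptions, not data: 1,000 passing thoughts in a year, 365 equally likely daily events): with enough random guesses, a handful of “successful premonitions” show up by chance alone, no special powers required.

```python
import random

def premonition_hits(n_thoughts=1000, n_outcomes=365, trials=200, seed=0):
    """Simulate a year of 'premonitions': each of n_thoughts random guesses
    picks one of n_outcomes possible events, and one event actually occurs
    at random. Returns the average number of chance matches ('hits')."""
    rng = random.Random(seed)
    total_hits = 0
    for _ in range(trials):
        total_hits += sum(
            1
            for _ in range(n_thoughts)
            # A "hit": the guessed event happens to match the actual event.
            if rng.randrange(n_outcomes) == rng.randrange(n_outcomes)
        )
    return total_hits / trials

# Expected hits per person: n_thoughts / n_outcomes = 1000/365, about 2.7.
avg = premonition_hits()
```

So the average person in this toy model gets roughly three “premonitions that came true” every year purely by luck; remembering those three and forgetting the other 997 is all it takes to feel psychic.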

9. Lack of a Good Sense of Proportion 

The basic reason we have so many cognitive bugs has to do with our evolutionary history. Behavior guided by the intellect is a late evolutionary development compared to responses motivated by desires and emotions, which engage the more primitive parts of the brain. (So you may fear flying rather than driving even though the former is much safer, or fear dying in a foreign terrorist attack even though you are far more likely to be killed by a fellow citizen.) Our primitive brains evolved in biological eras when almost nothing was known about cause and effect. Modern science is very recent.

_______________________________________________________________________

1. This is a summary of Chapter 6 of Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life, a great college textbook from which I have taught many times.