Pursuant to our recent posts concerning differentiating truth from falsity, especially in science, I happened upon a piece in the New York Times titled “When Belief and Facts Collide.” The author is Brendan Nyhan, PhD in political science from Duke and currently Assistant Professor of Government at Dartmouth. (Nyhan has been described as a “liberal to moderate” political blogger, although in 2006 “he came under attack from the editors [of The American Prospect] for unwarranted criticism of liberal pundits.”1)
Nyhan begins by asking, “Do Americans understand the scientific consensus about issues like climate change and evolution?” The answer, Nyhan found, is no. Moreover, “… beliefs on both topics are divided along religious and partisan lines. For instance, 46 percent of Republicans said there is not solid evidence of global warming, compared with 11 percent of Democrats.” This suggests that people may not be aware of the scientific consensus on such issues and need to be better informed. They may not know that evolution is as certain in science as gravity, or that 97% of climate scientists believe human activities are causing global warming.
However, some studies have found that knowing the science makes little difference to people’s beliefs. They may know the science but be unwilling to believe it when it contradicts cherished political or religious views. “This finding helps us understand why my colleagues and I have found that factual and scientific evidence is often ineffective at reducing misperceptions and can even backfire on issues like weapons of mass destruction, health-care reform, and vaccines. With science as with politics, identity often trumps the facts.”
What should we do? Nyhan suggests we might “try to break the association between identity and factual beliefs on high-profile issues–for instance, by making clear that you can believe in human-induced climate change and still be a conservative Republican … or an evangelical Christian …” He also argues we “need to reduce the incentives for elites to spread misinformation to their followers in the first place. Once people’s cultural and political views get tied up in their factual beliefs, it’s very difficult to undo regardless of the messaging that is used.” As for dissuading purveyors of misinformation, Nyhan suggests that increasing “the reputational costs for dishonest elites might be a more effective approach to improving democratic discourse.” (Or we could let factcheck.org or similar groups play a bigger role in informing the public. Whether this would work is another matter.)
And, as Nyhan notes,
The deeper problem is that citizens participate in public life precisely because they believe the issues at stake relate to their values and ideals, especially when political parties and other identity-based groups get involved … Those groups can help to mobilize the public and represent their interests, but they also help to produce the factual divisions that are one of the most toxic byproducts of our polarized era. Unfortunately, knowing what scientists think is ultimately no substitute for actually believing it.
In the end, I find myself at an impasse. As I argued in my last post, “When Should We Argue?,” some arguments are futile because, as E. O. Wilson said, people don’t want to know, they want to believe. I find this all so depressing. Still, I will conclude as I did in that post.
… as I age I find myself, as Thornton Wilder said, being weaned away from life. During this process we should try to better the world, while sustaining the hope that new generations will continue the endless fight for truth and justice. (In a future post I hope to address two of the greatest ideas in the history of human culture–truth and justice.)
And in my next post I will take up those two great ideas.