Is Truth an outdated concept?
Are we living in a post-truth world?
In 2005 the American Dialect Society's word of the year was "truthiness," popularized by Stephen Colbert on his satirical news show The Colbert Report, meaning "the truth we want to exist." In 2016 the Oxford Dictionaries nominated as its word of the year "post-truth," characterizing it as "relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief." In 2017 "fake news" increased in usage by 365 percent, earning the top spot on the "word of the year shortlist" of the Collins English Dictionary, which defined it as "false, often sensational, information disseminated under the guise of news reporting."
Are we living in a post-truth world of truthiness, fake news and alternative facts? Has all the progress we have made since the scientific revolution in understanding the world and ourselves been obliterated by a fusillade of social media postings and tweets? No. As Harvard University psychologist Steven Pinker observes in his resplendent new book Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (Viking, 2018), "mendacity, truth-shading, conspiracy theories, extraordinary popular delusions, and the madness of crowds are as old as our species, but so is the conviction that some ideas are right and others are wrong."
Even as pundits pronounced the end of veracity and politicians played fast and loose with the truth, the competitive marketplace of ideas stepped up with a new tool of the Internet age: real-time fact-checking. As politicos spin-doctored reality in speeches, fact-checkers at Snopes.com,
FactCheck.org and OpenSecrets.org rated them on their verisimilitude, with PolitiFact.com waggishly ranking statements as True, Mostly True, Half True, Mostly False, False, and Pants on Fire. Political fact-checking has even become clickbait (runner-up for the Oxford Dictionaries' 2014 word of the year), as PolitiFact's editor Angie Drobnic Holan explained in a 2015 article: "Journalists regularly tell me their media organizations have started highlighting fact-checking in their reporting because so many people click on fact-checking stories after a debate or high-profile news event."
Far from lurching backward, Pinker notes, today's fact-checking ethic "would have served us well in earlier decades when false rumors regularly set off pogroms, riots, lynchings, and wars (including the Spanish-American War in 1898, the escalation of the Vietnam War in 1964, the Iraq invasion of 2003, and many others)." And contrary to our medieval ancestors, he says, "few influential people today believe in werewolves, unicorns, witches, alchemy, astrology, bloodletting, miasmas, animal sacrifice, the divine right of kings, or supernatural omens in rainbows and eclipses."
Ours is called the Age of Science for a reason, and that reason is reason itself, which in recent decades has come under fire from cognitive psychologists and behavioral economists who assert that humans are irrational by nature and from postmodernists who aver that reason is a hegemonic weapon of patriarchal oppression. Balderdash! Call it "factiness," the quality of seeming to be factual when it is not. All such declarations are self-refuting, inasmuch as "if humans were incapable of rationality, we could never have discovered the ways in which they were irrational, because we would have no benchmark of rationality against which to assess human judgment, and no way to carry out the assessment," Pinker explains. "The human brain is capable of reason, given the right circumstances; the problem is to identify those circumstances and put them more firmly in place."
Despite the backfire effect, in which people double down on their core beliefs when confronted with contrary facts in order to reduce cognitive dissonance, an "affective tipping point" may be reached when the counterevidence is overwhelming and especially when the contrary belief becomes accepted by others in one's tribe. This process is helped along by "debiasing" programs in which people are introduced to the numerous cognitive biases that plague our species, such as the confirmation bias and the availability heuristic, and to the many ways not to argue: appeals to authority, circular reasoning, ad hominem and especially ad Hitlerum. Teaching students to think critically about issues by having them discuss and debate all sides, especially by articulating their own and another's position, is essential, as is asking, "What would it take for you to change your mind?" The latter is an effective thinking tool employed by Portland State University philosopher Peter Boghossian.
"However long it takes," Pinker concludes, "we must not let the existence of cognitive and emotional biases or the spasms of irrationality in the political arena discourage us from the Enlightenment ideal of relentlessly pursuing reason and truth." That's a fact.
Michael Shermer is publisher of Skeptic magazine (www.skeptic.com) and a Presidential Fellow at Chapman University. His new book is Heavens on Earth: The Scientific Search for the Afterlife, Immortality, and Utopia (Henry Holt, 2018).
This article was originally published with the title "Factiness". Source: Scientific American