If Facts Won’t Change Our Mind, What Will?

From a blogging perspective, it’s one of my favorite days of the month.

It’s the day that Granted arrives in my inbox.

Every month, Adam Grant, professor of psychology at the Wharton School of the University of Pennsylvania and best-selling author, releases a newsletter called Granted that shares some of the research he is currently working on as well as some of his favorite research by others.

The newsletter is usually good source material for one to two blog posts, and today’s newsletter is no exception.

Here’s his description of an article, Why Facts Don’t Change Our Minds, by Elizabeth Kolbert (the author of the best-selling The Sixth Extinction) that appeared in a recent issue of The New Yorker:

We spot the weaknesses in other people’s arguments, but we’re blind to our own. One of the most important articles I’ve read in a long time.

How could I pass up reading an article that comes so highly endorsed?

And it was a fascinating look at how people stick to their opinions, even when presented with facts that clearly contradict them.

The article starts off describing a study done at Stanford in 1975, where the researchers concluded, “Once formed, impressions are remarkably perseverant.”

A few years later, a new set of Stanford students was recruited for a related study. In this study, the researchers noted that even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs.” In this case, the failure was “particularly impressive,” since the research study only used two data points, not enough information to generalize from.

Kolbert notes that while such results may have been shocking in the 1970s, today, after thousands of experiments have supported such results, it is clear that reasonable-seeming people are often totally irrational.

What’s interesting now is to figure out why people act in such a manner. She cites the work of Hugo Mercier and Dan Sperber, authors of the book, The Enigma of Reason, who claim that “Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.”

The two cognitive scientists then go on to say that reason is an adaptation to the hypersocial niche humans have evolved for themselves. Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.

They use confirmation bias as an example, noting that while such a trait defies reason, it must have had some value to survive as a trait. The two authors prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own. They note that for our ancestors, there was little advantage in reasoning clearly, while much was to be gained from winning arguments.

Kolbert then shares the work of another pair of cognitive scientists, Steven Sloman and Philip Fernbach. They, too, believe sociability is the key to how the human mind functions or, perhaps more pertinently, malfunctions, and have written a book, “The Knowledge Illusion: Why We Never Think Alone”. They have identified what they refer to as “the illusion of explanatory depth”: people believe that they know way more than they actually do.

They note that we’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate that we can hardly tell where our own understanding ends and others’ begins.

“As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write.

To me, that’s a scary thought. People often share, even push, those strong feelings onto others; suddenly those feelings become facts, and anything that appears contrary to such “facts” gets dismissed.

The last bit of research that Kolbert shares is that of a father-daughter team, Jack Gorman, a psychiatrist, and his daughter, Sara Gorman, a public-health specialist. The duo wrote “Denying to the Grave: Why We Ignore the Facts That Will Save Us”, which probes the gap between what science tells us and what we tell ourselves. Their concern is with those persistent beliefs that are not just demonstrably false but also potentially deadly, such as the belief that vaccines are dangerous or that owning a gun makes you safer.

They also bring up confirmation bias, citing research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs. “It feels good to ‘stick to our guns’ even if we are wrong,” they observe.

The Gormans don’t just want to catalogue the ways we go wrong; they want to correct for them. But in trying to do so, they encounter the very problems they have enumerated. Providing people with accurate information doesn’t seem to help; they simply discount it. Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science.

So what to do?

Well, when it comes to evaluating policy proposals, Sloman and Fernbach suggest that if we spent less time pontificating and more time trying to work through the implications of those proposals, we’d realize how clueless we are and moderate our views.

I think part of this is also the inability of many people to simply admit they are wrong about something, and thus they stick by their original assertions, even in the face of overwhelming evidence against them.

It would be nice if we could make our decisions using “just the facts”.*

*Apparently Sergeant Joe Friday never used the phrase, “Just the facts, ma’am.” The phrase was actually, “All we want are the facts, ma’am.” In a post about the importance of using facts, I thought you should know…
