Have you noticed the increase, this election, in talk about “fact-checkers”? I can’t seem to escape articles and tweets about post-debate/-speech tallies of “checked facts.” While no doubt we could use a little help wading through all the campaign hyperbole and Wiki-what-have-you, it sometimes seems that we’ve forgotten the time-tested cliché that one man’s fact is another man’s fiction (and always has been). It may be a cynical sentiment, but it is also one that strikes me as far less cynical than the apparently widely held conviction that one or both candidates are unremorseful liars, foisting their contempt for raw data on an unsuspecting public. The truth is, and I’m convinced we all know this on some level, what unites the two candidates is their incredible ability to see the same facts as evidence of conflicting ideologies. Their steadfast loyalty to the facts as they read them, you might say. This isn’t to suggest there’s no such thing as “hard numbers”–of course there are!–just that no one anywhere has ever been able to engage with those numbers non-emotionally, checking(!) their intuitions at the door. I know this because I do it all the time, and so do you. It’s part and parcel of our identity as fallen men and women: we are inveterate and inveterately lousy fact checkers.

On that note, here we go with part three of our four-part series on Jonathan Haidt’s marvelous The Righteous Mind: Why Good People Are Divided by Politics and Religion. In part one, we introduced Haidt’s central argument, that when it comes to politics–and reason in general–intuitions come first and strategic reasoning second. We are not ideological “free agents” during election season; we are all deeply identified with a particular set of intuitions that expresses itself in all sorts of self-justifying ways. In part two, we took a closer look at the role of reason in this light, how the primary (but not exclusive!) purpose of strategic reasoning is confirmatory rather than exploratory, and how the debate culture bears this out, to say nothing of our personal relationships. We even made the bold claim that human beings care more about appearance than truth. (Given the nature of this approach, it should come as no surprise that Haidt has provoked such a reaction from certain heavily invested quarters.)

This time we’ll delve a little deeper into the mechanics of how we are able to pull off this political self-justification with such consistency and creativity. And to do so, we need to look at one of the most closet Lutheran sections in the entire book, almost laughably so, in which Haidt unpacks the Must/Can distinction:

When my son, Max, was three years old, I discovered that he’s allergic to ‘must’. When I would tell him that he must get dressed so that we can go to school (and he loved to go to school), he’d scowl and whine. The word must is a little verbal handcuff that triggered in him the desire to squirm free.

The word ‘can’ is so much nicer: “Can you get dressed, so that we can go to school?” To be certain that these two words were really night and day, I tried a little experiment. After dinner one night, I said, “Max, you must eat ice cream now.”

“But I don’t want to!”

Four seconds later: “Max, you can have ice cream if you want.”

“I want some!”

The difference between Can and Must is the key to understanding the profound effects of self-interest on reasoning. It’s also the key to understanding many of the strangest beliefs–in UFO abductions, quack medical treatments, and conspiracy theories.

The social psychologist Tom Gilovich studies the cognitive mechanisms of strange beliefs. His simple formulation is that when we want to believe something, we ask ourselves, “Can I believe it?” Then, we search for supporting evidence, and if we find even a single piece of pseudo-evidence, we can stop thinking. We now have permission to believe. We have a justification, in case anyone asks.

In contrast, when we don’t want to believe something, we ask ourselves “Must I believe it?” Then we search for contrary evidence, and if we find a single reason to doubt the claim, we can dismiss it. You only need one key to unlock the handcuffs of must.

Psychologists now have file cabinets full of findings on “motivated reasoning,” showing the many tricks people use to reach the conclusions they want to reach. When subjects are told that an intelligence test gave them a low score, they choose to read articles criticizing (rather than supporting) the validity of IQ tests. When people read a (fictitious) scientific study that reports a link between caffeine consumption and breast cancer, women who are heavy coffee drinkers find more flaws in the study than do men and less caffeinated women…

The difference between a mind asking “Must I believe it?” versus “Can I believe it?” is so profound that it even influences visual perception. Subjects who thought they’d get something good if a computer flashed up a letter rather than a number were more likely to see the ambiguous figure as the letter B, rather than the number 13.

If people can literally see what they want to see–given a bit of ambiguity–is it any wonder that scientific studies often fail to persuade the general public? Scientists are really good at finding flaws in studies that contradict their own views, but it sometimes happens that evidence accumulates across many studies to the point where scientists must change their minds. I’ve seen this happen in my colleagues (and myself) many times, and it’s part of the accountability system of science–you’d look foolish clinging to discredited theories. But for nonscientists, there is no such thing as a study you must believe. It’s always possible to question the methods, find an alternative interpretation of the data, or, if all else fails, question the honesty or ideology of the researchers.

As always – and this is almost certainly my own confirmation biases talking – these findings are soaked in the reality of Original Sin, what GK Chesterton so brilliantly referred to as the only “empirically verifiable doctrine” (the facts are in!). You and I are in a very real sense captive to intuitions that are outside of our control, and like the God-complex junkies that we are, we will bend the world to fit our understanding of it. That is, “must” may be a handcuff, but the unceasing compulsion to fashion cardboard keys is the real ball and chain. We insist on being the final arbiters of truth and meaning–not anyone else–certainly not God. We see what we want to see; we hear what we want to hear; we believe what we want to believe, and at no point is this more blatant than during a presidential election season. The illusion of objectivity is just that: an illusion. Perhaps this is the intellectual aspect of what St. Paul meant when he wrote about being “slaves to sin”. But again, this is not reason to despair. The God of the Bible is concerned with those who are stuck–emotionally, intellectually, spiritually–in webs of their own making. Which is everyone. Truth, to the extent that it can be known, is revealed to us, not by us. Now comes the jaw-dropping part:

Webster’s Third New International Dictionary defines “delusion” as “a false conception and persistent belief unconquerable by reason in something that has no existence in fact.” As an intuitionist, I’d say that the worship of reason is itself an illustration of one of the most long-lived delusions in Western history: the rationalist delusion. It’s the idea that reasoning is our most noble attribute, one that makes us like the gods (for Plato) or that brings us beyond the “delusion” of believing in gods (for the New Atheists). The rationalist delusion is not just a claim about human nature. It’s also a claim that the rational caste (philosophers or scientists) should have more power, and it usually comes along with a utopian program for raising more rational children.

From Plato through Kant and Kohlberg, many rationalists have asserted that the ability to reason well about ethical issues causes good behavior. They believe that reasoning is the royal road to moral truth, and they believe that people who reason well are more likely to act morally.

But if that were the case, then moral philosophers–who reason about ethical principles all day long–should be more virtuous than other people. Are they? The philosopher Eric Schwitzgebel tried to find out. He used surveys and more surreptitious methods to measure how often moral philosophers give to charity, vote, call their mothers, donate blood, donate organs, clean up after themselves at philosophy conferences, and respond to emails purportedly from students. And in none of these ways are moral philosophers better than other philosophers or professors in other fields.

Schwitzgebel even scrounged up the missing-book lists from dozens of libraries and found that academic books on ethics, which are presumably borrowed mostly by ethicists, are more likely to be stolen or just never returned than books in other areas of philosophy. In other words, expertise in moral reasoning does not seem to improve moral behavior, and it might even make it worse.

Haidt is almost paraphrasing St. Paul when he talks about how “knowledge puffs up, but love builds up” (1 Cor 8:1). Information and self-knowledge, regardless of how reliable, may have all sorts of uses, but on their own they are not enough to change a person or their behavior (seldom even their mind). Preachers know this–at least, the good ones do. There’s a difference between a sermon and a lecture, after all. And there’s a huge difference between information about God and God himself. The example of the missing ethics library books is about as perfect an example of this reality as one could find. That is, formal expertise on the subject of right-doing has not created more upstanding citizens; it has merely made people better at justifying their actions.

Perhaps this is just another long-winded way of saying that ‘justification’ is not just some boutique theological abstraction; it lies behind an embarrassing amount of our daily stress, striving, and divisions. So where’s the hope? Ultimately, it’s in the one who justifies the ungodly of course. But maybe there’s a present-tense upside as well. Having identified our own curved-in interaction with politics and politicians, we may find that we’re marginally more patient with the ideological autopilot in ourselves and others, an empathy that *might* help us take the facts on their own terms a bit more easily, or at least allow us to… survive November.

Last but not least: Partisan Narratives and the Beginning of Universal Sympathy!