I think we owe ourselves a congratulations. We got through the holidays! The holiday-less, S.A.D.-inducing winter stretches before us, and the countdown to new TV shows and MLK Day begins.
Amid all of my complaining about 2016 and the politics of gift-giving, I had forgotten to expect one thing that can actually make the holidays challenging: just spending time with family…occupying the same dinner table, digging into the same refrigerator, watching the same movies with a group of people whose relation to us we never chose. It was only a matter of time before our great-uncle Fabio (we all have one) staggered through the doorway with all sorts of opinions about the president-elect, about the state of affairs, about the millennials. And it felt right to set him straight: “Uncle Fabio, let’s talk through this. The science says this… The facts say this…” In the best-case scenario, Uncle Fabio was stunned into silence; in the worst, he ran out in a raging fit.
When it comes to hot topics like politics or religion, rarely will our most rational facts actually persuade our opponents. Interestingly enough, no fewer than three Dans have weighed in on this: Dan Kopf, over at Quartz, recently covered Dan Kahan’s decade-long study of how even the most rational thinkers prioritize their own biases, a study that draws inspiration from Nobel Prize winner/Mockingbird muse Dan Kahneman’s concept of the two-system mind. “System 1” thinking is faster, more heuristic and emotion-driven; by contrast, “System 2” is the rational system, the slower thinking that relies heavily on empirical evidence and calculation.
Generally speaking, System 1 thinking is considered more biased and therefore less constructive, while System 2 should be more detached and therefore less partisan. “If only we would all just use our rational, scientific minds,” Kopf writes. “Then we could get past our disagreements. It’s a nice thought. Unfortunately, it’s wrong.” In his ten-year study, Kahan concluded not only that System 2 reasoning proves unhelpful in creating common ground, but that it can actually reinforce pre-existing biases.
Rather than use our best thinking to reach the truth, we use it to find ways to agree with others in our communities.
“The process is called biased assimilation,” says Kahan. “People will selectively credit and discredit information in patterns that reflect their commitment to certain values.” …
In one illustrative study, Kahan asked over 1,500 respondents whether they agreed or disagreed with the following statement: “There is solid evidence of recent global warming due mostly to human activity such as burning fossil fuels.” For these same respondents, Kahan also collected information on their political beliefs, and measured their “science intelligence”—a metric based on answers to questions developed by the National Science Foundation, Pew Research Center, and others. These questions are intended to gauge a combination of scientific knowledge and quantitative reasoning proficiency.
Quick interjection: Kahan’s “science intelligence” metric isn’t a ranking of general intelligence; rather, it’s meant to tease out the distinction between System 1 and System 2 thinkers.
When Kahan analyzed the data, he found that those with the least science intelligence actually hold less partisan positions than those with the most. A Conservative Republican with strong science intelligence will use those skills to find evidence against human-caused global warming, while a Liberal Democrat will find evidence for it. This is also true for issues like fracking, evolution, and the risks associated with gun possession: whatever your preconceived political belief on the issue, you’ll use your scientific intelligence to try to prove you’re right.
Kahan explains that “individuals can be expected to form identity-protective beliefs and to use all of the cognitive resources at their disposal to do so.” Whatever their level of science intelligence, those who rely most on System 2 thinking demonstrate the greatest attachment to preconceived beliefs, turning their very disposition to reason into a tool for defending those beliefs. Kahan goes on to explain:
Individuals highest in the critical reasoning dispositions associated with System 2 information processing were using their cognitive proficiencies to ferret out evidence consistent with their cultural or ideological predispositions and to rationalize the peremptory dismissal of evidence inconsistent with the same.
When we say “science proves this,” we are really saying “your beliefs are invalid, Uncle Fabio, and here’s why.” We’re ultimately trying to control our opponent, or at the very least impose an expectation on him: a law that will determine who is right(eous) and who is wrong. It will decide which one of us is the good guy, and which one of us is the bad guy.
Kahan’s research suggests, however, that using System 2 thinking as a means of controlling another person rarely works. As the Bible implies, the imposition of the law increases the trespass. When God tells Adam and Eve not to eat from the tree, what do they do? Eat from it. The more our “rational” argument accuses another’s beliefs, the more intensely they’ll defend those beliefs. This may explain why, in my apologetics phase, I succeeded in converting exactly zero people. (That said, in my non-apologetics phase, I’m still batting .000. Hmm…)
When we are accused and accusing, judging and being judged, we desperately self-justify in fear. Which may have something to do with Kahan’s next point, that System 2 thinkers tend to defend stances they hadn’t scientifically researched for themselves:
Perhaps Kahan’s most disconcerting finding is that people with more scientific intelligence are the quickest to align themselves politically on subjects they don’t know anything about. In one experiment, Kahan analyzed how people’s opinions on an unfamiliar subject are affected when they are given some basic scientific information, along with details about what people in their self-identified political group tend to believe about that subject. It turned out that those with the strongest scientific reasoning skills were the ones most likely to use the information to develop partisan opinions.
Kahan argues that this is actually a very rational way of using our best thinking. “A person who forms a position out of line with her cultural peers risks estrangement from the people on whom she depends for emotional and material support,” writes Kahan. Better to use your intellectual faculties to stick to the company line.
In other words, rationality works not to serve the truth but to serve itself, out of (rational) self-preservation. When we defend our political or religious positions, it’s less about defending those positions and more about defending ourselves as people, fragile as we are.
…ironically, I think I probably would have believed this before Kahan had researched it.