The Science of Why We Don't Believe Science

How our brains fool us on climate, creationism, and the vaccine-autism link.

Another study gives some inkling of what may be going through people's minds when they resist persuasion. Northwestern University sociologist Monica Prasad and her colleagues wanted to test whether they could dislodge, among those most likely to believe it—Republican partisans from highly GOP-friendly counties—the notion that Saddam Hussein and Al Qaeda were secretly collaborating. So the researchers set up a study in which they discussed the topic with some of these Republicans in person. They would cite the findings of the 9/11 Commission, as well as a statement in which George W. Bush himself denied his administration had "said the 9/11 attacks were orchestrated between Saddam and Al Qaeda."

As it turned out, not even Bush's own words could change the minds of these Bush voters—just 1 of the 49 partisans who originally believed the Iraq-Al Qaeda claim changed his or her mind. Far more common was resistance to the correction, whether by coming up with counterarguments or by simply being unmovable:

Interviewer: [T]he September 11 Commission found no link between Saddam and 9/11, and this is what President Bush said. Do you have any comments on either of those?

Respondent: Well, I bet they say that the Commission didn't have any proof of it but I guess we still can have our opinions and feel that way even though they say that.

The same types of responses are already being documented on divisive topics facing the current administration. Take the "Ground Zero mosque." Using information from the political myth-busting site FactCheck.org, a team at Ohio State presented subjects with a detailed rebuttal to the claim that "Feisal Abdul Rauf, the Imam backing the proposed Islamic cultural center and mosque, is a terrorist-sympathizer." Yet among those who were aware of the rumor and believed it, fewer than a third changed their minds.

A key question—and one that's difficult to answer—is how "irrational" all this is. On the one hand, it doesn't make sense to discard an entire belief system, built up over a lifetime, because of some new snippet of information. "It is quite possible to say, 'I reached this pro-capital-punishment decision based on real information that I arrived at over my life,'" explains Stanford social psychologist Jon Krosnick. Indeed, there's a sense in which science denial could be considered keenly "rational." In certain conservative communities, explains Yale's Kahan, "People who say, 'I think there's something to climate change,' that's going to mark them out as a certain kind of person, and their life is going to go less well."

This may help explain a curious pattern Nyhan and his colleagues found when they tried to debunk the myth that President Obama is a Muslim. When a nonwhite researcher was administering their study, research subjects were amenable to changing their minds about the president's religion and updating incorrect views. But when only white researchers were present, GOP survey subjects in particular were more likely to believe the Obama Muslim myth than before. The subjects were using "social desirability" to tailor their beliefs (or stated beliefs, anyway) to whoever was listening.

Which leads us to the media. When people grow polarized over a body of evidence, or a resolvable matter of fact, the cause may be some form of biased reasoning, but they could also be receiving skewed information to begin with—or a complicated combination of both. In the Ground Zero mosque case, for instance, a follow-up study showed that survey respondents who watched Fox News were more likely to believe the Rauf rumor and three related ones—and they believed them more strongly than non-Fox watchers.

Okay, so people gravitate toward information that confirms what they believe, and they select sources that deliver it. Same as it ever was, right? Maybe, but the problem is arguably growing more acute, given the way we now consume information—through the Facebook links of friends, or tweets that lack nuance or context, or "narrowcast" and often highly ideological media that have relatively small, like-minded audiences. Those basic human survival skills of ours, says Michigan's Arthur Lupia, are "not well-adapted to our information age."

If you wanted to show how and why fact is ditched in favor of motivated reasoning, you could find no better test case than climate change. After all, it's an issue where you have highly technical information on one hand and very strong beliefs on the other. And sure enough, one key predictor of whether you accept the science of global warming is whether you're a Republican or a Democrat. The two groups have been growing more divided in their views about the topic, even as the science becomes more unequivocal.

So perhaps it should come as no surprise that more education doesn't budge Republican views. On the contrary: In a 2008 Pew survey, for instance, only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college-educated Republicans. In other words, higher education correlated with an increased likelihood of denying the science on the issue. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of the science.

Other studies have shown a similar effect: Republicans who think they understand the global warming issue best are least concerned about it; and among Republicans and those with higher levels of distrust of science in general, learning more about the issue doesn't increase concern. What's going on here? Well, according to Charles Taber and Milton Lodge of Stony Brook, one insidious aspect of motivated reasoning is that political sophisticates are prone to be more biased than those who know less about the issues. "People who have a dislike of some policy—for example, abortion—if they're unsophisticated they can just reject it out of hand," says Lodge. "But if they're sophisticated, they can go one step further and start coming up with counterarguments." These individuals are just as emotionally driven and biased as the rest of us, but they're able to generate more and better reasons to explain why they're right—and so their minds become harder to change.

That may be why the selectively quoted emails of Climategate were so quickly and easily seized upon by partisans as evidence of scandal. Cherry-picking is precisely the sort of behavior you would expect motivated reasoners to engage in to bolster their views—and whatever you may think about Climategate, the emails were a rich trove of new information upon which to impose one's ideology.
