
This Is How You Should Talk to a Climate-Change Denier

The complicated science of discussing risk

If you read the non-enthralling 36-page summary of a report about to come out from the U.N. Intergovernmental Panel on Climate Change, as I did last week, you may have been struck by something faintly odd. Most news stories interpreted the communiqué as the strongest statement yet from the world’s top climate experts—more than 800 from 39 countries—that human-induced global warming will irreversibly transmogrify earth, sea, and sky unless carbon emissions are capped soon. The reporters weren’t wrong, but they missed something, perhaps because it was so obvious it slipped beneath notice. It was the tone. The document made heavy use of italics (“virtually certain,” “very likely”) and language (“unequivocal,” “unprecedented”) not usually stooped to by scientists. The writers focused on their “confidence in the validity” of their findings rather than on the findings themselves. In short, this was no neutral act of scientific communication. It was a rebuttal to those who have done nothing to mitigate the risks of climate change, because they refuse to admit its plain facts.

Why do people persist in underestimating risk in the face of a clear scientific consensus? I don’t think I exaggerate when I say that lack of belief in climate change is the main obstacle to keeping the planet a place on which humans and other species can live comfortably. Do the skeptics (roughly a third of all Americans) act in bad faith, or do they just think badly? Are they cynical, ill-educated, innumerate, distrustful of science, or pawns of the merchants of doubt? Do they watch too much Fox News, whose website, predictably, called the report “an embarrassment, self-serving and beyond misleading”?

The conundrum encompasses more than global warming. Scientists, politicians, and regulators are constantly trying to figure out which risks will alarm people. Why is bioengineering, the crossbreeding of biology with technology, scary, but nanotechnology, the manipulation of tiny particles that can change the fundamental properties of things, not scary—at least not so far?

Why did we stop holding protests about nuclear power? Why was the human papillomavirus vaccine anathematized when given to teenage girls, even though we’ve been giving another vaccine for a sexually transmitted disease—hepatitis B—to infants for years?

Clearly, scientists can’t manage public reactions to risk as well as they’d like. History has made fools of environmentalists who believed that people would change their behavior if they only knew how bad global warming could be. Some psychologists, such as Daniel Kahneman, say that most people just aren’t very good at deliberation; they process difficult scientific material too quickly and emotionally to estimate risk accurately. Another hypothesis targets the right-wing personality, claiming that it is close-minded, complexity-averse, and prone to forcing all evidence into a partisan framework.

Lately, though, a different explanation of flawed risk assessment has gained currency, one with the virtue of not dismissing half the population as ignorant, stupid, or biased by temperament. It’s called the cultural cognition of risk, and its best-known advocate is a handsome, puckish law and psychology professor at Yale named Dan Kahan. He thinks misperceptions of risk make perfect sense if you view them in a social context. First, says Kahan, rid yourself of the thought that Americans distrust science, scientists, or the scientific consensus. Consider this line from a 2009 Pew Research report: “Overwhelming majorities say that science has had a positive effect on society and that science has made life easier for most people.” What they disagree about is what the science says. People assimilate the data and choose the experts that fit most neatly with their and their peers’ values. Kahan and colleagues like to plot these worldviews on charts that illustrate how certain sociocultural tendencies—hierarchical-individualist and egalitarian-communitarian are the two big ones—consistently correlate with the same judgments about what’s worrisome and what isn’t.

Risk assessment by groupthink is reasonable, if not rational, because, at the personal level, it costs nothing. If you misconstrue the nature of a global threat, your mistake won’t hurt you much, because you can’t save yourself anyway. But if you contradict your friends or powerful members of your group—that could cost you dearly. (Incidentally, Kahan sees evidence of scientific groupthink on both sides of the ideological spectrum.) Kahan’s most provocative finding, though, is that people better at “cognitive reflection,” or slow, probing thought, are actually more likely to arrive at predetermined conclusions about risk, not less. The urge to maintain status within one’s social network is so powerful, Kahan told me, that well-educated people will use their information-gathering and computational skills to marshal a more impressive body of evidence in support of whatever identity it is (freethinking skeptic, caring mother hen) that earns them brownie points in their troop. On his blog, he once called these strong in-group effects “tapeworms of cognitive illiberalism” and a dispiriting omen for democracy.

[Chart: What Looks Risky to You? Cultural theorists think that what people perceive as a threat is a function of their worldview; the chart maps which risks go with which predisposition.]

Kahan’s conclusions wouldn’t surprise anyone who has spent her life studying politics. Political theorists have been working out for decades how the interests of individuals and small groups obstruct the interests of the collective and why it’s so hard to get people to act on threats they can’t see or feel. According to Kahan, what cultural cognition theory has to add is the science of science communication. He and his kind conduct research on how to present science so that it won’t be entangled with issues of “membership and loyalty to a group.”

So what would he tell the environmentalists? The key is “not to use language or modes of communication that convey animosity, contempt, and hostility,” he told me, because then “the signal that will come through is ... that our group is under assault.” I wasn’t able to force any good examples out of him, though I couldn’t help suspecting that one tactic would be to play down the words “United Nations,” since that body, as everybody knows, flies black helicopters and encroaches on U.S. sovereignty. But I’d guess that Harvard political scientist and sociologist Theda Skocpol, for one, wouldn’t think Kahan could tell the environmentalists much. In a recent and devastating post-mortem on the failed attempt to pass cap-and-trade legislation in President Obama’s first term, she was contemptuous of the kind of politics that relies on “messaging campaigns,” as she calls them. “The new vogue to pay psychological researchers to come up with phrases that subliminally appeal to individuals,” she wrote, is “a waste of resources.” She’d rather have green groups put their money into boots-on-the-ground, precinct-by-precinct, Tea Party–style organizing. I read Kahan that passage and he countered by asking what message Skocpol’s organizers would offer up, and how they’d know whether it would work if they didn’t test it with “evidence-based methods.”

Skocpol may have a message—she prefers cap-and-dividend to cap-and-trade, since those dividends mean money in voters’ pockets—and, presumably, she’d know she was right if that legislation passed. But Kahan and Skocpol aren’t as far apart as they seem. Both consider group identity as important, politically, as individual acts of cognition, and both know that groups are not static. If you need a group’s approval to get your warnings heard, then your job is to get that group on your side, whether by sloganeering or knocking on doors or—preferably—both. Sometimes democracy is less a matter of thinking well than of choosing your friends wisely.

Judith Shulevitz is the science editor of The New Republic.