Are We Hard-Wired to Doubt Science?
In this New York Times article, the author examines recent research on why people reject or ignore evidence that environmental contaminants are harming them and the planet, even when science ties the cause to the effect. The article features the theory that humans will always be "subjective analyzers of data" and have an inherent adverse reaction to anything that "is out to get them." In the case of environmental contaminants, people are so frightened by the gravity of the harmful effects that they set aside their cognitive reasoning about the science on the subject.
Posted on The New York Times
February 1, 2011
By Felicity Barringer
In researching Monday’s article about opposition to smart meters, I found myself once again facing a dilemma built into environmental reporting: how to evaluate whether claims of health effects caused by some environmental contaminant — chemicals, noise, radiation, whatever — are potentially valid? I turned, as usual, to the peer-reviewed science.
But some very intelligent people I interviewed had little use for the existing (if sparse) science. How, in a rational society, does one understand those who reject science, a common touchstone of what is real and verifiable?
The absence of scientific evidence doesn’t dissuade those who believe childhood vaccines are linked to autism, or those who believe their headaches, dizziness and other symptoms are caused by cellphones and smart meters. And the presence of large amounts of scientific evidence doesn’t convince those who reject the idea that human activities are disrupting the climate.
What gives? David Ropeik, a recovering journalist who is an instructor at the Harvard University extension school and the author of the book “How Risky Is It, Really?,” offers one explanation.
He uses peer-reviewed science to explain the limits of peer-reviewed science as a persuasive tool.
Humans, he argues, are hard-wired to reject scientific conclusions that run counter to their instinctive belief that someone or something is out to get them.
Here, slightly edited and condensed, is Mr. Ropeik’s explanation of the role of neuroscience, psychology and anthropology in creating this societal cognitive dissonance about peer-reviewed science. (Or, as my colleague Andrew Revkin says, why we have “inconvenient minds.”)
The assumption that there is a single truth to know that the scientific method can bring us — or a useful truth we will all subscribe to — overlooks large bodies of science that show that there is no such thing as a fact. We are subjective analyzers of data. You and I and everybody there in that story will look at “the facts” — no matter how peer-reviewed and scientifically robust they may be — through the lenses that evolution has given us for our survival.
First, the way information comes into and is processed by the brain is part of this. It’s processed sooner by the amygdala, where fear starts, than by the cortex, the seat of reason. We are hard-wired to respond to external or internal information with emotion and instinct first and cognition second — with emotion and instinct more, and reason less.
In the case of radiation, it is invisible. It is a risk about which we have no information we can immediately use to protect ourselves. Look at the risks that are scary — chemicals, pesticides, radiation — we are uncertain, and that uncertainty scares us because we have less control over what we can’t detect. Even if you had a Geiger counter, the information you would see would still be partial; most of us would still be a couple of degrees short of knowing what it meant. Meanwhile, your amygdala is screaming: Alert! Alert!
Second, there is the time element when it comes to being averse to loss. If a risk is down the road, we see it through rose-colored glasses: “It won’t happen to me.” Think of smokers, or cellphone-using drivers, or how people in Manhattan regarded something like 9/11 before it happened. But when something is more immediate, the risk side of the equation carries more weight.
Third — and this is the cutting-edge field of research into risk perception — we tend to identify with one of four major groups based on how we want society to be organized and to operate. You and I tend to conform our opinions about the validity of science to match what would be consistent with how our tribe operates.
Two of the groups involved, he said, are simply characterized: individualists (most people would call them libertarians, who want the government to butt out) and communitarians, the two poles on the political spectrum. The two other groups, he said, are called hierarchists and egalitarians. “Hierarchists like the status quo, class, old money,” he said. “They like a nice predictable social ladder with rungs on the ladder. Egalitarians don’t want any rungs.”
Based on their remarks, he said, some of the smart-meter opponents are a blend of egalitarian and communitarian. “They don’t like new technology,” he explained, and they are bothered by an economic status quo that produces things like smart meters.
“They believe that society would be better if it stood up more to the hierarchist status quo,” he said. “When something that represents that status quo comes along, there is a cultural resistance to it. That is the underlying cultural reason they will cherry-pick their symptoms and the facts into their ostensibly rational argument against smart meters.”
The science on which Mr. Ropeik bases his conclusions includes the work of Joseph LeDoux, a neuroscientist at New York University and the author of “The Emotional Brain: The Mysterious Underpinnings of Emotional Life,” and of Paul Whalen, an associate professor of psychology and brain science at Dartmouth who maintains a Web site called “The Whalen Lab, An Affective-Cognitive Neuroscience Laboratory.”
The literature on the psychology of risk perception is cited in a chapter of Mr. Ropeik’s book.
Now, why do I think that not everyone is going to agree with him… ?