A matter of your biology, values and tribe

We view ourselves as rational, thinking beings. We consider facts, evidence and argument, weigh the pros and cons, and make informed, reasonable judgments.

At least that’s what we think we do. If that were true, why would topics that supposedly can be proven or disproven by science – for example, whether to add fluoride to our water supply to protect our teeth, or whether to immunize our children – divide communities?

And those are the easy issues, compared with the debates surrounding climate change, evolution, or whether homosexuality is innate or a choice.

Are people uninformed? Have scientists and other sources of knowledge and authority lost their credibility? Are religious values trumping scientific findings?

Or, is something more complicated going on?

While disagreements over these and other topics often seem irreconcilable, it turns out that recent research in neuroscience, social science and other fields offers new insights into how we process information and form our beliefs and judgments. Anyone involved in communications, marketing or other activities designed to inform or persuade must take note of these findings and their implications if they are to understand their audiences.

A rush of blood to the forehead

An Associated Press-GfK poll, designed to be representative of the U.S. population and released in March, illustrated the current level of public acceptance of scientific assertions. About four in 10 Americans are either not too confident in or outright disbelieve the assertions that the earth is warming as a result of human-caused heat-trapping gases, that the earth is 4.5 billion years old, and that life on earth evolved through a process of natural selection. Results also indicated that 51 percent of Americans question the Big Bang theory.

Some of the nation’s top scientists reacted to the survey’s results with dismay or even dismissal. The chief executive of the American Association for the Advancement of Science said that, when values and beliefs conflict with science, “most often values and belief trump science.” The winner of the 2013 Nobel Prize in medicine added, “Science ignorance is pervasive in our society.”

Such comments, however, are not only dismissive but, in light of recent research, an oversimplification. In “The Science of Why We Don’t Believe Science” (2011), reporter Chris Mooney, who has written extensively on how the public views such issues, wrote that modern neuroscience has shown that reason is actually suffused with emotion. Positive or negative feelings about people, things and ideas arise much more rapidly than our conscious thoughts, “in a matter of milliseconds.” He writes that evolution, in giving us basic survival skills, has trained us to react quickly to stimuli in the environment. “We apply fight-or-flight reflexes not only to predators, but to data itself,” Mooney notes.

Neuroscience was also the basis for research conducted in 2003 by University of Maryland psychologist Kevin Dunbar and discussed in the New Yorker in 2006. Dunbar showed videos – first, to non-physics majors, and then to physics majors – based on the experiment conducted by Galileo in which objects of two different weights were dropped from the same height.

The conventional wisdom was that the heavier object would fall faster; that outcome was portrayed in one video, which the non-physics majors selected as the correct answer. The actual experiment, of course, demonstrated that the two objects fell at the same speed, as portrayed in a second video. When told that the second video was the correct one, the non-physics majors experienced a distinctive brain activation: a squirt of blood to a collar of tissue located in the center of the brain. Neuroscientists have long known of this response when a person sees something that seems wrong, even if it’s right.

The more interesting finding came when Dunbar showed the videos to physics majors, who knew the right answer. Even so, when they selected the correct video, blood flow increased to their brains’ dorsolateral prefrontal cortex, located just behind the forehead. This region is important in suppressing so-called unwanted representations – getting rid of thoughts that aren’t useful. In other words, Dunbar explained, even though they knew the right answer, the physics majors’ brains had to do additional work to suppress intuition. When presented with information that challenges intuitive beliefs, however, many people don’t emulate the physics majors and undertake that extra mental labor.

Not scientists, but lawyers

Mooney also cites Stony Brook University political scientist Charles Taber, whose research indicated that when a person hears about a scientific discovery that challenges a long-held belief, a subconscious negative response to the information guides the types of information and memories formed in the conscious mind. The person, Taber said, naturally retrieves thoughts that are consistent with previous beliefs, so that they can build an argument and challenge what they’re hearing.

As a result, Mooney observes, when we think we’re reasoning, we may instead be rationalizing. He quotes an analogy offered by University of Virginia psychologist Jonathan Haidt: “We may think we’re being scientists, but we’re actually being lawyers.”

Mooney also has documented the work of Yale Law School Professor Dan Kahan, who has shown that the deep-seated views of people about morality and about the way society should be ordered are strong predictors of who they consider to be legitimate scientific experts – and, as a result, where they consider “scientific consensus” to lie on contested issues.

In his research, Kahan classifies individuals as either “individualists” or “communitarians,” and as either “hierarchical” or “egalitarian” in outlook. “Somewhat oversimplifying,” Mooney writes, “think of hierarchical individualists as conservative Republicans and egalitarian communitarians as liberal Democrats.”

An example from Kahan’s research: “A friend tells you that he or she is planning to read a book about an issue (e.g., climate change) but would like an opinion on whether the author seems like a knowledgeable and trustworthy expert.” The subject was then shown a fake resume of the expert – a member of the National Academy of Sciences who had earned a Ph.D. in the pertinent field from one elite university and was now on the faculty of another.

The subject was next shown an excerpt from a book by the “expert,” in which the scientist stated that global warming is real and human-caused. The result: only 23 percent of hierarchical individualists agreed the person was a “trustworthy and knowledgeable expert,” while 88 percent of egalitarian communitarians did. Similar splits appeared on such issues as whether nuclear waste can be safely stored underground and whether letting people carry guns deters crime.

The bottom line for Kahan: people rejected the validity of a scientific source because its conclusion contradicted their deeply held views. “And that,” Mooney reports, “undercuts the standard notion that the way to persuade people is via evidence and argument. In fact, head-on attempts to persuade can sometimes trigger a backfire effect, where people not only fail to change their minds when confronted with the facts – they may hold their views more tenaciously than ever.”

Ezra Klein, former Washington Post blogger and now editor of the “explanatory journalism” site Vox.com, also interviewed Kahan for a recent post, “How Politics Makes Us Stupid” (April 6, 2014).


Describing Kahan’s social science experiments, Klein writes that they’re “designed to test people’s abilities to consider the evidence arrayed before them. It forces subjects to suppress their impulse to go with what looks right and instead do the difficult mental work of figuring out what is right. In Kahan’s sample, most people failed.” In other words, they didn’t allow the necessary blood flow to the fronts of their brains.

Klein notes that Kahan’s hypothesis is not that people are held back by a lack of knowledge, but that in some kinds of debates people don’t want to find the right answer so much as to win the argument. Most of the time, Klein quotes Kahan as saying, people are “perfectly capable” of being convinced by the best evidence – on whether antibiotics work, for example, or whether the H1N1 flu is a problem, or whether heavy drinking impairs one’s ability to drive.

“But our reasoning becomes rationalizing when we’re dealing with questions where the answers could threaten our tribe,” Klein writes. “Individuals subconsciously resist factual information that threatens their defining values.”

The continuing question: How to persuade?

How, then, can we overcome our internal biology and tribal allegiances to sort fact from fiction or distortion and address the long-term challenges of society and civilization? Or, as Mooney asks, “What can be done to counteract human nature itself?”

He answers that question by advising, “If you want someone to accept new evidence, make sure to present it to them in a context that doesn’t trigger a defensive, emotional reaction.” He posits, for example, that conservatives might be more likely to be concerned about climate change if that message comes from a business or religious leader, “who can set the issue in the context of different values than those from which environmentalists or scientists often argue.”

Kahan, on the other hand, places more of the responsibility for communication on these issues with the scientific community. According to Klein, he would like to see researchers develop more “evidence-based models” of how people often treat questions of science as questions of identity. By doing so, Kahan says, scientists could craft communications strategies that avoid triggering the biases of their audiences.

Still, as divisions continue over these issues, science alone will not provide all the answers. “There’s something charming and roguish about some Americans’ antipathy to authority and refusal to defer to experts,” wrote Bloomberg reporter Christopher Flavelle in March. “But that means the biggest challenge associated with climate change may not be technological, economic or political, but cultural. That’s a puzzle even the smartest scientists haven’t been able to crack.”

Whether in forming political consensus, persuading customers to select a product or gaining support for a public initiative, communicators cannot be satisfied simply to present facts, information and research. We need to understand our audiences’ values and the “tribes” that influence their behavior. That, in and of itself, is nothing new, but these and other studies offer new insights into how to navigate those factors in making new connections.


