Believing What You Want to Believe
Fact or opinion?
It's a distinction we learn as kids. But it turns out judging facts isn't nearly as black-and-white as your third-grade teacher might have had you believe.
In reality, we rely on a biased set of cognitive processes to arrive at a given conclusion or belief. This natural tendency to cherry-pick and twist the facts to fit our existing beliefs is known as motivated reasoning—and we all do it.
"Motivated reasoning is a pervasive tendency of human cognition," says Peter Ditto, PhD, a social psychologist at the University of California, Irvine, who studies how motivation, emotion and intuition influence judgment. "People are capable of being thoughtful and rational, but our wishes, hopes, fears and motivations often tip the scales to make us more likely to accept something as true if it supports what we want to believe."
In today's era of polarized politics—and when facts themselves are under attack—understanding this inclination (and finding ways to sidestep it) has taken on new urgency, psychologists say.
Red facts and blue facts
Much of the early research on motivated reasoning showed that people weigh facts differently when those facts are personally threatening. More than two decades ago, Ditto and David F. Lopez, PhD, compared study participants who received either favorable or unfavorable medical test results. People who were told they'd tested positive for a (fictitious) enzyme linked to pancreatic disorders were more likely to rate the test as less accurate, cite more explanations to discount the results and request a second opinion (Journal of Personality and Social Psychology, 1992).
"It takes more information to make you believe something you don't want to believe than something you do," Ditto says.
We don't just delude ourselves when it comes to our health and well-being. Research shows we also interpret facts differently if they challenge our personal beliefs, group identity or moral values. "In modern media terms, that might mean a person is quick to share a political article on social media if it supports their beliefs, but is more likely to fact-check the story if it doesn't," Ditto says.
For instance, Ditto and his former student Brittany Liu, PhD, have documented a link between people's moral convictions and their assessment of facts. They found people who were morally opposed to condom education, for example, were less likely to believe that condoms were effective at preventing pregnancy and sexually transmitted diseases. Similarly, people who had moral qualms about capital punishment were less likely to believe it was an effective way to deter crime (Social Psychological and Personality Science, 2012). "People blur the line between moral and factual judgments," Ditto explains.
For people who identify strongly with one side of the political spectrum or the other, it can feel like their opponents are willfully ignoring the facts. But right or left, both sides believe their positions are grounded in evidence, Ditto says. "We now live in a world where there are red facts and blue facts, and I believe these biased motivated-reasoning processes fuel political conflict. If someone firmly believes some fact to be true that you just as firmly believe to be false, it is hard for either of you not to see that other person as stupid, disingenuous or both."
In an analysis presented at the 2015 annual meeting of the Association for Psychological Science, he and colleagues examined 41 experimental studies of partisan bias involving more than 12,000 participants. They found that self-identified conservatives and liberals both showed a robust partisan bias when evaluating empirical evidence, to an almost identical degree. "It's an equal-opportunity bias," he says.
That bias is unsurprising given the powerful social incentives for groupthink, says Daniel Kahan, JD, a professor of law and psychology at Yale Law School who studies risk perception, science communication and the application of decision science to law and policymaking. Consider climate change. Discounting the evidence of human-caused global warming has become a central feature of the conservative platform—and taking an opposing viewpoint could damage your reputation within that group.
"If you take an ordinary member of the public, his or her carbon footprint is too small to make an effect on climate change. If they make a mistake on the science in that part of their life, nothing bad happens to them," Kahan explains. "But they can be adversely affected if they're holding a deviant view on an identity-defining issue inside their social group."
So, consciously or not, people may twist the facts. They can even trick themselves into believing that the facts aren't relevant, as social psychologist Troy Campbell, PhD, an assistant professor of marketing at the University of Oregon, and colleagues have shown.
His team presented volunteers who either supported or opposed same-sex marriage with alleged "facts" suggesting children raised by same-sex parents did or did not experience negative outcomes. When the evidence was on their side, participants stated their opinions on the matter were based in fact. But when the evidence opposed their view, they argued the question wasn't about facts, but morals (Journal of Personality and Social Psychology, 2015). "People take flight from facts," Campbell says.
The more you know
People often dismiss those who hold opposing views as idiots (or worse). Yet highly educated people are just as likely to make biased judgments—and they might actually do it more often.
In one example of this "expertise paradox," Kahan and colleagues asked volunteers to analyze a small data set. First, they showed data that purportedly demonstrated the effectiveness of a cream for treating skin rash. Unsurprisingly, people who had a greater ability to use quantitative information did better at analyzing the data.
But there was a twist. When participants saw the very same numbers, but were told they came from a study of a gun-control measure, their political views affected how accurately they interpreted the results. And those who were more quantitatively skilled actually showed the most polarized responses. In other words, expertise magnified the tendency to engage in politically motivated reasoning (Behavioural Public Policy, in press). "As people become more proficient in critical reasoning, they become more vehement about the alignment of the facts with their group's position," Kahan says.
The pattern holds up outside the lab as well. In a national survey, Kahan and colleagues found that overall, people who were more scientifically literate were slightly less likely to see climate change as a serious threat. And the more they knew, the more polarized they were: Conservatives became more dismissive of climate change evidence, and liberals became more concerned about the evidence, as science literacy and quantitative skills increased (Nature Climate Change, 2012).
"It's almost as though the sophisticated approach to science gives people more tools to curate their own sense of reality," says Matthew Hornsey, PhD, a professor of psychology at the University of Queensland who studies the processes that influence people to accept or reject scientific messages.
Unfortunately, our modern media landscape seems to be amplifying the retreat from facts. "These are wonderful times for motivated reasoners. The internet provides an almost infinite number of sources of information from which to choose your preferred reality," says Hornsey. "There's an echo chamber out there for everyone."
Compounding the problem, fake-news websites that publish hoaxes, conspiracy theories and disinformation disguised as news have proliferated in recent years. But the recent focus on fake news might be doing more harm than good, some experts say. "Now that we have this idea that there is fake news, we can credibly attribute anything we dislike to fake news," says Campbell.
In the past, climate-change skeptics might have tried to pick apart the details of a study or demonstrate a researcher's conflict of interest to cast doubt on the evidence. Now, they can simply allege that the media can't be trusted to report the truth, and wipe away inconvenient facts with a single stroke. "Mistrust of the media is a powerful tool for motivated reasoning," says Ditto.
License to ignore reality is dangerous, regardless of your political leanings, Kahan adds. "It's a good thing in our political culture that facts have been the currency of our discourse on disputed issues. If facts are somehow devalued as a currency, it'll be a lot harder to achieve our common goals."
The root of the problem
What can be done to restore our faith in facts?
Media literacy is one place to start. A report by researchers from Stanford University's Graduate School of Education found that students in middle school, high school and college were terrible at evaluating the quality of online information (Stanford History Education Group, 2016). Though the authors described their findings as "bleak" and "dismaying," the silver lining is that kids can be taught to be better consumers of information—by, for instance, learning to pay closer attention to the source, to consider possible biases or motives and to think about what details a story might have left out.
But given our cognitive biases, teaching can only get us so far. "Motivated reasoning is not something that's open to view through introspection or conscious effort," says Kahan. "I'd put more hope on a strategy of improving the science communication environment."
That's where Hornsey is focusing his efforts. In a new paper, he describes what he calls attitude roots—the fears, ideologies, worldviews, vested interests and identity needs—that motivate us to accept or reject scientific evidence. He argues that communicators must do a better job of identifying those roots and adjusting their persuasion attempts accordingly (American Psychologist, in press). "This is what we call jiu jitsu persuasion: working with people's motivations rather than trying to fight against them," he says.
So, for example, want to convince a vaccine skeptic that immunizations are safe? First, it helps to figure out whether they believe in Big Pharma conspiracy theories, are fearful of medical intervention or want to prove to their social circle that they're a concerned parent.
"The key question is not 'Why do they disagree with the science?' but rather, 'Why do they want to disagree with the science?'" Hornsey says.
Answering that will probably require doing something people in our increasingly polarized political climate are loath to do: less talking, more listening.
People communicating the facts often imply that their target is at best uneducated and at worst a bad person, Campbell says. But an adversarial approach isn't likely to change minds.
That's a lesson cosmetics companies learned long ago: They figured out they'll sell more lipstick if they promise to enhance a woman's natural beauty rather than tell her she's ugly, Campbell points out. People who communicate information would do well to heed that example. That goes for scientists and science communicators, but also for anyone who can share an article with hundreds of people with the click of a button—which is to say, almost everyone in today's digital landscape.
"One of the most important ways to inoculate people from false information is to befriend them," Campbell says. "There's a time for the middle finger, and a time to put it away."
Source: https://www.apa.org/monitor/2017/05/alternative-facts