How To Resist Motivated Reasoning
11-28-2020
Some things, Julie Beck argues, are more important than truth. Hannah Arendt says something similar, arguing that thinking is concerned not with truth, but with meaning. It is meaning, not truth, that Arendt holds to be the basic human need. That is why, for Arendt, the most basic of human rights is the right to have rights: the right to speak and act in a political world so that one's life is meaningful.
For Beck, the desire to belong to a meaningful group leads to cognitive dissonance.
The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.
Beck seeks to explain modern partisanship by looking to theories of cognitive dissonance and motivated reasoning, both of which help to explain why it is so difficult to convince people of facts they do not want to believe. Members of cults will defend false beliefs because the benefits of membership in the cult outweigh the benefits of acknowledging reality. What is surprising, Beck argues, is that such cult-like loyalty to group membership over reality has become tied to membership in political parties. And partisanship is greater now than at any time since 1879. She writes:
Partisanship has surely ramped up—but Americans have been partisan before, to the point of civil war. Today’s media environment is certainly unique, though it’s following some classic patterns. This is hardly the first time there have been partisan publications, or many competing outlets, or even information silos. People often despair at the loss of the mid-20th-century model, when just a few newspapers and TV channels fed people most of their unbiased news vegetables. But in the 19th century, papers were known for competing for eyeballs with sensational headlines, and in the time of the Founding Fathers, Federalist and Republican papers were constantly sniping at each other. In times when communication wasn’t as easy as it is now, news was more local—you could say people were in geographical information silos. The mid-20th-century “mainstream media” was an anomaly.
The situation now is in some ways a return to the bad old days of bias and silos and competition, “but it’s like a supercharged return,” Manjoo says. “It’s not just that I’m reading news that confirms my beliefs, but I’m sharing it and friending other people, and that affects their media. I think it’s less important what a news story says than what your friend says about the news story.” These silos are also no longer geographical, but ideological and thus less diverse. A recent study in the Proceedings of the National Academy of Sciences that analyzed 376 million Facebook users’ interactions with 900 news outlets reports that “selective exposure drives news consumption.”
The question everyone is asking is: How can we reduce the record levels of cognitive dissonance and motivated reasoning? Hannah Arendt's famous answer is that thinking—the practice of stopping to think and of thinking from the perspective of others—may be the only reliable way to interrupt the seductions of mass movements and totalitarianism. But how do we teach thinking? Beck suggests a few approaches.
“This is why we need to teach critical thinking, and this is why we need to push back against false beliefs, because there are some people who are still redeemable, who haven’t made that full slide into denialism yet. I think once they’ve hit denial, they’re too far gone and there’s not a lot you can do to save them.”
There are small things that could help. One recent study suggests that people can be “inoculated” against misinformation. For example, in the study, a message about the overwhelming scientific consensus on climate change included a warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists.” Exposing people to the fact that this misinformation is out there should make them more resistant to it if they encounter it later. And in the study at least, it worked.
While there’s no erasing humans’ tribal tendencies, muddying the waters of partisanship could make people more open to changing their minds. “We know people are less biased if they see that policies are supported by a mix of people from each party,” Jerit says. “It doesn’t seem like that’s very likely to happen in this contemporary period, but even to the extent that they see within party disagreement, I think that is meaningful. Anything that's breaking this pattern where you see these two parties acting as homogeneous blocks, there’s evidence that motivated reasoning decreases in these contexts.”
Asking a similar question about how to address extreme cognitive dissonance, David Brooks argues that the only way to repair the loss of trust in factual evidence is the long process of rebuilding trust.
What to do? You can’t argue people out of paranoia. If you try to point out factual errors, you only entrench false belief. The only solution is to reduce the distrust and anxiety that is the seedbed of this thinking. That can only be done first by contact, reducing the social chasm between the members of the epistemic regime and those who feel so alienated from it. And second, it can be done by policy, by making life more secure for those without a college degree.
Rebuilding trust is, obviously, the work of a generation.