CAMBRIDGE, MASSACHUSETTS – Why does misinformation spread so quickly on social media? Why doesn’t it get corrected? When the truth is so easy to find, why do people accept falsehoods?
A new study focusing on Facebook users provides strong evidence that the explanation is confirmation bias: people’s tendency to seek out information that confirms their beliefs, and to ignore contrary information.
Confirmation bias turns out to play a pivotal role in the creation of online echo chambers. This finding bears on a wide range of issues, including the current presidential campaign, the acceptance of conspiracy theories and competing positions in international disputes.
The new study, led by Michela Del Vicario of Italy’s Laboratory of Computational Social Science, explores the behavior of Facebook users from 2010 to 2014. One of the study’s goals was to test a question that continues to be sharply disputed: When people are online, do they encounter opposing views, or do they create the virtual equivalent of gated communities?
Del Vicario and her coauthors explored how Facebook users spread conspiracy theories (using 32 public web pages); science news (using 35 such pages); and “trolls,” which intentionally spread false information (using two web pages). Their data set is massive: It covers all Facebook posts during the five-year period. They explored which Facebook users linked to one or more of the 69 web pages, and whether they learned about those links from their Facebook friends.
In sum, the researchers find a lot of communities of like-minded people. Even if they are baseless, conspiracy theories spread rapidly within such communities.
More generally, Facebook users tended to choose and share stories containing messages they accept, and to neglect those they reject. If a story fits with what people already believe, they are far more likely to be interested in it and thus to spread it.
As Del Vicario and her coauthors put it, “users mostly tend to select and share content according to a specific narrative and to ignore the rest.” On Facebook, the result is the formation of a lot of “homogeneous, polarized clusters.” Within those clusters, new information moves quickly among friends (often in just a few hours).
The consequence is the “proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia.” And while the study focuses on Facebook users, there is little doubt that something similar happens on other social media, such as Twitter — and in the real world as well.
Striking though their findings are, Del Vicario and her coauthors do not mention the important phenomenon of “group polarization”: when like-minded people speak with one another, they tend to end up believing a more extreme version of what they originally thought. Whenever people spread misinformation within homogeneous clusters, they also intensify one another’s commitment to that misinformation.
Of the various explanations for group polarization, the most relevant involves a potentially insidious effect of confirmation itself. Once people discover that others agree with them, they become more confident — and then more extreme.
In that sense, confirmation bias is self-reinforcing, producing a vicious spiral. If people begin with a certain belief, and find information that confirms it, they will intensify their commitment to that very belief, thus strengthening their bias.
Suppose, for example, that you think an increase in the minimum wage is a sensational idea, that the nuclear deal with Iran is a mistake, that Obamacare is working well, that Republican candidate Donald Trump would be a fine president, or that the problem of climate change is greatly overstated. Arriving at these judgments on your own, you might well hold them tentatively and with a fair degree of humility. But after you learn that a lot of people agree with you, you are likely to end up with much greater certainty — and perhaps real disdain for people who do not see things as you do.
On the basis of all the clustering, that almost certainly happened on Facebook. Strong support for this conclusion comes from research from the same academic team, which finds that on Facebook, efforts to debunk false beliefs are typically ignored — and when people pay attention to them, they often strengthen their commitment to the debunked beliefs.
Can anything be done? The best solution is to promote a culture of humility and openness. Some people, and some communities, hold their own views tentatively; they are interested in refutation, not just confirmation. Moreover, those who manage online platforms (such as Google) can take steps to help people assess the trustworthiness of what they are seeing, though such efforts remain preliminary and might prove controversial.
In the midst of World War II, a great federal judge, Learned Hand, said that the spirit of liberty is “that spirit which is not too sure that it is right.” Users of social media are certainly exercising their liberty. But there is a real risk that when they fall prey to confirmation bias, they end up compromising liberty’s spirit — and dead wrong to boot.
Cass R. Sunstein, the former administrator of the White House Office of Information and Regulatory Affairs, is the Robert Walmsley University Professor at Harvard Law School and a Bloomberg View columnist.