Facebook’s ability to create filter bubbles, promote divisive content, and accelerate political polarization is no surprise to users who’ve kept up with the platform’s many scandals. But two new studies reveal pitfalls in commonly proposed solutions and point to a troubling double bind for the 190 million Americans who rely on Facebook for news.
Encouraging users to sample other news sources, one study finds, can instead cause them to double down on their beliefs. Another finds that logging off entirely reduces polarization but leaves people politically disengaged and uninterested. Sixteen years in, we’re still only beginning to understand how Facebook is shaping us.
Internally, the company has begun to quietly acknowledge the trade-off for users between staying informed and being algorithmically driven toward divisive content. But little has been done. Politicians remain focused on accusations of bias, and Facebook is busy proving it treats conservative and liberal users impartially. Executives routinely emphasize the matching Republican and Democratic criticism of the platform. If neither side is happy, they say, neither is being favored.
But conservative and liberal users have very different experiences when using Facebook. That’s not because of politically motivated decisions around what’s allowed on the platform. Rather, it reflects the way Facebook organizes information to reward “engaging” content. The focus on ferreting out bias obscures this and makes even practical solutions seem implausible.
While there’s little evidence that Facebook is biased against conservative users, University of Virginia professors Brent Kitchens and Steven Johnson found that, by maximizing for engagement and attention, Facebook’s algorithms actively push conservatives toward more radical content than they push liberal users. Kitchens and Johnson analyzed the news habits of over 200,000 users who agreed to share their browsing data. They found that Facebook pushed conservative users, unlike moderate or liberal ones, to read dramatically more radical content over time.
“All the platforms end up providing a sort of diversifying effect,” explains Kitchens, associate director of Virginia’s Center for Business Analytics. “If you're reading more news from Facebook, you're going to get a wider variety of news. But there's also a polarizing effect. The diversity of information gets a little wider, but it also shifts more extreme.”
The study compared respondents’ use of Facebook, Reddit, and Twitter with their news habits. Kitchens and Johnson created a numbered political spectrum of 177 news sites, with DailyKos and Salon furthest left, Breitbart and InfoWars furthest right, and USA Today around the center. In the months when conservative users were most active on Facebook, they read news sites that were far more conservative than their average, clicking links from InfoWars and Breitbart over staples like Fox News. By contrast, news consumption by liberal users shifted far less dramatically on the authors’ scale.
This polarizing effect of Facebook is in stark contrast to Reddit. When conservative users were most active on Reddit, they actually shifted to news sites the authors judged as more moderate than what they typically read. Kitchens and Johnson hypothesize that the most salient differences between Facebook and Reddit aren’t the content itself but how platforms structure and feed news and information to users.
“The impacts we’re seeing are by design. Facebook knows what's going on with its platform,” says Johnson. “If it wanted to change it, it could.”
The authors identified a few major differences between Facebook and other sites. First, Facebook requires reciprocal friendship, which encourages a feed of like-minded people and reduces the chance of seeing opinion-challenging content. Facebook’s algorithms create feedback loops that perpetually show users what it thinks they want to see.
Second, Reddit has more anonymity than Facebook. Because users don’t necessarily have reciprocal bonds, people with different views can gather and share links in the same thread. Reddit’s algorithms prioritize interests, not friendship, and in the course of interactions on nonpartisan topics, the authors say there’s a much higher likelihood users will come across links to sites outside their typical news diet.
Filter bubbles and echo chambers are well-researched territory, but the Virginia paper argues against popular understandings of the topics and, tellingly, against diversifying news consumption as a means to mitigate polarization. A liberal may diversify their news diet by reading 20 new news sites for the first time. But if those sites align with the person’s current beliefs or sit further left, the change won’t have a moderating effect.
Additionally, disrupting a person’s feed with occasional posts outside their usual beliefs can actually reinforce their initial point of view. The most politically active users would routinely engage with content from the other “side,” if only to dismiss it. (In journalistic parlance, this is called “hate sharing”: a liberal blog may go viral among conservatives because of how strongly readers disagree with it, or vice versa. Poetically, the paper terms this a “phenomenon of animosity.”)
The Virginia study allows for other potential influences on news diet—no single study can uncover them all—but the professors make a clear argument. Individual decisions around conservative or liberal content are secondary to how the platforms assemble themselves to maximize and monetize engagement.
“I think what’s getting lost is that the algorithms prioritizing what you see are far, far more impactful ultimately than just a few high-profile cases on the boundary,” Johnson says.
There are no easy fixes, including forgoing Facebook entirely.
In a separate study, a team of researchers at Stanford University paid roughly 2,700 respondents to deactivate Facebook in the weeks surrounding the 2018 midterm elections. Respondents were sent around $108 and completed daily surveys from the researchers about their mental well-being, political views, and ability to stay informed without relying on Facebook as a news source. (Facebook itself is researching sending money to some users to deactivate.)
The researchers found “small but significant” increases in self-reported happiness and decreases in anxiety. Many who deactivated reported using Facebook less even after the study ended and they had returned to the platform; 5 percent of participants hadn’t reactivated the service two months later. On average, those who deactivated reported 60 more minutes of free time each day.
But many who relied primarily on Facebook for news remained unaware of current events. The team found that quitting the platform “significantly reduced” news knowledge. When quizzed on factual news events, those in the “quit” group answered correctly less often than those in the control group. Participants were less exposed to polarizing news and held less polarizing views on specific issues but maintained strong opposition to the other party.
Facebook is an important and, research suggests, hard-to-replace political player. Users are enraged, informed, misinformed, but ultimately trapped in its orbit. Compounding the problem, politicians use high-profile outliers to uncover “bias,” then butt their heads, repeatedly, against the paradox of forcing something as inherently political as Facebook to be impartial. Virginia’s Johnson said the best idea to come from Wednesday’s Senate hearing on social media was a proposal from Twitter CEO Jack Dorsey. When discussing the need for platforms to be more transparent, Dorsey suggested having an outside group create algorithms that users could then select for themselves. Imagine being able to tell Facebook to optimize for something other than engagement—education quality, trustworthiness, locality, etc.
“I have no idea the viability of whether it would be likely to happen,” Johnson says. “But I do think that's definitely the right kind of conversation.”