Stop Saying Facebook Is ‘Too Big to Moderate’

On Monday, a new coronavirus disinformation video exploded across the internet. Created by the right-wing site Breitbart, it was a clip of a press conference from a group calling itself America’s Frontline Doctors, containing dangerously false claims about the coronavirus, including that masks are useless and that hydroxychloroquine cures the disease. (There is no known cure.) The video was a test of social media platforms’ stated policies against pandemic disinformation, and by some measures they passed. By Tuesday morning, Facebook, Twitter, and YouTube had all taken down the video for violating their policies on false information about treatments and cures for Covid.

For Facebook, the episode could be seen as a particular success. Many people, including the company’s own employees, have argued that it moves too slowly in response to false and harmful posts on the platform. Here, Facebook was the first major platform to act. There was just one problem: The video had already been viewed more than 20 million times by the time Facebook took it down on Monday night, according to NBC News. The horse was miles away before the barn doors were closed.

On the eve of an extremely high-profile congressional hearing on antitrust and competition issues in Big Tech, the episode has revived a common critique of Facebook: that the platform is simply too big to police effectively, even when it has the right policies in place. As The New York Times’ Charlie Warzel put it on Twitter, “facebook cannot manage mis/disinformation at its scale. if videos can spread that widely before the company takes note (as they have time and time again) then there’s no real hope. it’s not a matter of finding a fix – the platform is the problem.”

This is a very popular view, but it doesn’t make a great deal of sense. It’s true that no site that relies on user-generated content, and has millions or billions of users, can ever perfectly enforce its content rules at scale. But in no industry, save perhaps airlines and nuclear power plants, do we treat anything short of perfection as equivalent to failure. No one says there are simply too many people in the world to enforce laws at scale; we just employ a ton of cops. (Of course, the protest movement against police violence has powerfully argued that those funds would be better spent elsewhere—a question for another article.) The issue is whether Facebook can get from where it is now—taking so long to crack down on a flagrantly misleading video created by one of its own official news partners that tens of millions of users had already seen it—to a place where it doesn’t lurch from one disinformation crisis to the next. And there’s no reason to think it couldn’t make progress toward that goal if only it invested more resources in the task.

“They need to hire more content moderators, a whole lot more of them,” said Jennifer Grygiel, a communications professor at Syracuse University. “It’s a myth to create this concept that it’s too big to moderate, there’s too much content.”

In 2019, CEO Mark Zuckerberg said Facebook would spend more than $3.7 billion on platform safety—more, he pointed out, than Twitter’s entire annual revenue. The much more relevant number, however, is Facebook’s revenue, which last year was about $70 billion. In other words, Zuckerberg was claiming credit for devoting just over 5 percent of the company’s revenue to making its product safe.

While Facebook barely cracked Forbes’ ranking of the 100 biggest companies by revenue last year, its $24 billion in profit before taxes easily made it one of the world’s most profitable. Why? Because its costs are so much lower than most other huge companies’. Ford Motor Company had $160 billion in revenue in 2018 but cleared only $4.3 billion in pretax profits. Building cars costs money. Ford faces tough competition from plenty of other manufacturers, which pressures it both to invest in making cars people want to drive and to charge prices people are willing to pay. Meanwhile, it must comply with extensive safety and emissions requirements imposed by the government.

Facebook faces none of those pressures. It has little real competition, and its business is almost wholly free from government regulation. That’s why the company remains at liberty to spend pretty much whatever it wants on content moderation and fact-checking. But there is no reason for this to be set in stone. Yes, Facebook sees millions of new posts per day, making moderation a daunting task. But Facebook is also sitting on billions and billions of dollars of cash. It could triple that $3.7 billion safety investment and still have an enviably high profit margin.

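That last claim is easy to check against the article’s own round numbers. Here is a quick back-of-the-envelope sketch in Python (pretax figures only; treating a tripled safety budget as a straight addition to costs is my simplifying assumption):

```python
# Figures cited above, in billions of dollars.
revenue = 70.0        # Facebook's 2019 revenue
pretax_profit = 24.0  # Facebook's pretax profit
safety_spend = 3.7    # Zuckerberg's touted platform-safety budget

print(f"Facebook pretax margin:   {pretax_profit / revenue:.0%}")  # ~34%
print(f"Ford pretax margin:       {4.3 / 160:.1%}")                # ~2.7%
print(f"Safety spend vs. revenue: {safety_spend / revenue:.1%}")   # ~5.3%

# Triple the safety budget (two extra increments of $3.7B, my
# simplifying assumption) and see what's left of the margin.
new_profit = pretax_profit - 2 * safety_spend
print(f"Margin after tripling:    {new_profit / revenue:.0%}")     # ~24%
```
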
I don’t pretend to know the precise amount Facebook should be spending. Facebook currently says it employs 15,000 content moderators, most of whom are contractors. According to The Verge, those moderators make as little as $15 an hour in grueling conditions. A recent, in-depth report by New York University’s Stern Center for Business and Human Rights recommends that the company stop outsourcing content moderation and double the number of moderators to 30,000. But we can dream a little bigger. Facebook could triple its moderator workforce for less than a billion dollars. It could triple that workforce and double salaries, so that content moderators make a decent $60,000 per year, for a bit more than $2 billion. It’s tough to say exactly how much of an impact this would make, but it’s also tough to imagine that sextupling Facebook’s investment in moderators wouldn’t make a serious difference. A world in which only 1 million people were exposed to the Breitbart video, because a better-staffed and better-supported moderator corps was able to flag and deal with it faster, is better than a world in which 20 million people were. So is a world in which videos like that spread far less often in the first place, even if they’re never stamped out entirely.

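The same kind of sketch verifies the figures in the paragraph above, assuming a full-time year of 2,080 hours and counting only the incremental payroll over today’s spending; both assumptions are mine, and real contractor costs would include overhead the article doesn’t break out:

```python
# Back-of-envelope cost of expanding Facebook's moderator corps.
HOURS_PER_YEAR = 40 * 52               # assumed full-time year: 2,080 hours
current_moderators = 15_000            # Facebook's reported headcount
current_wage = 15 * HOURS_PER_YEAR     # $15/hour -> ~$31,200/year

current_cost = current_moderators * current_wage   # ~$0.47B today

# Scenario 1: triple the workforce at the current wage.
extra_tripled = (3 - 1) * current_moderators * current_wage
print(f"Triple workforce: +${extra_tripled / 1e9:.2f}B")      # ~$0.94B

# Scenario 2: triple the workforce AND pay everyone $60,000/year.
extra_raised = 3 * current_moderators * 60_000 - current_cost
print(f"Triple and double pay: +${extra_raised / 1e9:.2f}B")  # ~$2.23B
```
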
It’s important to be clear about the nature of the problems posed by Facebook’s unparalleled size and dominance of social media. (Ditto the other giants, including Google, who will be interrogated at Wednesday’s hearing.) It’s not really about the number of users. Few voices are calling for Facebook—the platform, not the company—to be split up. People will always want to be on the social network where everyone else is; it would be bizarre to try to shuffle Facebook’s existing users into different fiefdoms. Meanwhile, even if the government forced the company to spin off past acquisitions Instagram and WhatsApp—which would be a massive step for antitrust enforcement—Facebook would still have its 2 billion users, and the same issues of scale.

If Facebook’s size is a problem, it’s not because it’s impossible to scale up moderation. Rather, it’s because the company’s dominance of the social media market, coupled with a regulatory vacuum, allows it to pocket enormous profits no matter what it does. Enforcing the rules can be done; it just costs money. Not enforcing the rules has costs, too. They just end up on society’s balance sheet, not Facebook’s.
