The Facebook Oversight Board is often described as a “Supreme Court” for Facebook. On Wednesday, it acted like it—issuing a fine-grained ruling that punts the hardest question posed to it back down for Mark Zuckerberg to deal with.
The issue before the board, in case you haven’t turned on the news or checked Twitter this week, was whether to uphold Facebook’s indefinite ban of Donald Trump’s account following his role in inciting the January 6 riot at the Capitol. It was, by far, the most hotly anticipated decision in the Oversight Board’s young existence. Since the company referred the case on January 21, the board has received more than 9,000 public comments on the matter. As of Wednesday, the Trump ban remains in place—but the decision still isn’t final.
Specifically, Facebook asked the Oversight Board to decide:
Considering Facebook’s values, specifically its commitment to voice and safety, did it correctly decide on January 7, 2021, to prohibit Donald J. Trump’s access to posting content on Facebook and Instagram for an indefinite amount of time?
The board’s answer was yes—and no. Yes, Facebook was right to suspend Trump’s account; no, it was wrong to do so indefinitely. “In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities,” the board wrote in its decision. “The Board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.” In other words, Facebook must decide whether to let Trump back immediately, place a clear end date on his suspension, or kick him off its platforms forever.
While the board took Facebook to task for refusing to take a clearer stand, it also endorsed the immediate logic of the takedown. The original decision to deactivate Trump’s account was made under extraordinary circumstances. With the violent attack on the US Capitol still raging, Trump made a series of posts, including a video, in which he told his followers to go home—but in which he also repeated the false claim that the election had been stolen, the very idea motivating his rioting supporters. “This was a fraudulent election, but we can’t play into the hands of these people,” he said in the video. “We have to have peace. So go home. We love you. You’re very special.” By the next day, Facebook had taken the posts down and suspended Trump entirely from its platform, as well as from Instagram. (Twitter and YouTube did likewise.)
It was clear all along that the content of the offending posts was far from Trump’s most egregious—he was, after all, at least telling the rioters to go home—and didn’t obviously violate any clear rule. Trump had been using Facebook to broadcast the stolen-election myth for months. What had changed was not Trump’s online behavior, then, but the offline consequences of it. In a blog post explaining Facebook’s decision, Mark Zuckerberg tacitly recognized as much. “We removed these statements yesterday because we judged that their effect—and likely their intent—would be to provoke further violence,” he wrote. While the platform previously tolerated Trump, “the current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government.” Trump would remain banned “indefinitely and for at least the next two weeks until the peaceful transition of power is complete.”
The decision was a striking departure from Facebook’s normal approach to moderation in two ways. First, the company explicitly looked not just at the content of the posts but at the real-world context. Second, it departed from its “newsworthiness” rule, which generally gives political leaders extra leeway to break the rules on the theory that people deserve to know what their leaders have to say.
The Oversight Board strongly endorsed both of those decisions. “At the time of Mr. Trump’s posts, there was a clear, immediate risk of harm and his words of support for those involved in the riots legitimized their violent actions,” it wrote. “Given the seriousness of the violations and the ongoing risk of violence, Facebook was justified in suspending Mr. Trump’s accounts.” In other words, Facebook was right to take into account both the size of Trump’s megaphone and the risk of real-world harm.
In its policy recommendations, the board advised the company to apply those two ideas moving forward. “While the same rules should apply to all users, context matters when assessing the probability and imminence of harm,” it wrote. This echoes the advice of many outside organizations, from human rights groups like Witness to the libertarian Cato Institute, that submitted comments urging Facebook to stop giving powerful public figures special privileges. “If a speaker’s status as a government official or celebrity makes their every utterance newsworthy, the newsworthiness exception serves as a formalization of Donald Trump’s claim that ‘when you’re a star, they let you do it,’” wrote Will Duffield, a policy analyst at the Cato Institute.
While Facebook must honor the board’s instructions on Trump’s account, it is not obligated to follow its policy suggestions. Whether it does or not may ultimately be more consequential than how it disposes of the Trump question.
“What the board says about the treatment of public figures and politicians more generally, and especially around the world, is only a recommendation to Facebook,” said Evelyn Douek, a doctoral student at Harvard Law School who studies platform moderation, on Tuesday. “I suspect no one will pay attention because everyone is so focused on Trump. He is the black hole into which all our attention gets sucked. But to me, that’s the far more interesting question and has global ramifications.”
As for Trump, the world will have to wait for an answer: The Oversight Board has given Facebook six long months to decide what to do. It’s possible that the final decision won’t matter terribly either way. Deplatforming Trump has been strikingly effective at pushing him out of the national dialog—one January analysis found a 73 percent drop in online misinformation about election fraud following the ban—but the reason for that is much less about Facebook than about Twitter, which has already indicated that it won’t let Trump back on. Twitter was always Trump’s true home on social media, because it gave him something any politician craves: media attention. Trump’s bizarre Twitter use—lobbing personal insults, winkingly amplifying conspiracy theories, announcing administration personnel changes, threatening foreign leaders—was irresistible to journalists, who make up a wildly disproportionate share of heavy Twitter users. It helped him hijack the media attention span in a way that is hard to picture happening via other platforms.
Not that he won’t try. This week, Trump launched “From the Desk of Donald Trump,” a new platform for the former president to blast his unmediated thoughts into the world. Picture Twitter, but Trump’s is the only account on it. (So far, at least.) You might call it … a blog.
The biggest obstacle Trump faces to reaching his previous levels of zeitgeist penetration: He is no longer president. During his administration, the inevitability of a Trump tweet was such that cable news channels built templates to be able to quickly swap in the latest missive and do a segment on it. They could build those templates for Trump’s blog posts, but they probably won’t, because he is no longer the most powerful man in the world. (That might change if he runs for office again in 2024, but let’s take things one day at a time.)
Regardless of Facebook’s final decision, the fact that the Oversight Board upheld even a temporary ban on a head of state should send a message to other leaders, both in the US and around the world.
“It’s also possible that it incentivizes those politicians to be more careful with what they post on Facebook, if they know that they might lose their accounts—and permanently—because their accounts are important to them and important for reaching their constituents,” Douek said.
“It kind of says politicians’ accounts are fair game.”