
A Trump Ban Is Easy. Fixing Facebook and Twitter Will Be Hard

Welcome back. It’s 2021 and the world is doing great! Er … 

The Plain View

For months—years, really—people have asked what it would take for Facebook and Twitter to ban the policy-violator-in-chief from their platforms. Hate speech, doxing, and dangerous disinformation on Covid evidently weren’t enough. Oh, they put (easy-to-ignore) warning labels on some tweets and posts, and even took the stray one down. But exiling him from the platforms? No. He is the president, after all. Long ago (well, 2015), when he said hateful things about Muslims that would boot mere mortals from Facebook and Twitter, the platforms decided his newsworthiness was more relevant than his toxicity. (They didn’t say this, but their own political interests were also in play—the president controls bodies that regulate those platforms.) They set some “limits,” but those limits never seemed to be invoked.

That changed this week, when Donald Trump dispatched a cosplay mob of thugs and toy soldiers to take the Capitol—and they actually did. While he gave the actual marching orders in person, the invaders who came to Washington were fed by Trump’s avalanche of false claims and incitements on social media, hardly mitigated by warning labels or notices that other, perhaps more reliable sources were reporting something else. And on Wednesday, as the Capitol Rotunda was being breached and the Electoral College count interrupted, Trump was tweeting love notes to the terrorists.

Jack Dorsey and Mark Zuckerberg made a battlefield decision: They temporarily suspended the president’s account. Millions of users have had multiday penalties for relatively tame outbursts. But actually triggering a violent riot that halted the certification of the next government? That would keep Trump off Twitter until the next morning. Facebook’s hold was initially for 24 hours.

The next day, Mark Zuckerberg took the larger step of banning Trump “indefinitely and for at least the next two weeks,” just in case he might trigger the actual overthrow of the United States government. That would mean no Facebook for The Donald until the safe arrival of the Biden administration (at which point all the president’s tweets will be produced not by a maniac, but by maniacally cautious wonks). Cynics also noted that it happened to be the same day the Senate turned blue, a development that provided some incentive for Facebook to ease its relentless pandering to conservatives. A friend of mine tweeted that the decision was like “kneeling down to end a football game.” (Twitter did not extend the suspension, and on Thursday evening Trump tweeted a video grudgingly admitting he might not be president after January 20. But I suspect he will probably gnaw through the short leash Twitter is presumably allowing him, and suffer a longer ban.)

Facebook might have run out the clock on Donald Trump’s posts—I predict a permanent ban at some point—but the episode is only one data point in a wider crisis of toxic expression on social platforms. A lot of attention has been paid to Section 230 of the 1996 Communications Decency Act, which allows platforms to moderate content without taking on legal responsibility for what users post. Many people in DC want to change or end that law. But the bigger question for Facebook and Twitter is, what kind of services do they want to be? Ones where comity rules, or ones where divisive wedges poison society? Saying they want to be all hearts and flowers doesn’t mean anything. The question is what they want to do to get there.

A November 2020 New York Times article reported some instances where Facebook tinkered with ways to reduce misinformation and generally awful content. One, in an effort to tamp down conspiracy lunacy right after the election, assigned what it called N.E.Q. (news ecosystem quality) scores to articles, with reliable journalism ranked higher than lies and fantasy. It made for a “nicer News Feed.” But after a few weeks the company stopped the ranking scheme. In another experiment, Facebook trained a machine-learning algorithm to identify the kind of posts that were “bad for the world” and then demoted those in people’s feeds. Indeed, there were fewer toxic posts. But people logged in to Facebook a bit less—and less time spent on Facebook is Mark Zuckerberg’s nightmare. The Times viewed an internal document where Facebook concluded, “The results were good except that it led to a decrease in sessions, which motivated us to try a different approach.”

I find that decision short-sighted. Maybe in the short term people would not log into Facebook quite so much. But that shortfall might challenge the company to concoct more wholesome features that would bring people back—and not feel so angry when they did use the service. Everyone would feel better, and fewer employees would threaten to quit because they feel that they are working for Satan.

When Facebook and Twitter began, neither founder suspected that their creations would be used to change public opinion, and certainly not to poison the body politic in the way Donald Trump did. The vision was to enrich people’s lives by letting them know what their friends were up to. But as their platforms grew, so did their ambitions. Zuckerberg set out to build Facebook as the ultimate personalized newspaper. Twitter positioned itself as “the Pulse of the Planet.”

In the past few years, however, it has been hard to look away from the consequences. The choice that the platforms face has little to do with what is legal, and everything to do with what is right. Time and time again, when explaining why someone terrible remains on the platform, Zuckerberg invokes the company’s policies. But Facebook has things backwards when it invokes its own rules, as if it were referring to a tablet that some wonky Moses handed down. The company should more methodically examine the results of its policies, which in many cases scream wrong. Typically, Facebook defends a given outcome until enough people get disgusted at what is allowed to happen on its platform. Then it makes a change. That happened with anti-vaxxers, Holocaust denial, and now Donald Trump’s attempts to destroy democracy.

For now, of course, Zuckerberg is right when he says, “The priority for the whole country must now be to ensure that the remaining 13 days and the days after inauguration pass peacefully and in accordance with established democratic norms.” But after that, Mark Zuckerberg and Jack Dorsey have—in a term both utter a lot—“a lot of work to do.”

Time Travel

In 2017, I spoke to Jack Dorsey about how Twitter handles Donald Trump. Here’s an excerpt from our conversation:

Steven Levy: Should Twitter hold a president accountable to the same standards as other users? At Facebook, Mark Zuckerberg reportedly told employees he was not going to censor a nominee’s—and then a president’s—posts. Did you have to make a decision on that?

Jack Dorsey: We hold all accounts to the same standards on our policy, and we want to make sure that independent of who you are or where you’re coming from, you understand the guidelines, what our policies are, and what that means….

If someone complained about a Trump tweet, would you conceivably say, “This is unacceptable,” and then block the President of the United States?

We are going to hold all accounts to the same standards. Our policy does [account for] newsworthiness as well, and that was requested by our policy team. So we’re not taking something down that people should be able to report on and actually show that this is what the source said. It’s really important to make sure that we provide that source for the right reporting, and to minimize bias in articles.

Ask Me One Thing

Sergio, writing from Berlin, though he lives in Luxembourg and is originally from Mexico (TMI!), says, “The word ‘privacy’ has become a buzzword that everyone uses nowadays whenever talking about the risks of Big Tech. However, I believe its definition has morphed and evolved during the last 10-15 years into something new. What would be your definition of ‘privacy’ in this hyperconnected, non-stop, data-mining, automated and personalized world?”

Hi, Sergio. I hope you are being safe in Berlin, and not missing Luxembourg too much. I agree that the challenges of privacy have changed in the past decade or two, particularly as more information about us gets routinely collected and used to monitor us, sell us things, and sometimes even to steal from us. But I don’t think that’s changed the definition of privacy, at least as I see it. Privacy is about keeping what’s personal to us out of the hands of people who don’t need to see or access it. It’s so much harder to do that in this hyperconnected, non-stop, data-mining, automated and personalized world. But it would certainly help if we had stronger laws to help us. For instance, I would like a clear opt-in before anyone tracks my movements on the web or shares any information they gather about me with anyone else. And I would like to see huge fines and maybe even criminal penalties for those who violate such a law.

You can submit questions to mail@wired.com. Write ASK LEVY in the subject line.

End Times Chronicle

Hieronymus Bosch meets Steve Bannon in the Capitol Rotunda. God help us all.

Last but Not Least

As Zuckerberg’s vision for Facebook evolved, he wrote his thoughts in a secret notebook.

Twenty-five years ago, two men made a bet on whether civilization would collapse. This year the bet came due.

And 16 years ago, Darpa hosted the first autonomous car challenge.

Chinese researchers have taught a robot dog to fend off humans. Shouldn’t it be the other way around?

Also, CES is next week, and this year you don’t have to go to Las Vegas to join in! Free to all is WIRED HQ, a bunch of (virtual) live sessions with WIRED editors, along with guests like Slack’s Stewart Butterfield, Salesforce’s Bret Taylor, Nobel winner Jennifer Doudna, and former Secretary of Defense Ash Carter (wonder what he’ll say). I will also be chatting about the coming year in tech policy with our senior writer Gilad Edelman … and maybe you? Check out this page for details and to sign up.

Don't miss future subscriber-only editions of this column. Subscribe to WIRED (50% off for Plaintext readers) today.
