The White House and Facebook got into a feud over the past few days after President Biden and aides repeatedly accused the company of doing too little to combat Covid vaccine misinformation. On Thursday, US surgeon general Vivek Murthy released an advisory about health misinformation that included some fairly banal observations about how fake news spreads on social media. Then, White House spokeswoman Jen Psaki chastised Facebook for not banning users who violate its policies on vaccine content. Finally, on Friday, Biden was asked by a reporter what his message to platforms like Facebook was, and he replied, “They’re killing people.” The company responded testily, accusing the White House of scapegoating it and insisting that it has saved lives by displaying accurate Covid information. (On Monday, Biden said he had been referring to vaccine deniers, not Facebook, when he made the “killing” comment.)
If there’s one sentence encapsulating how dysfunctional the whole debate is, it’s this, from a Sunday Wall Street Journal article: “‘The reality is that misinformation is still spreading like wildfire in our country, aided and abetted by technology platforms,’ Dr. Murthy said on Fox News Sunday.”
It isn’t what Murthy said there that matters—it’s where he said it. The surgeon general was warning about how misinformation leads to vaccine hesitancy and, thus, preventable Covid deaths. He was appearing on a network that encourages viewers to fear and distrust the vaccines every morning and night. And yet Murthy aimed his criticism not at Fox, but at social media.
Vaccine hesitancy in the US is heavily partisan. As of early July, the average vaccination rate was nearly 12 percentage points higher in counties that voted for Biden than in counties that voted for Trump. This was hardly inevitable; before the pandemic, vaccine skepticism was a fringe phenomenon that didn’t skew along party lines. What changed was signaling from powerful conservatives. One recent experiment found that showing a clip of Donald Trump praising Covid vaccines made self-identified Republicans more likely to say they intend to get vaccinated. But that sort of messaging from conservative elites is rare. Instead, the dominant message is one of distrust.
Tucker Carlson, the host of the most popular show on cable news, with nearly 3 million nightly viewers, regularly brings vaccine skeptics on the air. He has said on-air that college students “shouldn’t get the shot,” and he recently referred to a Biden administration proposal to send people door-to-door to encourage vaccination as “the greatest scandal in my lifetime, by far.”
A Fox News spokesperson pointed out that several on-air personalities have recently promoted vaccines. On Monday morning, Steve Doocy, a host of Fox & Friends, urged viewers to “get the shot.” (Cohost Brian Kilmeade immediately pushed back, saying, “You make your own decision. It’s available to everyone. We’re not doctors.”) But such moments seem to be exceptions to a broader trend. The network’s most popular hosts are far less likely to encourage vaccination. According to the liberal watchdog group Media Matters, nearly 60 percent of vaccine-related segments on Fox News over a recent two-week period undermined the vaccination effort, usually by portraying it as government tyranny, medically risky, or both.
It is, in short, very hard to argue that any social network has anything close to the impact of Fox News and other conservative media on vaccine hesitancy. Even the evidence incriminating Facebook tends to implicate Fox more strongly: Data from CrowdTangle, a Facebook-owned analytics tool, shows that right-wing media figures, including those who sow doubt about vaccines, consistently drive the most engagement on the platform.
“All my research is on things that social media platforms can do to make things better,” said David Rand, a professor at MIT and one of the authors of the study testing the impact of Trump praising vaccines. “But I think TV and radio, particularly conservative TV and radio, are essentially getting a free pass right now, even though they’re doing amazing harm.”
The Biden administration’s criticism of Facebook is a double win for Fox News. Not only does it draw attention away from the network’s own culpability for the vaccination gap, but it feeds a potent right-wing narrative about government and Big Tech colluding to silence conservatives. “I just think that this kind of coordination between big government and the big monopoly corporation, boy, that is scary stuff. And it really is censorship,” Missouri senator Josh Hawley said Thursday on—where else?—Fox News. That sense of outrage easily sustained conservative media throughout the weekend, with both pundits and Republican lawmakers weighing in on, as Ted Cruz put it, “their willingness to trample on free speech, to trample on the Constitution, to use government power to silence you, everything we feared they might do.”
It’s easy to see why the White House would spend political capital beating up on Facebook rather than Fox News: Facebook might actually listen. Biden has no leverage over right-wing media. When a Fox News host questions the safety or wisdom of vaccination, it isn’t a lapse in enforcement; it’s tonight’s programming. Many people at Facebook, by contrast, would prefer not to be responsible for poisoning America’s public health information environment.
Which, according to Facebook, they aren’t. In a blog post last week, Guy Rosen, Facebook’s vice president of integrity, argued that Facebook has been a force for good when it comes to vaccinations. He noted that “more than 2 billion people have viewed authoritative information about Covid-19 and vaccines on Facebook” since the start of the pandemic, while the company has “removed over 18 million instances of Covid-19 misinformation.” And, he claimed, Facebook has already complied with all eight of the surgeon general’s recommendations—which would include Murthy’s suggestion that companies “give researchers access to useful data to properly analyze the spread and impact of misinformation.”
In fact, Facebook notoriously does not provide access to the data needed to understand what’s happening on its platform. Notice, for example, that Rosen’s blog post doesn’t mention how many times users have seen unreliable information about Covid or vaccines. Facebook publicizes statistics about engagement with posts—likes, shares, and so on—but refuses to disclose data about “reach,” meaning how many people see a piece of content. Nor does it provide any concrete details about its efforts to reduce the spread of misinformation.
“The public has no idea what Facebook is or is not doing to combat vaccine misinformation, and doesn’t have any sense of how bad or not-bad the problem is,” said Rand, the MIT professor. “There’s lots of work being done within the company by lots of smart people to try to reduce the impact of misinformation, but they don’t really tell much about it.”
Rand said platforms like Facebook should partner with outside researchers on empirical studies about what does and doesn’t work to combat vaccine misinformation—and publicize the results. He noted that Facebook is sitting on enough data to measure how exposure to posts about vaccines affects real-world behaviors. “They’re doing randomized controlled trials on vaccine misinformation every day, they just don’t think of it that way,” he said.
The irony is that, by providing some insight into how it approaches the problem, Facebook seems to have wandered into the worst possible balance between transparency and secrecy. YouTube makes comparatively little information available to researchers, helping it fly under the political and regulatory radar despite its massive importance. Facebook, meanwhile, provides just enough data through CrowdTangle for researchers and reporters to bludgeon the company—but then conceals the evidence that it claims would vindicate it.
“They’ve sort of painted themselves into a corner by giving enough data to make them look bad, but then saying, ‘Well, behind closed doors we have data that makes us look OK,’” said Jenny Allen, a doctoral student at MIT who is researching the comparative influence of social media and TV news. “It’s the worst of both worlds.”
Facebook could learn something from the vaccine development process itself. The reason it’s possible to talk about vaccine “misinformation” in the first place—why it isn’t purely a matter of opinion—is that the data behind the vaccines’ efficacy and risks have all been made public. No one with any sense will believe Facebook’s claims about its own public health interventions until it does likewise.