Facebook Will Crack Down on Anti-Vaccine Content

As Clark County, Washington, combats an ongoing measles outbreak, Facebook announced Thursday that it is diminishing the reach of anti-vaccine information on its platform. The company will no longer allow such content to be promoted through ads or recommendations, and it will make it less prominent in search results. The social network will not take down anti-vaccine posts entirely, however. Facebook also said it was exploring ways to give users more context about vaccines from “expert organizations.”

The decision was widely anticipated: Facebook, like YouTube and Amazon, has faced criticism from journalists and lawmakers in recent weeks for allowing vaccine misinformation to flourish on its platform. Facebook also told media outlets in February that it was looking into how to address anti-vaccination content.

Last month, Adam Schiff, a Democratic representative from California, sent letters to the CEOs of YouTube and Facebook demanding they answer questions about the spread of anti-vaccine information on their companies’ platforms. He followed up with a similar letter to Amazon CEO Jeff Bezos last week. On Wednesday, an 18-year-old from Ohio testified before the Senate that his mother, who had opted not to inoculate him, got her misinformation about vaccines primarily from Facebook. (A major study released Monday found no link between autism and the MMR vaccine, which protects against measles, mumps, and rubella.)

In a blog post written by Monika Bickert, Facebook’s vice president of global policy management, the company said it will begin rejecting ads that include false information about vaccinations. It has also removed targeting categories such as “vaccine controversies” from its advertising tools. Last month, the Daily Beast reported that more than 150 anti-vaccine ads had been bought on Facebook, many of them targeting women over 25; some were shown to users “interested in pregnancy.” In total, the ads were viewed at least 1.6 million times. YouTube similarly announced last month that it would stop running ads on videos featuring anti-vaccine content.

Facebook will also reduce the ranking of pages and groups that spread misinformation about vaccines, both in search results and in its News Feed. In February, The Guardian found that anti-vaccination propaganda on Facebook often ranked higher than, and outperformed, accurate information from more reliable sources.

The social network’s effort to fight vaccine disinformation extends to Instagram, where the company says it will stop recommending content that includes vaccine misinformation on the app’s Explore page. Instagram will also stop displaying vaccine misinformation in hashtag search results. It’s not clear how long these new controls will take to roll out: an Instagram search for #vaccine on Thursday afternoon surfaced the hashtag #vaccineskill as the top result, for instance. Last month, Pinterest received praise for its decision to stop displaying search results for vaccines entirely, even medically accurate ones. (Pinterest had already banned “anti-vaccination advice” from its platform in 2017.)

As The Atlantic has pointed out, the majority of anti-vaccination content on Facebook appears to originate from only a handful of fringe sources. It likely won’t require a herculean effort for Facebook to tackle this strain of misinformation. The question is why the company waited until it became the subject of media reports and criticism from lawmakers to finally act.

Facebook increased its efforts to fight false information more broadly on the platform in the wake of the 2016 presidential election, including with initiatives like third-party fact-checking. The company admits it won’t catch everything, and demonstrably fake stories still go viral. While there is little public data about user behavior on Facebook, researchers have found signs that the reach of fake news declined between the 2016 presidential election and the 2018 midterms. (Though they also say there remains plenty to be concerned about when it comes to misinformation.)

It’s not yet clear whether the proliferation of anti-vaccination content online has led to a significant decrease in vaccination rates in the United States. Unscientific information about vaccines has been circulating on- and offline for well over a decade. But as Slate has pointed out, the share of children under 3 who have received their first dose of the MMR vaccine has remained steady for years, according to data from the Centers for Disease Control and Prevention. The World Health Organization named vaccine hesitancy one of its “ten threats to global health in 2019,” but it cites “complacency and inconvenience in accessing vaccines” as two of the key reasons people choose not to vaccinate, along with “lack of confidence.”

Still, there is little doubt that social media platforms, Facebook chief among them but also YouTube and Amazon, have made anti-vaccination talking points accessible to wider audiences. Proponents of this misinformation were aided by recommendation and search-ranking algorithms, which often promoted anti-vax content to the top of the pile. Facebook’s announcement today is further acknowledgment of its role in that ecosystem, and of the idea that free speech is not the same as free reach.
