Google Search Labeled the California GOP as Nazis, But It's No Conspiracy

If you Googled the California Republican Party earlier this week, the so-called "knowledge panel" that's supposed to surface the most relevant results would have told you that the party's primary ideologies are conservatism, market liberalism, and, oh, Nazism.

Conservatives have been quick to point fingers at Google and other tech giants, citing the result as yet another example of perceived liberal bias in Silicon Valley. In a tweet Thursday, House Majority Leader Kevin McCarthy posted a screenshot of the result, first reported by Vice, and called it a "disgrace." In reality, though, the result has far less to do with any widespread scheme within Big Tech to defame the Republican Party than with Google's imperfect reliance on Wikipedia and other easily manipulated open platforms to populate its search results.

According to a Google spokesperson, the Wikipedia page for the California Republican Party was "vandalized" so that Nazism was listed as one of its core ideologies. Wikipedia's change logs confirm that assertion, and show that the edit was live from May 24 to May 30. Because Google scrapes Wikipedia to populate the knowledge panel, the short-lived change slipped into search results.
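Wikipedia's revision history is public, so anyone can verify a window like that one. As an illustration that goes beyond the article's reporting, here is a minimal Python sketch that pulls a page's recent edits from the standard MediaWiki API; the endpoint and parameters are real, but the snippet is only a hypothetical example, not anything Google or WIRED runs.

    # Minimal sketch: list a Wikipedia page's recent edits via the MediaWiki API.
    # Hypothetical example code, not part of WIRED's reporting or Google's systems.
    import requests

    API = "https://en.wikipedia.org/w/api.php"

    def recent_revisions(title, limit=20):
        """Return (timestamp, user, comment) for the page's latest edits."""
        params = {
            "action": "query",
            "prop": "revisions",
            "titles": title,
            "rvlimit": limit,
            "rvprop": "timestamp|user|comment",
            "format": "json",
        }
        data = requests.get(API, params=params, timeout=10).json()
        page = next(iter(data["query"]["pages"].values()))
        return [(r["timestamp"], r.get("user", "?"), r.get("comment", ""))
                for r in page.get("revisions", [])]

    for timestamp, user, comment in recent_revisions("California Republican Party"):
        print(timestamp, user, comment)

Run against the "California Republican Party" page, it prints the timestamp, editor, and edit summary for each recent revision, which is the same trail that pins the vandalism to May 24 through May 30.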

"We have systems in place that catch vandalism before it impacts search results, but occasionally errors get through, and that's what happened here," the spokesperson said, adding that when the issue was brought to the company's attention, they removed it. "This was not the result of any manual change by anyone at Google. We don't bias our search results toward any political party."

Representative McCarthy's office did not respond to WIRED's request for comment, which included an explanation of the error.

This is certainly not the first time problematic results have shown up in both Google's search results and its autocomplete suggestions. Earlier this year, WIRED found that typing "Islamists are" into Google turned up the suggested search "Islamists are evil," and typing "Hitler is" turned up the suggestion "Hitler is my hero." Safiya Noble, a professor at the University of Southern California, wrote an entire book on disturbing and biased search results, Algorithms of Oppression.

For Google, the only way to sift through an entire planet full of information is to devise algorithmic ways to find the best content human beings have produced online. But often, it still surfaces the worst.

Even knowing these risks, the tech industry writ large has leaned on platforms like Wikipedia to solve the problem of filtering out so much human-generated garbage. Earlier this year, Google's sibling company YouTube announced that it would begin publishing so-called "information cues" alongside conspiracy theory videos. Those cues include content pulled directly from Wikipedia that, ideally, debunks the conspiracy theory. Facebook, meanwhile, is testing a button that allows users reading an article to get additional context about the topic, some of it lifted from Wikipedia as well.

You don't need to look far to see how all of this can go terribly wrong. Wikipedia may be the world's most successful attempt at crowdsourcing knowledge, but its pages are also susceptible to abuse, mischaracterization, and arbitrary changes. In March, for instance, WIRED reported that the Wikipedia page for the New York Daily News categorized it as a centrist paper. On Thursday, it showed up as "populist." The proudly far-right site Breitbart News, meanwhile, carries no "political alignment" label at all, because Wikipedia reserves that distinction for physical newspapers rather than websites.

Silicon Valley's leaders have held tight to the idea that they shouldn't be the arbiters of truth. YouTube doesn't want to censor conspiracy theorists, nor does Facebook want to ban the users who spread them. Wary of the exact charges of bias they're now facing, tech companies have offloaded the work of discerning the truth. The approach has backfired badly, and not for the first time.

It's clear in this case that Google's algorithms screwed up. But the mistake isn't evidence of internal bias at Google, and the fix didn't come from a cover-up or a conspiracy. Spreading the misleading notion that it was either one only perpetuates the very problem tech companies are already trying, and failing, to fix.

Additional reporting by Louise Matsakis.
