
Twitch's First Transparency Report Is Here—and Long Overdue

Twitch today released its first-ever transparency report, detailing its efforts to safeguard the 26 million people who visit its site daily. When it comes to transparency, the decade-old, Amazon-owned service had a lot of catching up to do.

Twitch benefited from a 40 percent increase in channels between early and late 2020, buoyed by the popularity of both livestreaming technology and video gaming throughout the pandemic. That explosive growth, however, is also the company's greatest challenge when it comes to stamping out harassment and hate. Unlike recorded videos, live content is often spontaneous and ephemeral. Things just happen, in front of live audiences of thousands or tens of thousands. That can include anything from 11-year-olds going live playing Minecraft (exposing them to potential predators) to now-banned gaming celebrity Guy "Dr Disrespect" Beahm streaming from a public bathroom at E3.

In its new transparency report, Twitch acknowledges this difficulty and for the first time offers specific details about how well it moderates its platform. While the findings are encouraging, what Twitch historically has not been transparent about speaks just as loudly.

Twitch early on earned a reputation as a hotbed for toxicity. Women and minorities streaming on the platform received targeted hate from audiences hostile to people who, they believed, deviated from gamer stereotypes. Twitch's vague guidelines around so-called "sexually suggestive" content served as fuel for self-appointed anti-boob police to mass-report female Twitch streamers. Volunteer moderators watched over Twitch's fast-moving chat to pluck out harassment. And for problematic streamers, Twitch relied on user reports.

In 2016, Twitch introduced an AutoMod tool, now enabled by default for all accounts, that blocks what its AI deems inappropriate messages from viewers. Like other large platforms, Twitch also relies on machine learning to flag potentially problematic content for human review. Twitch has invested in human moderators to review flagged content, too. Still, a 2019 study by the Anti-Defamation League found that nearly half of Twitch users surveyed reported facing harassment. And a 2020 GamesIndustry.biz report quoted several Twitch employees describing how executives at the company didn't prioritize safety tools and were dismissive of hate speech concerns.

Throughout this time, Twitch didn’t have a transparency report to make its policies and inner workings clear to a user base suffering abuse. In an interview with WIRED, Twitch’s new head of trust and safety, Angela Hession, says that, in 2020, safety was Twitch’s “number one investment.”

Over the years, Twitch has learned that bad-faith harassers can weaponize its vague community standards, and in 2020 it released updated versions of its "Nudity and Attire," "Terrorism and Extreme Violence," and "Harassment and Hateful Conduct" guidelines. Last year, Twitch appointed an eight-person Safety Advisory Council, consisting of streamers, anti-bullying experts, and social media researchers, to draft policies aimed at improving safety, moderation, and healthy streaming habits.

Last fall Twitch brought on Hession, previously the head of safety at Xbox. Under Hession, Twitch finally banned depictions of the confederate flag and blackface. Twitch is on fire, she says, and there’s a big opportunity for her to envision what safety looks like there. “Twitch is a service that was built to encourage users to feel comfortable expressing themselves and entertain one another,” she says, “but we also want our community to always be and feel safe.” Hession says that Twitch has increased its content moderators by four times over the last year.

Twitch's transparency report serves as a victory lap for its recent moderation efforts. AutoMod or active moderators touched over 95 percent of Twitch content throughout the second half of 2020, the company reports. Reports of harassment received via Twitch direct message decreased by 70 percent in that same period. Enforcement actions increased from 788,000 in early 2020 to 1.1 million in late 2020, which Twitch says reflects its increase in users. User reports increased during this time, too, from 5.9 million to 7.4 million, which Twitch again attributes to its growth. The same goes for its channel bans, which increased from 2.3 million to 3.9 million.

Twitch noted it had sent 2,158 tips to the National Center for Missing and Exploited Children in 2020 (a 65 percent increase between early and late 2020), but had escalated instances of violence to law enforcement just 38 times. Between early and late 2020, Twitch processed 37 percent more subpoena requests, for a total of 226 throughout the year.

After building up its robust community and culture, Twitch seems to have made a recent and heavy back-end investment in moderation after years of challenges relating to harassment and hate. “Our approach is building the foundation—when you’re thinking about a social service—of what your guidelines are going to be along with the technology,” says Hession.

Twitch also will work on its "off-services" policies, which will guide how Twitch investigates and acts on situations where users are implicated in serious crimes, like sexual assault. It comes a little late: Last year, dozens of women came forward with allegations of inappropriate or harmful behavior against Twitch streamers. And it's not a new issue; streamers have for years leveraged their platforms to groom young, impressionable fans.

In an interview with WIRED, Hession would not share how many moderators Twitch employs or whether they are full-time or contractors, like the majority of Facebook's. (The transparency report does note that mods are able to view potentially traumatizing content muted and in black and white, and that they receive wellbeing services.) And while Hession agreed that it is important for both streamers and mods to know whether or why bans happen, she would not share why Twitch banned Guy "Dr Disrespect" Beahm nearly eight months ago. "I can't comment on Dr Disrespect for privacy reasons," says Hession, "but we try to be as clear as possible and create that level of trust." Beahm has said he does not know why he was banned.

Hession was also vague about areas where Twitch would take accountability, or admit that it had erred. "I believe this is a journey and there's definitely not an end date," she says. "We're very humble in saying we could always do better. If there's one infraction where we don't prevent a child from streaming or we feel there's toxicity, we will always need to do better." She pointed to the transparency report, adding that Twitch is taking public, detailed accountability.

As Twitch has gone mainstream, nearly every type of person has used it. And the platform appears invested in caring for this burgeoning and diverse user base. At the same time, part of what made Twitch so big in the first place was its tolerance of the worst aspects of gamer culture. Systems can be added in; culture cannot be so easily extinguished.

Twitch will release an updated transparency report twice a year, offering an opportunity to hold itself to the standards that it has established. Hession points out that transparency reports are typically written with regulators and advertisers in mind. "However, we always think about the community as well," she says.
