As TikTok ballooned in size and cultural influence over the past few years, the social video platform remained noticeably dinky compared to its Silicon Valley competitors in one area: its Community Guidelines. Whereas Twitter, Facebook, and YouTube have developed lengthy policies covering hundreds of circumstances, TikTok provided a short list of bullet points addressing only the most extreme content. Amid growing scrutiny from regulators over censorship and the app's Chinese ownership, the company pledged that more transparency would be forthcoming. Now, it’s here.
On Wednesday, TikTok released a comprehensive set of new Community Guidelines, effective immediately, that more closely resemble those of its peers. The rules are far more extensive than the previous bullet points, and are organized into 10 distinct categories covering everything from terrorist propaganda to hate speech to sexual content. Given the app’s famously young user base, it’s no surprise there’s a focus on child safety, including a new ban on videos depicting minors “consuming, possessing, or suspected of consuming alcoholic beverages, drugs, or tobacco,” which, yes, likely means vaping too. The company previously said it planned to invite outside experts to review some of its policies, but this revamp involved only internal staff, led by TikTok teams based in California.
In response to questions about the role its Chinese parent company plays in moderation decisions, TikTok has often stressed that it takes a localized approach: Different rules apply to different countries, where the local laws and customs vary. That remains true, but TikTok’s new Community Guidelines will be overarching and apply globally. The rules “are the basis of the moderation policies TikTok's regional and country teams localize and implement in accordance with local laws and norms,” Lavanya Mahendran and Nasser Alsherif, members of TikTok’s global trust and safety team, wrote in a blog post.
TikTok’s Community Guidelines follow the lead of the world’s biggest social media companies in many cases, and go even further in some instances. The app defines hate speech as attacking or inciting violence against people based on characteristics like their sexual orientation, gender identity, or immigration status. According to TikTok, that includes “promoting or justifying exclusion, segregation, or discrimination” against certain groups. The rules would appear to cover ideologies like white nationalism, which Facebook only recently decided it would no longer allow on its platform after significant backlash.
While many platforms ban depictions of illegal acts, TikTok specifically prohibits what it calls “underage delinquent behavior,” like minors consuming alcohol, drugs, or tobacco. Videos of teens using vaping devices from companies like Juul have become a meme on TikTok, but the new rules indicate the genre might not be long for this world. It’s not clear how the company will handle drinking and smoking age laws that vary from country to country, or jurisdictions where marijuana is legal.
TikTok now bans “the depiction, trade, or promotion of firearms,” except in circumstances like when they’re carried by officials like police officers, or “used in a safe and controlled environment such as a shooting range.” The policy appears to be stricter than that of either YouTube or Facebook, which ban selling guns and instructing people how to make them but not their mere depiction.
Users who break the rules will have their content deleted and receive a notification that they violated TikTok’s Community Guidelines. But the company won’t tell people which specific policy was broken, leaving them to interpret for themselves where they crossed the line. Beyond the new Community Guidelines themselves, TikTok hasn’t announced any changes to moderation staffing or other enforcement mechanisms.
TikTok, which revolves around sharing short video clips set to music, has often tried to position itself as a destination for things like goofy dancing videos, rather than news or political debate. But its new Community Guidelines are an acknowledgement of the fact that it is home to more than just lip-syncing clips. “Spending time on TikTok is meant to be rewarding and fun. That doesn't mean serious or controversial content doesn't have a place on our platform; ultimately, the platform is built to support our users and their diverse thoughts, experiences, and interests,” wrote Mahendran and Alsherif in their blog post.
Originally, TikTok took a far blunter approach to content moderation, seemingly in an effort to minimize any chance of controversy. Guidelines obtained by The Guardian last fall showed TikTok staff were instructed to censor topics deemed sensitive by the Chinese government, as well as LGBT content in some markets. TikTok said the instructions were outdated, and there is nothing restricting similar posts in its new rules.
But suspicion about the app remains, especially because TikTok is owned by the Chinese tech giant ByteDance. In November, the company received a wave of negative press after it erroneously blocked the account of an American teenager who had posted a viral video about the internment of over a million Muslims in China. TikTok later apologized to the girl and explained that a technical error was responsible, but the damage was largely already done.
The incident underscores the challenges ahead for TikTok. Last month, the company released its first transparency report, which disclosed government requests for user data but revealed nothing about how the company has enforced its own policies. Now that its new Community Guidelines are in place, users will be watching to see how well it implements them.