By Cormac Keenan, Head of Trust & Safety, TikTok

Today we're announcing refreshed Community Guidelines with additional details about what's allowed on TikTok. Our Community Guidelines support the authentic and entertaining TikTok experience that people know and enjoy. They define a common code of conduct and encourage a welcoming community environment. We continually review and strengthen our policies to help everyone feel comfortable and safe to create and share on TikTok.

Promoting inclusive policies

At TikTok, safety isn't a nice-to-have or an afterthought; it's central to all our work, and our teams strive to be inclusive and thoughtful when developing our policies. Our guidelines apply to everyone and all content on TikTok and broadly cover 10 categories of content. This update adds more specifics to each area based on behavior we've seen on the platform, feedback we've heard from our community, and input from academics and civil society organizations.

While much of this content was covered by our previous guidelines, we're highlighting some of the key areas we've strengthened to better support the well-being of our community.

  • We want our community to feel comfortable and confident expressing themselves exactly as they are. Our updated guidelines incorporate feedback and language used by mental health experts to improve our policies on self-harm and suicide content and avoid normalizing self-injury behaviors. Our policy on eating disorder content now includes additional considerations that prohibit normalizing or glorifying dangerous weight loss behaviors.
  • We recognize the burden victims of abuse often face in managing their online presence. We've bolstered our policies on bullying and harassment and our guidelines are now more explicit about the types of content and behaviors that aren't welcome on TikTok, including doxxing, cyberstalking, and a more extensive policy against sexual harassment.
  • The safety of everyone in our community is of utmost importance, especially the well-being of youth. In line with our existing dangerous acts policy, we work to limit, label, or remove content that depicts dangerous acts or challenges. Now, we've added a harmful activities section to our minor safety policy to reiterate that content promoting dangerous dares, games, and other acts that may jeopardize the safety of youth is not allowed on TikTok. We encourage people to be creative and have fun, but not at the expense of their own safety or the safety of others.
  • TikTok stands firmly against violence, both online and off. We've updated our previous dangerous individuals and organizations policy to focus more holistically on the issue of violent extremism. Our guidelines now describe in greater detail what's considered a threat or incitement to violence and the content and behavior we prohibit.

As we develop inclusive policies, we continually work to make TikTok more accessible for everyone. We recently announced new tools to support people with photosensitive epilepsy, and we're starting to roll out a text-to-speech feature that converts typed text into a voice-over that plays as the text appears in a video.

New resources to support well-being

As we navigate challenging subjects like self-harm, compassion for survivors is front of mind. Over the coming week, we'll roll out updated resources to support people who may be struggling. These resources were created with guidance from leading behavioral psychologists and suicide prevention experts, including Providence, Samaritans of Singapore, Live for Tomorrow, and members of our US Content Advisory Council. Now, if someone searches for terms like "selfharm" or "hatemyself" they'll see evidence-based actions they can take. In Australia, access to Lifeline and Kids Helpline is available for those looking for emotional support.

We're also introducing opt-in viewing screens on top of videos that some may find graphic or distressing. These types of videos are already ineligible for recommendation into anyone's For You feed, and this feature aims to further reduce unexpected viewing of such content by offering viewers the choice to skip the video or watch it.

We continue to develop tools to help people manage their TikTok experience, from automatically filtering unwanted comments to the ability to say "not interested" on videos in their For You feed. This is especially important in our efforts to support people who want to share their story and use their voice to raise awareness on topics others may find triggering. 

Since the start of the pandemic, TikTok has provided access to public health information from experts in our app and relief for frontline workers and families. We're proud when our community comes together through memorable challenges, like #wipeitdown, #heapsgood, and #happyathome, that connect us and bring joy during difficult times. As COVID-19 vaccines are developed and approved, we're furthering our efforts to support the well-being of our community by making authoritative information about vaccines readily available. At the same time, we continue to remove misinformation about the coronavirus and vaccinations.

Over the coming week, our in-app coronavirus resource hub will be updated with commonly asked questions and answers about COVID-19 vaccines from public health experts. Our coronavirus resource hub is accessible from our Discover page, search results, and banners on COVID-19 and vaccine-related videos, and has been viewed over 2 billion times globally in the past six months. We're also partnering with Team Halo so that scientists all over the world can share the progress being made on the vaccine through video updates.

Educating our community 

We work to educate and empower our community on our policies through in-app videos, notifications, and safety tools. This month, when users open TikTok, they'll be prompted to review these refreshed guidelines.

[Embedded video from @tiktoknewsroom: "We're updating our Community Guidelines. Watch to learn some of what's new!" ♬ original sound - TikTok Newsroom]

Keeping our community safe is a commitment with no finish line. We recognize the responsibility we have to our users to be nimble in our detection and response when new kinds of content and behaviors emerge. To that end, we'll keep advancing our policies, developing technology to automatically detect violative content, building features that help people manage their TikTok presence and content choices, and empowering our community to help us foster a trustworthy environment. Ultimately, we hope these updates enable people to have a positive and meaningful TikTok experience.