By Cormac Keenan, Head of Trust and Safety
Today we're announcing updates to our Community Guidelines to further support the well-being of our community and the integrity of our platform. Transparency with our community is important to us, and these updates clarify or expand upon the types of behavior and content we will remove from our platform or make ineligible for recommendation in the For You feed. We routinely strengthen our safeguards so that TikTok can continue to bring people together to create, connect, and enjoy community-powered entertainment long-term.

Building a safe and secure entertainment platform
At TikTok, we believe people should be able to express themselves creatively and be entertained in a safe, secure, and welcoming environment. Our Community Guidelines support that by establishing a set of norms so that people understand what kinds of content to create on our platform and viewers know what to report to us. Our policies are designed to foster an experience that prioritizes safety, inclusion, and authenticity. They take into account emerging trends or threats observed across the internet and on our platform. We also listen to feedback from our community, our APAC Safety Advisory Council, and other experts in areas like digital safety and security, content moderation, health and well-being, and adolescent development.
Some of the main updates we're announcing today and implementing over the next few weeks include:
- Strengthening our dangerous acts and challenges policy. We continue to enact the stricter approach we previously announced to help prevent content promoting dangerous acts and challenges - including suicide hoaxes - from spreading on our platform. This guidance previously sat within our suicide and self-harm policies, but it will now be highlighted in a separate policy category with more detail so it's even easier for our community to familiarize themselves with these guidelines.
- Broadening our approach to eating disorders. While we already remove content that promotes eating disorders, we will now also remove content that promotes disordered eating. We're making this change, in consultation with eating disorders experts, researchers, and physicians, because we understand that people can struggle with unhealthy eating patterns and behaviors without having an eating disorder diagnosis. Our aim is to acknowledge more symptoms, such as overexercise or short-term fasting, that are frequently under-recognized signs of a potential problem. This is an incredibly nuanced area that's difficult to consistently get right, and we're working to train our teams to remain alert to a broader scope of content.
- Adding clarity on the types of hateful ideologies prohibited on our platform. This includes deadnaming, misgendering, and misogyny, as well as content that supports or promotes conversion therapy programs. Though these ideologies have long been prohibited on TikTok, we've heard from creators and civil society organizations that it's important to be explicit in our Community Guidelines.
- Expanding our policy to protect the security, integrity, availability, and reliability of our platform. This includes prohibiting unauthorized access to TikTok, as well as TikTok content, accounts, systems, or data, and prohibiting the use of TikTok to perpetrate criminal activity. In addition to educating our community on ways to spot, avoid, and report suspicious activity, we're opening state-of-the-art cyber incident monitoring and investigative response centers in Washington DC, Dublin, and Singapore this year. TikTok's Fusion Center operations enable follow-the-sun threat monitoring and intelligence gathering, as we continue working with industry-leading experts to test and enhance our defenses.
Additionally, our community can find more information about the types of content that are ineligible for recommendation in the For You feed in our Community Guidelines.