By Cormac Keenan, Head of Trust and Safety
Today we're announcing updates to our Community Guidelines to further support the well-being of our community and the integrity of our platform. Transparency with our community is important to us, and these updates clarify or expand upon the types of behaviour and content we will remove from our platform or make ineligible for recommendation in the For You feed. We routinely strengthen our safeguards so that TikTok can continue to bring people together to create, connect, and enjoy community-powered entertainment for the long term.
Building a safe and secure entertainment platform
At TikTok, we believe people should be able to express themselves creatively and be entertained in a safe, secure, and welcoming environment. Our Community Guidelines support that by establishing a set of norms so that people understand what kinds of content to create on our platform and viewers know what to report to us. Our policies are designed to foster an experience that prioritises safety, inclusion, and authenticity. They take into account emerging trends or threats observed across the internet and on our platform. We also listen to feedback from our community, our European Safety Advisory Council, and other experts in areas like digital safety and security, content moderation, health and well-being, and adolescent development.
Some of the main updates we're announcing today and implementing over the next few weeks include:
- Strengthening our dangerous acts and challenges policy. We continue to enact the stricter approach we previously announced to help prevent such content, including suicide hoaxes, from spreading on our platform. This guidance previously sat within our suicide and self-harm policies, but it will now be highlighted in a separate policy category with more detail so it's even easier for our community to familiarise themselves with these guidelines.
- Broadening our approach to eating disorders. While we already remove content that promotes eating disorders, we'll start to also remove the promotion of disordered eating. We're making this change, in consultation with eating disorder experts, researchers, and physicians, as we understand that people can struggle with unhealthy eating patterns and behaviour without having an eating disorder diagnosis. Our aim is to acknowledge more symptoms, such as overexercise or short-term fasting, that are frequently under-recognised signs of a potential problem. This is an incredibly nuanced area that's difficult to consistently get right, and we're working to train our teams to remain alert to a broader scope of content.
- Adding clarity on the types of hateful ideologies prohibited on our platform. This includes deadnaming, misgendering, and misogyny, as well as content that supports or promotes conversion therapy programmes. Though these ideologies have long been prohibited on TikTok, we've heard from creators and civil society organisations that it's important to be explicit in our Community Guidelines.
- Expanding our policy to protect the security, integrity, availability, and reliability of our platform. This includes prohibiting unauthorised access to TikTok, as well as to TikTok content, accounts, systems, or data, and prohibiting the use of TikTok to perpetrate criminal activity. In addition to educating our community on ways to spot, avoid, and report suspicious activity, we're opening state-of-the-art cyber incident monitoring and investigative response centres in Washington DC, Dublin, and Singapore this year. TikTok's Fusion Center operations enable follow-the-sun threat monitoring and intelligence gathering, as we continue working with industry-leading experts to test and enhance our defences.
Additionally, our community can find more information about the content categories ineligible for recommendation into For You feeds. While the ability to discover new ideas, creators, and interests is part of what makes our platform unique, content in someone's For You feed may come from a creator they haven't chosen to follow or relate to an interest they haven't previously engaged with. That's why, when we come across content that may not be appropriate for a general audience, which includes everyone from teens to great-great-grandparents, we do our best to remove it from our recommendation system.
Every member of our community will be prompted to read our updated guidelines when they open our app in the coming weeks.
Staying accountable to our community
The strength of a policy lies in its enforceability. Our Community Guidelines apply to everyone and all content on TikTok, and we strive to be consistent and equitable in our enforcement. We use a combination of technology and people to identify and remove violations of our Community Guidelines, and we will continue training our automated systems and safety teams to uphold our policies.
To hold ourselves accountable to our community, NGOs, and others, we release Community Guidelines Enforcement Reports quarterly. Our most recent report, published today, shows that over 91 million violative videos were removed during Q3 2021, which is around 1% of all videos uploaded. Of those videos, 95% were removed before a user reported them, 88% before the video received any views, and 93% within 24 hours of being posted. We continue to expand our system that detects and removes certain categories of violations at upload, including adult nudity and sexual activities, minor safety, and illegal activities and regulated goods. As a result, the volume of automated removals has increased, which improves the overall safety of our platform and enables our team to spend more time reviewing contextual or nuanced content, such as hate speech, bullying and harassment, and misinformation.
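To give a rough sense of scale, the percentages above can be applied to the rounded 91 million figure. This is an illustrative back-of-the-envelope calculation only; the exact counts are in the Q3 2021 Community Guidelines Enforcement Report itself.

```python
# Illustrative arithmetic: applies the reported percentages to the
# rounded 91 million removals figure from Q3 2021.
removed_videos = 91_000_000        # violative videos removed in Q3 2021
share_of_uploads = 0.01            # "around 1% of all videos uploaded"

implied_total_uploads = removed_videos / share_of_uploads
proactive = removed_videos * 0.95  # removed before any user report
zero_views = removed_videos * 0.88 # removed before receiving any views
within_24h = removed_videos * 0.93 # removed within 24 hours of posting

print(f"Implied uploads in the quarter: ~{implied_total_uploads / 1e9:.1f} billion")
print(f"Removed before any report: ~{proactive / 1e6:.1f} million")
print(f"Removed before any views: ~{zero_views / 1e6:.1f} million")
print(f"Removed within 24 hours: ~{within_24h / 1e6:.1f} million")
```

In other words, a removal rate of about 1% implies on the order of nine billion uploads in the quarter, with the large majority of violative videos taken down before anyone flagged or even watched them.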
We've made significant strides to improve our policies and enforcement, including our efficacy, speed, and consistency, though we recognise there's no finish line when it comes to keeping people safe. We're driven by our passion to help everyone have a good and enriching experience on TikTok.