by Julie de Bailliencourt, Global Head of Product Policy, TikTok

Today we are refreshing our Community Guidelines. These are the rules and standards for being part of the TikTok community, which is now more than 150 million people in the United States and more than 1 billion worldwide. These rules apply to everyone and everything on our platform.

As part of this, for the first time, we're sharing TikTok's Community Principles to help people understand the decisions we make to keep TikTok safe and to build trust in our approach. These principles are grounded in our commitment to uphold human rights and aligned with international legal frameworks.

These principles guide our decisions about how we moderate content, so that we can strive to be fair in our actions, protect human dignity, and strike a balance between freedom of expression and preventing harm.

To inform the most comprehensive updates to our Community Guidelines to date, we consulted more than 100 organisations around the world, including our US Content Advisory Council, and members of our community. Their input helped us strengthen our rules and respond to new threats and potential harms. Some of the key changes are:

  • advancing our rules for how we treat synthetic media, which is content created or modified by AI technology;
  • adding 'tribe' as a protected attribute in our hate speech and hateful behaviour policies;
  • more detail about our work to protect civic and election integrity, including our approach to government, politician and political party accounts.

The new Community Guidelines will take effect on 21 April. Over the coming months, we will provide additional training to our moderators to help them enforce these updated rules and standards effectively as they roll out.

Based on the feedback we've heard, we've brought everything about our rules and standards together in one place, so that everyone from creators to researchers can find what they need. We've reorganised our rules thematically into topic areas (for example, Behavioural & Mental Health); for each, we first briefly explain what we don't allow, and then provide more detail, such as definitions and the range of actions we might take. We've also laid out the four pillars of our approach to moderation:

  1. remove violative content;
  2. age-restrict mature content so it is only viewed by adults (18 years or older); as a reminder, this content must still abide by our Community Guidelines;
  3. make content that isn't appropriate for a broad audience ineligible for recommendation in the For You feed;
  4. empower our community with information tools and resources to stay in control of their experience.

Today's update to our Community Guidelines also expands on our enforcement strategy by:

  • Sharing more information about the actions we take against accounts that violate our rules, following our update earlier this year, and clarifying that we do not allow the use of multiple accounts to intentionally bypass our rules or their enforcement.
  • Explaining the considerations we take into account when enforcing our rules in the public interest, and our approach to content that critiques public figures.
  • Including more detail about how we use informational labels, warnings, and opt-in screens.

We're proud to share these refreshed Community Guidelines, which offer our community much more transparency about our rules and how we enforce them. It takes a whole village to keep people safe online, so we're grateful to everyone in the TikTok community and to all of the external experts who have contributed and continue to help us advance our rules and stay a step ahead of emerging threats.

We believe that everyone deserves to feel safe online, and that feeling safe is key to unlocking imagination and creative expression. That's why we continue to invest in our work to keep TikTok a safe, inclusive and authentic home for our global community, so that they can create, discover and connect.