Author: Julie de Bailliencourt, Global Head of Product Policy, TikTok

TikTok is an entertainment platform powered by the creativity, self-expression, and heart that creators put into making authentic content. Our Community Guidelines establish and explain the behaviors and content that are not allowed on our platform, and when people violate our policies, we take action on their content and, when warranted, their account so we can keep our platform safe.

Most of our community members aim to follow our policies, but a small minority of people repeatedly violate them and don't change their behavior. Today, we're announcing an updated account enforcement system to better act against repeat offenders. We believe these changes will help us remove harmful accounts more quickly and efficiently, while creating a clearer and more consistent experience for the vast majority of creators who want to follow our policies.

Why we're updating the current account enforcement system

Our existing account enforcement system leverages different types of restrictions, like temporary bans from posting or commenting, to prevent abuse of our product features while teaching people about our policies in order to reduce future violations. While this approach has been effective in reducing harmful content overall, we've heard from creators that it can be confusing to navigate. We also know it can disproportionately impact creators who rarely and unknowingly violate a policy, while potentially being less effective at deterring those who repeatedly violate our policies. Repeat violators tend to follow a pattern: our analysis has found that almost 90% consistently violate using the same feature, and over 75% repeatedly violate the same policy category. To better address this, we're updating our account enforcement system to support our creator community and remove repeat offenders from our platform.

How the streamlined account enforcement system will work

Under the new system, if someone posts content that violates one of our Community Guidelines, the account will accrue a strike as the content is removed. If an account meets the strike threshold within either a product feature (e.g., Comments, LIVE) or a policy (e.g., Bullying and Harassment), it will be permanently banned. These thresholds can vary depending on a violation's potential to cause harm to our community members; for example, there may be a stricter threshold for violating our policy against promoting hateful ideologies than for sharing low-harm spam. We will continue to issue permanent bans on the first strike for severe violations, including promoting or threatening violence, showing or facilitating child sexual abuse material (CSAM), or showing real-world violence or torture. As an additional safeguard, accounts that accrue a high number of cumulative strikes across policies and features will also be permanently banned. Strikes will expire from an account's record after 90 days.
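To make the mechanics concrete, here is a minimal sketch of how such a strike ledger could work. The Python below is purely illustrative: the threshold values, policy and feature names, and helpers such as POLICY_THRESHOLDS and record_violation are hypothetical assumptions for this example, not TikTok's published implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# All values below are hypothetical; TikTok has not published the actual
# per-policy or per-feature thresholds.
STRIKE_TTL = timedelta(days=90)                 # strikes expire after 90 days
POLICY_THRESHOLDS = {"hateful_ideologies": 2, "spam": 5}
FEATURE_THRESHOLDS = {"comments": 4, "live": 4}
CUMULATIVE_THRESHOLD = 8                        # safeguard across all policies/features
SEVERE_POLICIES = {"violent_threats", "csam"}   # permanent ban on the first strike

@dataclass
class Strike:
    policy: str
    feature: str
    issued_at: datetime

@dataclass
class Account:
    strikes: list[Strike] = field(default_factory=list)
    banned: bool = False

def active_strikes(account: Account, now: datetime) -> list[Strike]:
    """Only strikes issued within the last 90 days count toward a ban."""
    return [s for s in account.strikes if now - s.issued_at < STRIKE_TTL]

def record_violation(account: Account, policy: str, feature: str, now: datetime) -> None:
    # Severe violations lead to a permanent ban on the first strike.
    if policy in SEVERE_POLICIES:
        account.banned = True
        return
    account.strikes.append(Strike(policy, feature, now))
    strikes = active_strikes(account, now)
    by_policy = sum(1 for s in strikes if s.policy == policy)
    by_feature = sum(1 for s in strikes if s.feature == feature)
    # Ban if any of the three thresholds is reached (defaults are arbitrary).
    if (by_policy >= POLICY_THRESHOLDS.get(policy, 3)
            or by_feature >= FEATURE_THRESHOLDS.get(feature, 4)
            or len(strikes) >= CUMULATIVE_THRESHOLD):
        account.banned = True
```

Checking three independent conditions on every new strike mirrors the three ban triggers described above: a per-policy threshold, a per-feature threshold, and the cumulative safeguard across both.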

Helping creators understand their account status

These changes are intended to drive more transparency around our enforcement decisions and help our community better understand how to follow our Community Guidelines. To further support creators, over the coming weeks we'll also roll out new features in the in-app Safety Center. These include an "Account status" page where creators can easily view the standing of their account, and a "Report records" page where creators can see the status of reports they've made on other content or accounts. These new tools add to the notifications creators already receive when they've violated our policies, and support creators' ability to appeal enforcements and have strikes removed if their appeal is valid. We'll also begin notifying creators if they're on the verge of having their account permanently removed.

Making consistent, transparent moderation decisions

As a separate step toward improving transparency about our moderation practices at the content level, we're also beginning to test a new feature in some markets that would show creators which of their videos have been marked ineligible for recommendation in the For You feed, explain why, and give them the opportunity to appeal.

Our updated account enforcement system is currently rolling out globally, and we'll notify community members as it becomes available to them. We will continue to evolve the processes we use to evaluate accounts, share our progress, and work to ensure accurate, nuanced enforcement decisions for accounts of all kinds across our platform.