By Cormac Keenan, Head of Trust & Safety, TikTok

At TikTok, we believe community should be built on a foundation of respect, kindness, and understanding. To help people forge positive digital connections within our rules for appropriate behavior, we strive to keep our community members in control of their interactions with others on TikTok. Today we're announcing updates that support the safety of our community and foster kindness on TikTok, and sharing the next installment of our Community Guidelines Enforcement Report.

Fostering authentic engagement in comments

Alongside our work to proactively remove abusive and hateful content or behavior that violates our Community Guidelines, we continue to explore new ways to help our community feel more in control of comments. We've started testing a way to let individuals identify comments they believe to be irrelevant or inappropriate. This community feedback will add to the range of factors we already use to keep the comment section consistently relevant and a place for genuine engagement. To avoid creating ill-feeling between community members or demoralizing creators, only the person who registered a dislike on a comment will be able to see that they have done so.
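
For illustration only, here is a minimal sketch of how a private dislike signal might be folded into a comment relevance score alongside other feedback. The signal names, weights, and scoring function are hypothetical assumptions for this sketch and are not TikTok's actual ranking logic.

```python
from dataclasses import dataclass


@dataclass
class CommentSignals:
    # Hypothetical illustrative signals, not TikTok's real feature set.
    likes: int
    dislikes: int      # private feedback; only the disliker sees their own dislike
    reports: int
    reply_count: int


def relevance_score(s: CommentSignals) -> float:
    """Toy relevance score: positive engagement minus negative feedback.

    The weights below are arbitrary placeholders chosen for illustration.
    """
    positive = 1.0 * s.likes + 0.5 * s.reply_count
    negative = 2.0 * s.dislikes + 5.0 * s.reports
    return positive - negative


# Example: a comment drawing heavy dislike and report feedback ranks lower.
print(relevance_score(CommentSignals(likes=10, dislikes=40, reports=2, reply_count=3)))
```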

New safety reminders for creators

To make it easier for our community to find and use the built-in safety tools we offer, we're experimenting with reminders that will guide creators to our comment filtering and bulk block and delete options. The reminders will be shown to creators whose videos appear to be receiving a high proportion of negative comments. We will continue to remove comments that violate our Community Guidelines, and creators can continue to report comments or accounts individually or in bulk for us to review.
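
As a rough illustration of the kind of trigger described above, the sketch below surfaces a reminder when the share of negative comments on a video crosses a threshold. The 30% cutoff and the keyword-based classifier are assumptions made for this sketch, not TikTok's actual criteria.

```python
# Hypothetical sketch: decide whether to surface a safety reminder to a creator.
NEGATIVE_SHARE_THRESHOLD = 0.30  # assumed cutoff, not an actual TikTok value


def should_show_safety_reminder(comments: list[str], is_negative) -> bool:
    """Return True if a high proportion of a video's comments look negative.

    `is_negative` is any callable that classifies a single comment; in practice
    this would be a trained model rather than the toy keyword check used below.
    """
    if not comments:
        return False
    negative = sum(1 for c in comments if is_negative(c))
    return negative / len(comments) >= NEGATIVE_SHARE_THRESHOLD


# Toy classifier for demonstration purposes only.
toy_negative = lambda c: any(w in c.lower() for w in ("hate", "ugly", "stupid"))

print(should_show_safety_reminder(["great video!", "so stupid", "hate this"], toy_negative))
```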

We will share updates on the outcome of these tests, and on whether the feature will roll out in full, in the coming weeks. This work is in addition to the many tools already available to our community. For example, community members can choose to filter all comments for manual review, or to filter comments that contain keywords they have selected. Creators can also select who can comment on their content: Everyone, Friends (followers they follow back), or No One. The 'Everyone' setting is not available to those under age 16.
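
The sketch below is a hypothetical representation of the creator-facing comment controls just described; the enum values, field names, and age check are assumptions for illustration, not TikTok's implementation.

```python
from dataclasses import dataclass, field
from enum import Enum


class WhoCanComment(Enum):
    EVERYONE = "everyone"   # not offered to accounts under age 16
    FRIENDS = "friends"     # followers the creator follows back
    NO_ONE = "no_one"


@dataclass
class CommentSettings:
    # Hypothetical settings object mirroring the controls described above.
    who_can_comment: WhoCanComment = WhoCanComment.FRIENDS
    filter_all_for_review: bool = False            # hold every comment for manual approval
    filtered_keywords: set[str] = field(default_factory=set)


def allowed_comment_options(age: int) -> list[WhoCanComment]:
    """Options shown to a creator; 'Everyone' is withheld below age 16."""
    options = [WhoCanComment.FRIENDS, WhoCanComment.NO_ONE]
    if age >= 16:
        options.insert(0, WhoCanComment.EVERYONE)
    return options


print([o.value for o in allowed_comment_options(15)])  # ['friends', 'no_one']
```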

Upholding our Community Guidelines

Another important way we protect the safety of our community is by removing content that violates our Community Guidelines, and today we released our Q4 2021 Community Guidelines Enforcement Report. This report reflects our ongoing commitment to earning the trust of our community by being accountable for keeping our platform safe and welcoming.

We've continued to expand the information we provide in each report. Since the start of 2021, we've added insight into the volume of content removed at zero views, accounts removed from the full TikTok experience on suspicion of belonging to people under the age of 13, and fake engagement. Starting with this report, we're providing information about content removals in more markets and about ongoing improvements to the systems that detect, flag, and, in some cases, remove violative content. These investments have helped meaningfully improve the speed at which we identify and remove violations of our harassment and bullying policies and our hateful behavior policies in particular.

From our first enforcement report in 2021 to this most recent report, we've steadily made progress on removing violations before they receive a single view. For instance, from Q1 to Q4 2021, removals of content at zero views improved by 14.7% for harassment and bullying content, 10.9% for hateful behavior, 16.2% for violent extremism, and 7.7% for suicide, self-harm, and dangerous acts.

There's no finish line when it comes to keeping people safe, and our latest report and continued safety improvements reflect our ongoing commitment to the safety and well-being of our community. We look forward to sharing more about our ongoing work to safeguard our platform.