Cormac Keenan, Head of Trust and Safety, TikTok

This article is part of a series about the fundamentals of safety, focusing on some of the behind-the-scenes work of TikTok's safety teams.

At TikTok, thousands of people work to keep our platform a safe place for our community to explore entertaining content and share their creativity. Our Trust and Safety team carries out a wide range of tasks to protect our community, and as we continue on our safety journey, we want to be open and transparent along the way. Here, we'll talk through some of that work.

Creating and updating our Community Guidelines

Our Community Guidelines define a set of norms and a common code of conduct for TikTok; they provide guidance on what is and isn't allowed in order to maintain a welcoming space. Our policy experts continually assess these guidelines, considering how we can enable creative expression while protecting against potential harms.

These experts are often subject matter specialists - some bring experience from the technology industry, while others join us from civil society or government. Their roles involve grappling with complex and challenging questions - for example, where should we draw the line on content related to eating disorders? From our work with outside experts, we know that eating disorder-related content can be damaging, but, crucially, content that focuses on recovery can have a positive impact. As we refine our policies, it's important that we continue to reflect nuances such as this.

Detecting potential harms and enforcing our policies

Product and Process teams design strategies and techniques to more efficiently detect potential harms and enforce our Community Guidelines at scale. One example is how we use 'hashing' technology to create a unique digital identifier, or fingerprint, of an image or video. In line with industry standards, this enables us to mark a known harmful piece of content as violating and remove it at scale. For example, if content is removed for breaking our policies that protect against the sharing of child sexual exploitation images, the unique identifier helps us find and remove matching content and prevent repeated uploads.
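
To illustrate the principle in simplified form - this is an illustrative sketch, not our production system, and real deployments typically use perceptual hashes such as PhotoDNA or PDQ, which survive re-encoding and small edits, rather than a cryptographic hash - the Python below checks an upload's fingerprint against a hypothetical database of hashes from previously removed content:

```python
import hashlib

# Hypothetical store of fingerprints from content previously removed
# for violating policy. A real system would hold perceptual hashes.
KNOWN_VIOLATING_HASHES: set[str] = {
    "3f5a...",  # placeholder entry for illustration only
}

def fingerprint(content: bytes) -> str:
    """Create a unique digital identifier for a piece of content."""
    return hashlib.sha256(content).hexdigest()

def is_known_violation(content: bytes) -> bool:
    """Check an upload against fingerprints of removed content."""
    return fingerprint(content) in KNOWN_VIOLATING_HASHES

def handle_upload(content: bytes) -> str:
    """Block re-uploads that match previously removed content."""
    if is_known_violation(content):
        return "rejected"
    return "accepted"
```

Because every re-upload of the same file produces the same fingerprint, a single match entry is enough to block repeated attempts to share it.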

As we explain in our latest Transparency Report, more than 80% of violating videos were removed before they received a single view, and we're committed to continuing to improve our effectiveness in this area. While technology can remove clear-cut violations, an important part of content moderation involves human review. No matter the time of day, if content is reported, our teams are on standby to take action. This additional layer of human review also improves our machine learning systems: moderators provide feedback to the technology, helping to capture emerging content trends and strengthen our future detection capabilities.
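
As a simplified sketch of this feedback loop - the names and structure here are illustrative, not our actual systems - moderator decisions that disagree with the model's prediction can be collected as labelled examples for retraining:

```python
from dataclasses import dataclass

@dataclass
class ModeratorDecision:
    content_id: str
    model_prediction: str  # label the automated system assigned
    moderator_label: str   # label the human reviewer assigned

# Hypothetical queue of labelled examples used to retrain detection models.
training_queue: list[ModeratorDecision] = []

def record_decision(decision: ModeratorDecision) -> None:
    """Feed a human review outcome back into the learning loop.

    Disagreements between the model and the reviewer are the most
    informative examples: they often signal emerging content trends
    the current model has not yet learned.
    """
    if decision.moderator_label != decision.model_prediction:
        training_queue.append(decision)
```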

Another essential role of our team is to identify and analyse content to continuously improve the accuracy of our content moderation processes. For example, if a video is gaining popularity, our systems may review it again to reduce the chance that violating content remains on TikTok. We also analyse content moderation decisions to understand why violating content may not have been caught earlier and to identify trends in violating content on our platform. We might learn that we need to develop technology to automatically detect certain types of potential violations; on other occasions, we might find that our moderation teams need additional targeted training on certain policies and their nuances, so that they make correct decisions during review.
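
In simplified form - the threshold and function below are hypothetical, chosen only to illustrate the idea of re-reviewing rising content - such a trigger might look like this:

```python
# Hypothetical threshold: once a video's audience grows past this point,
# it is queued for another review pass, since the cost of a missed
# violation rises with reach.
REREVIEW_VIEW_THRESHOLD = 10_000

def should_rereview(view_count: int, already_rereviewed: bool) -> bool:
    """Decide whether a rising video should go back through review."""
    return view_count >= REREVIEW_VIEW_THRESHOLD and not already_rereviewed
```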

Focusing on employee wellbeing

Building and maintaining a safe experience for our community is our team's most important role. At times, this means our moderators may be required to review potentially harmful content, which makes providing the right support essential. We recognise this, and we prioritise the health, safety, and wellbeing of our people. We provide our teams with access to wellbeing programs, including training, evidence-based resources, and professional counselling. We also conduct regular analysis to understand how we can continue to improve, and it's our hope that we can lead our industry by providing the most ambitious and effective support structures for our people.

As we continue on our journey to help make TikTok a safe place where joy and creativity can thrive, we're looking forward to sharing more about the work of our Trust and Safety teams.