By Cormac Keenan, Head of Trust and Safety

TikTok's diverse community in Singapore and across the world transcends generations, spanning from teens to grandparents and everyone in between. We build with these different audiences in mind by limiting features by age, empowering our community with content controls, and supporting families with parental controls. Today, we're announcing new features and technologies that will help viewers customize their viewing preferences and continue to have a safe and entertaining experience on TikTok.

A new tool to customize content

Part of what makes the TikTok experience unique is the ability for people to discover new interests, creators, and ideas. We design our recommendation system with safety in mind, since content in someone's For You feed may come from a creator they don't follow or relate to an interest they don't share. For instance, certain categories of content may be ineligible for recommendation, and viewers can use our "not interested" feature to automatically skip similar videos in the future.

To further empower viewers to customize their viewing experience, we're rolling out a tool people can use to automatically filter out videos with words or hashtags they don't want to see in their For You or Following feeds, whether because they've just finished a home project and no longer want DIY tutorials, or because they want to see fewer dairy or meat recipes as they move to more plant-based meals. This feature will be available to everyone in the coming weeks.
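To illustrate roughly how such a filter could behave, here is a simplified Python sketch that drops any candidate video whose caption or hashtags match a muted word or hashtag. This is a hypothetical example only, not TikTok's implementation; the Video structure, filter_feed function, and matching rules are all assumptions made for illustration.

    # Hypothetical sketch of a keyword/hashtag feed filter (not TikTok's actual code).
    from dataclasses import dataclass, field

    @dataclass
    class Video:
        video_id: str
        caption: str
        hashtags: list = field(default_factory=list)

    def filter_feed(candidates, muted_terms):
        """Drop videos whose caption or hashtags contain any muted word or hashtag."""
        muted = {t.lstrip("#").lower() for t in muted_terms}
        kept = []
        for video in candidates:
            caption_words = {w.lstrip("#").lower() for w in video.caption.split()}
            tags = {t.lstrip("#").lower() for t in video.hashtags}
            if muted & (caption_words | tags):
                continue  # skip videos matching a muted term
            kept.append(video)
        return kept

    # Example: mute DIY content after finishing a home project.
    feed = [
        Video("1", "Easy #DIY shelf build", ["diy", "homeimprovement"]),
        Video("2", "Plant-based ramen recipe", ["vegan", "recipe"]),
    ]
    print([v.video_id for v in filter_feed(feed, ["#DIY"])])  # -> ['2']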

An update on our efforts to diversify recommendations

We want to play a positive role in the lives of the people who use our app, and we're committed to fostering an environment where people can express themselves on a variety of topics, while also protecting against potentially challenging or triggering viewing experiences. Last year, we began testing ways to avoid recommending a series of similar videos on topics that may be fine as a single video but potentially problematic if viewed repeatedly, such as dieting, extreme fitness, sadness, and other well-being themes. We've also been testing ways to recognize if our system may inadvertently be recommending a narrower range of content to a viewer.

As a result of our tests and iteration in the US, we've improved the viewing experience so viewers now see fewer videos about these topics at a time. We're still iterating on this work given the nuances involved. For example, some types of content may have both encouraging and sad themes, such as disordered eating recovery content. We're also training our systems to support new languages as we look to expand these tests to more markets in the coming months. Our aim is for each person's For You feed to feature a breadth of content, creators, and topics they'll love.
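For illustration only, one simple way such dispersion could work is to re-rank a feed so that videos on the same topic do not appear too many times in a row. The Python sketch below is a hypothetical example; the topic labels, the cap on consecutive videos, and the disperse_topics function are assumptions, not TikTok's actual recommendation logic.

    # Hypothetical sketch: cap how many videos on one topic appear consecutively.
    def disperse_topics(ranked_videos, max_consecutive=1):
        """Reorder a ranked list so no topic appears more than max_consecutive
        times in a row, preserving the original ranking otherwise."""
        result, deferred = [], []
        for video in ranked_videos:
            pool = deferred + [video]
            deferred = []
            for candidate in pool:
                recent = result[-max_consecutive:]
                if len(recent) == max_consecutive and all(
                    v["topic"] == candidate["topic"] for v in recent
                ):
                    deferred.append(candidate)  # too many in a row; try again later
                else:
                    result.append(candidate)
        return result + deferred  # leftovers go at the end, even if similar

    feed = [
        {"id": 1, "topic": "fitness"},
        {"id": 2, "topic": "fitness"},
        {"id": 3, "topic": "cooking"},
        {"id": 4, "topic": "fitness"},
    ]
    print([v["id"] for v in disperse_topics(feed)])  # -> [1, 3, 2, 4]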

Introducing Content Levels to help further safeguard the viewing experience

Whether people come to TikTok to search for recipe inspiration, have a laugh after a long day at work, or discover new interests through their For You feed, we continually work to create a safe and enjoyable viewing experience. Our Community Guidelines are central to this by setting the standards for what is and is not allowed on our platform. Within these strict policies, we understand that people may want to avoid certain categories of content based on their personal preferences. Some content may also contain mature or complex themes, reflecting personal experiences or real-world events, that are intended for older audiences rather than our teenage community members.

Recognising this, we are working to build a new system to organise content based on thematic maturity. Many people will be familiar with similar systems from film, television, or gaming, and we are designing ours with those in mind while also knowing we need to develop an approach unique to TikTok.

In the coming weeks, we'll begin to introduce an early version worldwide, including in Singapore, to help prevent content with overtly mature themes from reaching audiences under the age of 18. When we detect that a video contains mature or complex themes, for example fictional scenes that may be too frightening or intense for younger audiences, a maturity level will be allocated to the video to help prevent those under 18 from viewing it across the TikTok experience. We have focused first on further safeguarding the teen experience, and in the coming months we plan to add new functionality that gives our entire community detailed content filtering options so they can enjoy more of what they love.
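As a rough, hypothetical illustration of how such gating could work, the Python sketch below checks a video's assigned maturity level against a viewer's age before allowing the video to be shown. The level names, age threshold, and can_view function are placeholders for illustration, not TikTok's actual Content Levels system.

    # Hypothetical sketch of age-gating by maturity level (not TikTok's actual code).
    MATURITY_MIN_AGE = {
        "general": 0,
        "mature_themes": 18,  # e.g. fictional scenes too intense for younger audiences
    }

    def can_view(viewer_age, video_maturity_level):
        """Return True if the viewer meets the minimum age for the video's level."""
        # Unknown levels default to the strictest threshold.
        min_age = MATURITY_MIN_AGE.get(video_maturity_level, 18)
        return viewer_age >= min_age

    print(can_view(16, "mature_themes"))  # False: withheld from under-18 viewers
    print(can_view(16, "general"))        # True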

Building for the needs of our global community

As we continue to build and improve these systems, we're excited about the opportunity to help address long-running, industry-wide challenges around building for a variety of audiences and working with recommender systems. We also acknowledge that what we're striving to achieve is complex, and we may make some mistakes. Our focus remains on creating the safest and most enjoyable experience for our community, and we will continue to listen to their feedback and to consult with independent experts, including our Asia-Pacific Safety Advisory Council.