By: Sandeep Grover, Head of Trust & Safety Product, & Mabel Wang, Head of Content and Creator Product, TikTok
Today we're announcing a new feature that lets people refresh their For You feed if their recommendations no longer feel relevant; it will roll out over the next few weeks in Singapore. We're also providing another progress update on our ongoing efforts to guard against repetitive recommendations as we continue to balance the goals of enabling self-expression and maintaining an enjoyable viewing experience.
The option to start fresh on TikTok
On TikTok, For You feeds help people discover a diversity of content, creators, communities, and products.
But we also understand that there are times when people's recommendations don't feel relevant anymore, or provide enough topical variety. So, we're rolling out a way to refresh For You feed recommendations if they no longer feel like they're for you. When enabled, this feature allows someone to view content on their For You feed as if they just signed up for TikTok. Our recommendation system will then begin to surface more content based on new interactions.
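Conceptually, a refresh discards the learned interaction signals that personalize recommendations while leaving explicit choices intact. The sketch below is purely illustrative; the class, field names, and reset behavior are our assumptions for explanation, not TikTok's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ViewerProfile:
    """Illustrative model of per-viewer personalization state."""
    interaction_history: list = field(default_factory=list)  # likes, watches, shares
    blocked_keywords: set = field(default_factory=set)       # explicit filter settings
    followed_accounts: set = field(default_factory=set)

def refresh_feed(profile: ViewerProfile) -> ViewerProfile:
    """Sketch of a feed refresh: clear the learned interaction signals so
    recommendations restart cold, while explicit settings and followed
    accounts are preserved. Purely a conceptual illustration."""
    profile.interaction_history.clear()
    return profile
```

After a refresh in this model, the recommender has no interaction history to draw on, so it behaves as it would for a new account until fresh interactions accumulate.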
This feature adds to a number of content controls our community already has to shape their experience. For example, people can choose to automatically filter out videos that use specific hashtags or phrases from their For You feeds, and say "not interested" to skip future videos from a particular creator or that use a particular sound. Enabling refresh won't override any settings you've already chosen to enable or impact accounts you've followed.
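The filtering controls described above can be sketched as a simple pass over candidate videos that drops anything matching a blocked keyword or a muted creator. The function and field names here are illustrative assumptions, not TikTok's actual API.

```python
def filter_feed(videos, blocked_keywords, not_interested_creators):
    """Drop candidate videos whose caption or hashtags contain a
    user-blocked keyword, or that come from a creator the viewer has
    marked 'not interested'. Field names are illustrative."""
    kept = []
    for video in videos:
        if video["creator"] in not_interested_creators:
            continue  # skip muted creators entirely
        text = (video["caption"] + " " + " ".join(video["hashtags"])).lower()
        if any(kw.lower() in text for kw in blocked_keywords):
            continue  # skip videos matching a blocked keyword or hashtag
        kept.append(video)
    return kept
```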
How we work to provide an enjoyable viewing experience
We're constantly working to provide an enjoyable viewing experience, and we take multiple approaches to help safeguard our platform by:
- Removing content we find that breaks our rules, such as content that promotes self-harm or disordered eating
- Making content that's not appropriate for a broad audience ineligible for recommendation into For You feeds
- Minimizing recommendations of topics that could have a negative impact if viewed repeatedly
- Filtering out content with complex or mature themes from teen accounts, powered by our Content Levels system. Using this system, more than 65,000 videos about cosmetic surgery were made ineligible for recommendation to teens in the first two months of this year
How we address repetitive patterns
An inherent challenge of any recommendation system is ensuring the breadth of content surfaced to a viewer isn't too narrow or too repetitive. We're intently focused on this challenge, and work to design a system that intersperses a variety of topics. For instance, viewers will generally not be served two videos in a row made by the same creator or that use the same sound, and we try to avoid showing people something they've seen before.
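One way to picture the dispersion rule above is as a reordering pass over ranked candidates that defers any video sharing a creator or sound with the previous pick. This is a minimal conceptual sketch under assumed field names, not TikTok's actual ranking code.

```python
from collections import deque

def disperse(videos):
    """Reorder candidate videos so that, where possible, no two
    consecutive items share a creator or a sound. The 'creator' and
    'sound' field names are illustrative assumptions."""
    result = []
    pending = deque(videos)
    while pending:
        for _ in range(len(pending)):
            video = pending.popleft()
            if result and (video["creator"] == result[-1]["creator"]
                           or video["sound"] == result[-1]["sound"]):
                pending.append(video)  # clashes with the last pick; defer it
            else:
                result.append(video)
                break
        else:
            # Every remaining video clashes with the last pick; accept one anyway
            result.append(pending.popleft())
    return result
```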
In addition, we work to carefully apply limits to some content that doesn't violate our policies but may impact the viewing experience if viewed repeatedly, particularly content with themes of sadness, extreme exercise or dieting, and sexually suggestive content.
We understand that people express themselves in all sorts of ways on TikTok – including when they're feeling down or are going through a difficult life experience. We routinely hear from experts that closing the door on this expression can increase feelings of isolation and stigmatization, and that enabling people to see how others cope with difficult emotions can be beneficial, especially for teens. With this in mind, our approach is to remove content that promotes or glorifies self-harm or otherwise violates our policies, while allowing recovery or educational content, with limits on how often such recovery or educational content is eligible for recommendation.
Our systems do this by looking for repetition among themes like sadness or extreme diets, within a set of videos that are eligible for recommendation. If multiple videos with these themes are identified, they will be substituted with videos about other topics to reduce the frequency of these recommendations and create a more diverse discovery experience. This work is ongoing, and over the last year alone, we've implemented over 15 updates to improve these systems, along with expanding to support more languages. Our trust and safety and product teams partner to drive this work, which is informed by academic literature and consultation with experts, such as the International Association for Suicide Prevention and the Digital Wellness Lab at Boston Children's Hospital. We'll continue these efforts as we strive to recommend a diversity of content to enable an enriching discovery experience. We are determined to provide both a welcoming space for self-expression and an enjoyable environment for our community.
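The substitution step described above can be sketched as a pass over the eligible feed that caps how many videos sharing a flagged theme appear, swapping any beyond the cap for alternates on other topics. The theme labels, cap, and function shape are all illustrative assumptions.

```python
def limit_theme_repetition(feed, flagged_themes, alternates, max_per_theme=1):
    """Replace repeated videos sharing a flagged theme (e.g. 'sadness',
    'extreme_diet') with alternate videos on other topics. The cap and
    field names are assumptions for illustration, not actual parameters."""
    counts = {}
    spares = list(alternates)
    diversified = []
    for video in feed:
        theme = video.get("theme")
        if theme in flagged_themes:
            counts[theme] = counts.get(theme, 0) + 1
            if counts[theme] > max_per_theme and spares:
                video = spares.pop(0)  # substitute a video about another topic
        diversified.append(video)
    return diversified
```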