By Eric Han, Head of Safety, TikTok US

TikTok is a platform that brings joy to millions of users, and we are committed to developing industry-leading policies and tools that also make it the safest platform for those users.

Our approach to user safety spans policies, practices, product, people, and partners. Today, we want to outline some of our protective measures in order to provide transparency on our practices and share our efforts to prevent and remove Child Sexual Abuse Material (CSAM). 

This area is challenging but critically important for our industry, and TikTok has a zero-tolerance policy for predatory or grooming behavior.

Policies

We prohibit content that depicts or disseminates child abuse, child nudity, or the sexual exploitation of children in digital or real-world form. Any such content is a direct violation of our Terms of Service and Community Guidelines, and if we become aware of it, we will take immediate action to remove the content, terminate accounts, and report cases to the National Center for Missing & Exploited Children (NCMEC) and law enforcement as appropriate.

Protecting against the threat of online child sexual exploitation and abuse is an issue that requires a global response, including collaboration between governments and industry and the sharing of skills and resources to support a safe online environment. With this in mind, we're committed to promoting and supporting the implementation of the Voluntary Principles to Counter Online Child Sexual Exploitation and Abuse, adopted by the Five Eyes alliance, a collaboration between intelligence agencies in the US, Canada, UK, Australia, and New Zealand. These Voluntary Principles outline a framework that can be applied consistently across sectors and services to respond to changing societal and offending behaviors and reduce risks for users. This is one of our many efforts to protect children both online and off.

People

Our global safety teams comprise experienced industry professionals whose backgrounds span product, policy, compliance, child safety, law, privacy, and NGOs. Regional Trust & Safety hubs in California, Dublin, and Singapore oversee the development and execution of moderation policies and work to localize them for each market. The hubs focus on strengthening policies, technologies, and moderation strategies and on ensuring that they complement local culture and context. Our Trust & Safety leaders collaborate closely with regional regulators, policymakers, governments, and law enforcement agencies in pursuit of the highest possible standard of user safety.

Practices

We employ both human moderation and machine-based tools, such as photo-identification technologies aligned with industry standards, to identify and remove exploitative content. We also filter red-flag language and share information with NCMEC about situations that may indicate grooming behavior, in accordance with its policies and industry norms.
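
To give a sense of how these two building blocks fit together, here is a minimal sketch. It assumes exact file hashing and a toy keyword list purely for illustration; production systems rely on perceptual image hashing, industry-maintained hash lists, and far richer signals, and nothing below reflects TikTok's actual implementation.

    # Simplified, hypothetical sketch of two moderation building blocks
    # mentioned above: matching uploaded images against hashes of known
    # violating content, and flagging text that contains red-flag language.
    # Names, lists, and routing labels are illustrative assumptions.

    import hashlib

    # In practice these would be loaded from industry-maintained, curated lists.
    KNOWN_VIOLATION_HASHES: set[str] = set()
    RED_FLAG_TERMS: set[str] = {"example red-flag phrase"}

    def image_matches_known_content(image_bytes: bytes) -> bool:
        """Return True if the image's hash appears on the known-content list."""
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_VIOLATION_HASHES

    def contains_red_flag_language(text: str) -> bool:
        """Return True if the text contains any red-flag term."""
        lowered = text.lower()
        return any(term in lowered for term in RED_FLAG_TERMS)

    def review_content(image_bytes: bytes, text: str) -> str:
        """Route content: remove and report known matches, escalate suspicious text."""
        if image_matches_known_content(image_bytes):
            return "remove_and_report"            # e.g. report to NCMEC per policy
        if contains_red_flag_language(text):
            return "escalate_to_human_review"
        return "allow"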

Product

TikTok is an app for users 13 and over. Along with an age gate, we've given the app a 12+ App Store rating so that parents can enable device-level restrictions on their child's phone. In the US, we accommodate users under the age of 13 with a limited app experience that introduces additional safety and privacy protections designed specifically for a younger audience, in line with the industry practice for mixed-audience apps of splitting users into age-appropriate environments.
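
A minimal sketch of how an age gate can route a new user into an age-appropriate experience follows; the cutoff age, experience names, and function names are illustrative assumptions, not TikTok's actual logic.

    # Hypothetical age-gate sketch: route users 13 and over to the standard
    # experience, and younger US users to a limited, privacy-protective one.

    from datetime import date

    def age_in_years(birth_date: date, today: date) -> int:
        """Whole years elapsed between birth_date and today."""
        had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
        return today.year - birth_date.year - (0 if had_birthday else 1)

    def choose_experience(birth_date: date, today: date) -> str:
        """Pick the app experience based on the user's age at sign-up."""
        return "standard" if age_in_years(birth_date, today) >= 13 else "limited_under_13"

    # Example: a 10-year-old is routed to the limited experience.
    assert choose_experience(date(2010, 5, 1), date(2020, 8, 5)) == "limited_under_13"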

Unlike other platforms, we don't permit images or videos to be sent in comments or messages. This was a deliberate decision on our part: studies have shown that the spread of CSAM is closely linked to messaging. TikTok was built to provide a positive place for creativity, and because we prioritize the safety of our users, we chose from the very beginning not to allow users to upload photos or videos to their messages.
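
In practice, this design choice amounts to validating every outgoing message before delivery; the sketch below illustrates the idea under assumed type and field names, which are not TikTok's actual internals.

    # Hypothetical sketch: direct messages carry text only, so any attempt to
    # attach an image or video is rejected before delivery.

    from dataclasses import dataclass, field

    @dataclass
    class DraftMessage:
        text: str
        attachments: list = field(default_factory=list)  # e.g. media file references

    def is_deliverable(draft: DraftMessage) -> bool:
        """Accept a message only if it carries no image or video attachments."""
        return len(draft.attachments) == 0

    assert is_deliverable(DraftMessage(text="hi"))
    assert not is_deliverable(DraftMessage(text="hi", attachments=["clip.mp4"]))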

At TikTok, we do not want bad actors in our ecosystem, and neither do our users, which is why we encourage our community to report content or accounts that may be in violation of our guidelines. In addition to reporting, we've built numerous controls into the app for users and families, such as the ability to make an account private, restrict who can engage with content, filter comments, disable messages, or block another user.
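
These controls can be thought of as per-account settings that gate every interaction; the sketch below shows one way such settings might be modeled, with field names and defaults that are illustrative assumptions only.

    # Hypothetical sketch of per-account safety controls like those listed
    # above: private account, comment filtering, disabled messages, blocked users.

    from dataclasses import dataclass, field

    @dataclass
    class SafetySettings:
        private_account: bool = False
        filter_comments: bool = False
        allow_direct_messages: bool = True
        blocked_users: set = field(default_factory=set)

    def can_message(sender_id: str, recipient: SafetySettings) -> bool:
        """A sender can message the recipient only if messages are enabled and
        the sender has not been blocked."""
        return recipient.allow_direct_messages and sender_id not in recipient.blocked_users

    # Example: disabling messages blocks delivery regardless of who is sending.
    locked_down = SafetySettings(private_account=True, allow_direct_messages=False)
    assert not can_message("any_user", locked_down)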

Our Safety Center and educational safety videos provide users with detailed instructions on how to enable TikTok's tools and controls.

Partners

We are committed to building a positive environment for our users while protecting against industry-wide challenges around platform misuse. To that end, we work with leading organizations, including the Family Online Safety Institute, ConnectSafely, and the Internet Watch Foundation, to help ensure that our policies, technology, privacy controls, and user education continue to promote a safe and welcoming environment for our community.

While we would like to share our full approach, we are mindful, as an industry, that doing so risks tipping off would-be bad actors and helping them circumvent our measures. There is no finish line with safety at TikTok. We work each day to ensure that we continue to learn, adapt, and grow our policies and practices to keep our community safe.