By Tracy Elizabeth, Global Minor Safety Policy Lead, TikTok
Every day, creators of all backgrounds come to TikTok to connect with a global community and bring joy to hundreds of millions of people around the world. From parents revealing their best tips (#momhack has 1.8B views) to Canadians sharing why they are receiving the COVID-19 vaccine, TikTok is home to a diverse range of communities who express themselves creatively and are entertained.
And every day, thousands of safety professionals at TikTok are tasked with the critical job of helping keep our community safe. One part of this work involves designing strategies and tools so that only those who are old enough have a TikTok account. In this post, we'll explain more about how we work to keep TikTok a place for people 13 and older and identify those who may have mistakenly registered on our platform.
Preventing underage people from signing up
TikTok has a 12+ rating in the App Store, which lets parents use device-level controls to block their teen from downloading the app. To help keep people who are not yet old enough from using TikTok, we've designed a neutral, industry-standard age gate that requires people to fill in their complete birthdate, discouraging them from simply clicking a pre-populated minimum age. If someone does not meet our minimum age requirement, we suspend their ability to re-create an account using a different date of birth.
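The neutral age gate described above can be sketched in a few lines: the full birthdate is collected and the age computed from it, rather than asking a yes/no "are you over 13?" question. This is an illustrative sketch only; the function names, the cutoff constant, and the date-handling are assumptions, not TikTok's implementation.

```python
from datetime import date

MIN_AGE = 13  # minimum age required to hold an account

def age_on(birthdate: date, today: date) -> int:
    """Return the person's age in whole years on the given day."""
    years = today.year - birthdate.year
    # Subtract a year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def passes_age_gate(birthdate: date, today: date) -> bool:
    """Neutral check: the complete birthdate is evaluated server-side."""
    return age_on(birthdate, today) >= MIN_AGE

# Someone born 1 Jan 2010 is 11 on 1 Jun 2021 and is rejected.
print(passes_age_gate(date(2010, 1, 1), date(2021, 6, 1)))  # False
print(passes_age_gate(date(2000, 1, 1), date(2021, 6, 1)))  # True
```

Because the form is pre-populated with no particular date, a child cannot simply accept a default that happens to clear the threshold; they would have to deliberately misstate their birthdate, which the follow-up enforcement described below is designed to catch.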
Removing suspected underage accounts
While most people understand the importance of being truthful about their age, some do not provide the correct information, a challenge many online services face. That's why our commitment to enforcing our minimum age requirement does not end at the age gate; we take a number of additional approaches to identify and remove suspected underage account holders. First, we train our safety moderation team to be alert to signs that an account may be used by a child under the age of 13. We also use other information provided by our users, such as keywords and in-app reports from our community, to help surface potential underage accounts. When our safety team believes that an account may belong to an underage person, the account is suspended.
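The surfacing step above combines several signals (moderator observations, community reports, keyword matches) before an account is escalated. A minimal sketch of such signal aggregation, under the assumption of made-up signal names, weights, and threshold that are purely illustrative and not TikTok's actual system:

```python
# Hypothetical signal weights; a real system would combine many more inputs.
SIGNAL_WEIGHTS = {
    "moderator_flag": 3,  # a trained safety moderator spotted signs of an under-13 user
    "in_app_report": 2,   # a community member reported the account as underage
    "keyword_match": 1,   # profile or caption text suggests a very young user
}

REVIEW_THRESHOLD = 3  # score at which the account is queued for human review

def review_score(signals: list[str]) -> int:
    """Sum the weights of the signals observed on an account."""
    return sum(SIGNAL_WEIGHTS.get(s, 0) for s in signals)

def needs_review(signals: list[str]) -> bool:
    return review_score(signals) >= REVIEW_THRESHOLD

print(needs_review(["keyword_match"]))                   # False: one weak signal alone
print(needs_review(["keyword_match", "in_app_report"]))  # True: signals accumulate
```

The design point the sketch illustrates is that no single weak signal triggers a suspension; signals accumulate until a human safety reviewer makes the final call.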
Bringing more transparency to our actions
In our last Transparency Report we published the number of accounts removed for violating our Community Guidelines. To bring more visibility to the actions we take to protect minors, we'll also begin sharing similar information about removals of suspected underage accounts, starting with our next report. We've continued to expand the information provided in these reports, such as publishing the volume of content reinstated after appeal, to help move the industry forward on transparency and accountability around user safety.
Creating an age-appropriate environment
In addition to our work to prevent underage people from using our service, we've introduced meaningful tools and policies designed to promote a safe and age-appropriate experience for teens 13-17. For example, when younger teens start using TikTok, we intentionally restrict access to some features, such as LIVE and Direct Messaging, and automatically set accounts of users ages 13-15 to private by default. These are deliberate decisions we've taken to protect younger members of the community as they start to build their online presence. We also aim to provide parents with resources they can use to have conversations about digital safety and decide on the experience that works best for their family, including our Family Pairing features and our new Guardian's Guide to TikTok.
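The age-banded defaults described above can be pictured as a simple mapping from age to account settings. Only the 13-15 behavior (private by default, restricted LIVE and Direct Messaging) comes from this post; the settings shown for older bands, and the setting names themselves, are hypothetical placeholders.

```python
def default_settings(age: int) -> dict:
    """Return illustrative default settings for a new account, by age band."""
    if age < 13:
        # Below the minimum age: no account may be created at all.
        raise ValueError("under minimum age")
    if age <= 15:
        # Younger teens: private by default, restricted features off (per the post).
        return {"account_private": True, "direct_messaging": False, "live": False}
    if age <= 17:
        # Older teens: hypothetical wider access, still age-appropriate.
        return {"account_private": False, "direct_messaging": True, "live": False}
    # Adults: hypothetical full feature access.
    return {"account_private": False, "direct_messaging": True, "live": True}

print(default_settings(14))
# {'account_private': True, 'direct_messaging': False, 'live': False}
```

Centralizing defaults in one function, rather than scattering age checks across features, makes it straightforward to audit and adjust the experience each age band receives.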
We take our responsibility to protect our community incredibly seriously. It is the most important work we do, and we will continue to innovate to keep our community safe. We look forward to sharing more about our work in the future. To learn more about the safety features and resources we offer teens and families, check out our Safety Centre.