January 16, 2026 · Safety

Updates to how we enforce age-appropriate experiences in Europe

Across the world, many organisations, policymakers, and companies are continuing to try to solve the complex challenge of online age assurance. Despite best efforts, there remains no globally agreed-upon method for effectively confirming a person's age in a way that also preserves their privacy. At TikTok, we're committed to keeping children under the age of 13 off our platform, providing teens with age-appropriate experiences, and continuing to assess and implement a range of solutions. We believe that a multi-layered approach to age assurance - one in which multiple techniques are used - is essential to protecting teens and upholding safety-by-design principles.

Multiple checks help to make sure an account isn't used by someone underage

The minimum age to use TikTok is 13, and we already take a multi-layered approach to detect and confirm when people may not have provided their correct date of birth.

  1. When people create an account, they must add their birthdate. This process, referred to as age gating, is neutral: we don't hint or nudge people towards entering the 'right' age. If someone fails to meet our minimum age, we prevent them from immediately re-creating an account with a different date of birth.
  2. Our age gate is just one of our multiple checks. We also use technology that looks at information, often called 'signals', to check for indicators that someone may not meet our minimum age requirement.
  3. We train our moderation teams to be alert to signs that an account may be used by a child under the age of 13. If they're reviewing content for another reason but suspect an account belongs to an underage user, they can send it to our specialised review team with deeper expertise on age assurance.
  4. We allow anyone to report an account they believe belongs to someone under 13. A TikTok account is not required to make a report.

By combining these age detection methods, we remove around 6 million underage accounts globally every single month.
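Purely as an illustration, and not a description of TikTok's actual systems, the layered checks above can be sketched as a single routing decision: a declared birthdate below the minimum blocks the account at the neutral age gate, while behavioural signals, moderator flags, or user reports queue the account for specialist human review rather than triggering an automatic ban. All names and signal values below are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

MINIMUM_AGE = 13  # the stated minimum age to use the platform

@dataclass
class Account:
    birthdate: date
    signals: list[str] = field(default_factory=list)  # hypothetical behavioural indicators
    user_reports: int = 0                             # reports filed by other people

def age_on(birthdate: date, today: date) -> int:
    """Whole years elapsed between birthdate and today."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def route_account(account: Account, today: date) -> str:
    """Combine the layered checks into one routing decision."""
    # Layer 1: neutral age gate -- the declared birthdate alone can block sign-up.
    if age_on(account.birthdate, today) < MINIMUM_AGE:
        return "blocked_at_age_gate"
    # Layers 2-4: signals, moderator flags, and user reports all funnel the
    # account to specialist human review rather than to an automatic ban.
    if account.signals or account.user_reports > 0:
        return "send_to_specialist_review"
    return "no_action"
```

The key design point mirrored here is that only the declared birthdate acts automatically; every other layer escalates to a human reviewer.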

Utilising enhanced technology to support underage detection in Europe

In the coming weeks, we will begin to roll out enhanced technology in Europe* to further support how our moderation teams detect and remove accounts that belong to someone under the age of 13. This follows an initial pilot in Europe* over the last year, which led to the removal of thousands of additional underage accounts. To make this prediction, our technology uses information an account holder provides about themselves, such as their profile information, the videos they publish, and other on-platform behaviour.

When our technology identifies that an account may belong to someone under 13, a specialist moderator reviews it and decides whether it should be banned. As is the case today, anyone can submit an appeal if they believe an error was made. We offer a range of methods that enable people to confirm their age during the appeal process. These include:

  • Facial age estimation provided by Yoti.
  • Credit card authorisation.
  • Government-approved identification.
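As a hypothetical sketch of the flow just described, the account states and transitions can be modelled as a tiny state machine: a prediction alone never bans an account, it only queues human review; only a moderator decision can ban; and a successful appeal through one of the listed methods restores the account. The state names and method strings are illustrative, not TikTok's internal terms.

```python
from enum import Enum, auto

class Status(Enum):
    ACTIVE = auto()
    UNDER_REVIEW = auto()
    BANNED = auto()
    RESTORED = auto()

# Hypothetical labels mirroring the appeal methods listed above.
APPEAL_METHODS = {"facial_age_estimation", "credit_card_authorisation", "government_id"}

def on_underage_prediction(status: Status) -> Status:
    # The model's prediction only queues human review; it never bans directly.
    return Status.UNDER_REVIEW if status is Status.ACTIVE else status

def on_moderator_decision(status: Status, is_underage: bool) -> Status:
    # Only a specialist moderator can move a reviewed account to banned.
    if status is not Status.UNDER_REVIEW:
        return status
    return Status.BANNED if is_underage else Status.ACTIVE

def on_appeal(status: Status, method: str, age_confirmed: bool) -> Status:
    # A banned account can be restored via one of the accepted appeal methods.
    if status is Status.BANNED and method in APPEAL_METHODS and age_confirmed:
        return Status.RESTORED
    return status
```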


Prioritising safety and privacy

Protecting people's privacy has been central to our work on this project. By integrating data protection principles into the technology's design from the outset, we ensure that the prediction of whether someone is likely under the age of 13 is used only to decide whether to send an account to human moderators and to monitor and improve the technology. This approach allows us to deliver safety for teens in a privacy-preserving manner. We have also consulted extensively with TikTok's lead privacy regulator in the EU, the Data Protection Commission (DPC), which enabled us to listen to feedback and help ensure compliance with Europe's high data protection standards. Members of our community in Europe will also receive a notification letting them know that this technology is launching, and they will be able to learn more about it.

Enforcing age-appropriate experiences

For teens who have passed our age checks, we are committed to ensuring that they have an age-appropriate experience. That's why teen accounts on TikTok have more than 50 preset safety, privacy, and security features and settings automatically turned on. For instance, people must be at least 16 years old to use our direct messaging feature. Every teen under 18 has a screen time limit set to 60 minutes by default and won't receive notifications after bedtime. Additionally, content that we identify as not suitable for teens is filtered from their experience.

To help ensure that teens are in the right experience for their age, we also use technologies that help predict whether a person falls within a certain age range (e.g., 13-15). If the estimate doesn't match the date of birth provided at sign-up, a moderator can review the account and place it into the correct age experience.
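To illustrate the mismatch check described above under stated assumptions (the age bands and band boundaries here are hypothetical examples, not TikTok's actual configuration), an account is flagged for moderator review when the age implied by its declared date of birth falls outside the estimated age range:

```python
# Hypothetical age bands used to place accounts into an experience.
AGE_BANDS = [(13, 15), (16, 17), (18, 120)]

def band_for(age: int):
    """Return the band containing this age, or None if out of range."""
    for low, high in AGE_BANDS:
        if low <= age <= high:
            return (low, high)
    return None

def needs_review(declared_age: int, estimated_band: tuple) -> bool:
    """Flag accounts whose declared age falls outside the estimated band."""
    return band_for(declared_age) != estimated_band
```

A flagged account goes to a human moderator, who can then place it into the correct age experience; the estimate alone does not change the account.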

We take our responsibility to protect our community, and teens in particular, incredibly seriously. It is the most important work we do, and we will continue to innovate to keep our community safe. To learn more about the safety features and resources we offer teens and families, check out our Safety Centre.

*This includes the EEA, Switzerland, and the UK.