February 27, 2026 · Safety

Digital Services Act: Our sixth transparency report on content moderation in Europe

With 178 million people in the EU now coming to TikTok every month, keeping our community safe continues to be our top priority. Since 2019, we have been publishing voluntary transparency reports about how we enforce our policies and keep our users safe.

Building on these efforts, and in line with our obligations under the Digital Services Act (DSA), we are now releasing our sixth DSA transparency report, covering the second half of 2025.

This report details our ongoing content moderation efforts, including violative content removals, illegal content reporting, and user appeals. As part of our commitment to enhanced transparency, we are now expanding our reporting to also include enforcement volumes on comments.

Key insights from this latest report include:

  • Enforcing our rules by removing violative content

We have a number of terms and policies designed to keep users safe from illegal and other harmful content, including our Terms of Service, Community Guidelines, Advertising Policies and TikTok Shop Policies.

Between July and December 2025, we removed around 112 million pieces of content that violated these terms and policies. This total includes videos, LIVE streams, ads, product listings and, for the first time, comments.

  • Automation is increasingly effective at improving speed and consistency of moderation at scale

Automated systems actioned 93.8% of all violating content without human review. These decisions maintained a high degree of precision, with 97.6% of automated enforcement decisions confirmed as correct.

There is no finish line when it comes to keeping our community safe on our platform. With each report, we reinforce our commitment to transparency and accountability, while continually investing in technologies that help us detect content that violates our policies faster and minimise its potential to be viewed.