By: Helena Lersch, Director of Public Policy, SEA and MENAT

TikTok is a diverse, global community fueled by creative expression. We work to maintain an environment where everyone feels safe to create, find community, and be entertained. We are committed to being transparent about how we keep our platform safe, because it helps build trust and understanding with our community. Today, we're taking another step toward fulfilling that commitment by releasing our fourth global Transparency Report.

This Transparency Report covers the second half of 2020 (July 1 - December 31) and provides visibility into the volume and nature of content removed for violating our Community Guidelines or Terms of Service, with additional insight into our work to counter misinformation related to COVID-19 and elections. It also includes how we respond to law enforcement requests for information, government requests for content removals, and intellectual property removal requests.

We've added a number of new data points to this report in our effort to be ever more transparent, including:

  • Accounts removed
  • Spam accounts and videos removed
  • Videos restored after an appeal by their creator
  • Specific policy insights
  • Ads rejected for violating our advertising policies

Here are some of the key insights from the report, which you can read in full here.


Community Guidelines enforcement overview

  • 89,132,938 videos were removed globally in the second half of 2020 for violating our Community Guidelines or Terms of Service, which is less than 1% of all videos uploaded on TikTok.
  • 92.4% of these videos were removed before a user reported them, 83.3% were removed before they received any views, and 93.5% were removed within 24 hours of being posted.
  • 6,144,040 accounts were removed for violating our Community Guidelines.
  • 9,499,881 spam accounts were removed along with 5,225,800 spam videos posted by those accounts. We prevented 173,246,894 accounts from being created through automated means.
  • 3,501,477 ads were rejected for violating advertising policies and guidelines.


Countering COVID-19 and vaccine misinformation

TikTok continues to work with public health experts to help our community stay safe and informed on COVID-19 and vaccines. We make public health information available throughout our app as we also work to counter misinformation. Here are some results from these efforts during the second half of 2020:

  • Our COVID-19 information hub was viewed 2,625,049,193 times.
  • Banners directing viewers to the COVID-19 information hub were added to 3,065,213 videos.
  • Public service announcements (PSAs) on hashtags directing users to the WHO and local public health resources were viewed 38,010,670,666 times.
  • 51,505 videos were removed for promoting COVID-19 misinformation. Of these videos, 86% were removed before they were reported to us, 87% were removed within 24 hours of being uploaded to TikTok, and 71% had zero views.


Maintaining platform integrity through the US 2020 elections

Though political videos make up a small share of the overall content on TikTok, and we don't accept paid political ads, we work to keep TikTok free of election misinformation while also providing access to authoritative information about civic processes. In the second half of 2020, we worked to safeguard the integrity of elections globally.

In the US, our team of safety, security, policy, and operations experts works each day to detect and stop the spread of election misinformation and other content that violates our Community Guidelines. Our teams are supported by automated technology that identifies and flags content for review, as well as by industry-leading threat intelligence platforms that escalate emerging content found across the internet and on our platform. Here are some of the results from this work in the second half of 2020:

  • Our 2020 US elections guide with authoritative information about voting, the elections, and results was visited 17,995,580 times.
  • PSAs on election-related hashtags reminded people to follow our Community Guidelines, verify information, and report content, and were viewed 73,776,375,496 times.
  • Because the majority of content people see on TikTok comes through their For You feed (which recommends videos regardless of when they were posted), banners were added to 6,978,395 election-related videos directing viewers to the elections guide for up-to-date information and results.
  • 347,225 videos were removed in the US for election misinformation, disinformation, or manipulated media.
  • We work with fact checkers to help us verify the accuracy of content and limit distribution of unsubstantiated content. As a result, 441,028 videos were not eligible for recommendation into anyone's For You feed.


What we think worked

  1. Our proportionate focus on both foreign and domestic threats to our platform and to overall election integrity during the US 2020 elections was the right approach. We started our election preparations in 2019 and built defenses based on industry learnings from the US 2016 elections, but we also prepared for more domestic activity based on trends we've observed in how misleading content is created and spread online.
  2. We made the right tooling investments, which allowed us to quickly and meaningfully reduce the discoverability of disinformation and terms of incitement. We moved quickly to redirect misleading hashtags such as #sharpiegate, #stopthesteal, and #patriotparty to our Community Guidelines instead of showing search results. This approach has also helped us combat QAnon content, though we must continually update our safeguards as content and terminology evolve.
  3. Prioritizing faster turnaround times for fact-checking helped us make informed and quick decisions on emerging content.
  4. Our investment in building relationships with a range of experts improved our overall approach to platform integrity, from policies to enforcement strategies to product experiences in our app.


What we can improve

  1. We will keep improving our systems to proactively detect and flag misleading content for review. For instance, we can immediately detect known disinformation using our disinformation hashbank, and we're working to advance our models so that we can better identify altered versions of known disinformation (a simplified sketch of this kind of hash matching follows this list).
  2. We will continue to develop our system that prevents repeat offenders from circumventing our enforcement decisions.
  3. More investment is needed to educate creators and brands on disclosure requirements for paid influencer content. TikTok does not allow paid political ads, including content that influencers are paid to create, and we expect our community to abide by our policies and FTC guidelines.
  4. We were proud of the in-app elections guide we developed with experts, and in future elections we will launch it earlier in the process.
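
To illustrate the hashbank idea mentioned in the first item above, here is a minimal, hypothetical sketch of matching uploads against a bank of fingerprints for previously confirmed disinformation. It is not TikTok's implementation: the names and data are illustrative only, and it uses an exact-match hash, whereas real systems typically rely on perceptual hashing so that re-encoded or lightly edited copies still match, which is exactly why altered versions of known disinformation remain harder to catch.

```python
import hashlib

def fingerprint(media_bytes: bytes) -> str:
    # Exact-match fingerprint for simplicity; production systems typically use
    # perceptual hashes so re-encoded or slightly altered copies still match.
    return hashlib.sha256(media_bytes).hexdigest()

# Hypothetical "hashbank": fingerprints of media already confirmed as disinformation.
known_disinfo_media = [
    b"toy bytes of confirmed disinfo video A",
    b"toy bytes of confirmed disinfo video B",
]
hash_bank = {fingerprint(m) for m in known_disinfo_media}

def flag_known_disinfo(upload: bytes) -> bool:
    """Return True if an upload matches a fingerprint already in the bank."""
    return fingerprint(upload) in hash_bank

# An exact re-upload is caught immediately; an altered copy is not, which is why
# better models are needed to identify modified versions of known disinformation.
print(flag_known_disinfo(b"toy bytes of confirmed disinfo video A"))  # True
print(flag_known_disinfo(b"lightly edited copy of video A"))          # False
```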


This is our most comprehensive report to date, and we're proud of the progress we've made in increasing transparency into our content moderation practices. We'll continue to listen to feedback and share our progress as we work to earn the trust of our community.