At TikTok, we aim to build responsibly and equitably so our community can safely create, share, and enjoy creative and entertaining content on our app. We work to earn and maintain trust through ongoing transparency into the actions we take to safeguard our platform, because we know that simply saying "trust us" is not enough. That's why we made an early and important commitment to transparency, particularly in how we moderate and recommend content.

To fulfill this commitment, we've continually taken concrete actions to earn people's trust. To name a few examples:

  • Two years ago we launched our industry-leading Transparency and Accountability Centers, which raised the bar by giving experts access to our moderation practices and to information about our recommendation system. We've hosted hundreds of guests virtually since opening during the pandemic, and we look forward to opening our physical centers soon.
  • We established our US Content Advisory Council in 2020, and have since created additional Safety Advisory Councils in Europe, the Middle East and North Africa, Asia Pacific, Brazil, and Latin America. The councils are composed of independent, industry-leading experts in each region who advise on a range of content policies and safety strategies.
  • We recognize there's no one-size-fits-all solution when it comes to having a great TikTok experience, so we've continued to provide options for people to customize their preferences. For example, we recently announced a new way to filter out keywords and hashtags associated with content you don't want to see in your For You and Following feeds (a sketch of how this kind of filtering might work follows this list), and we'll roll out additional features and tools in the months ahead.
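
To make the mechanics concrete, here is a minimal sketch of how this kind of keyword-and-hashtag filtering could work. TikTok has not published its implementation, so the function names and data shapes below are illustrative assumptions only:

    # Illustrative sketch only: TikTok has not published its filtering
    # implementation, so these names and data shapes are assumptions.
    def tokens(caption: str) -> set[str]:
        # Lowercase the caption and strip "#" so hashtags and plain
        # words compare the same way.
        return {word.strip("#").lower() for word in caption.split()}

    def filter_feed(videos: list[dict], muted: set[str]) -> list[dict]:
        # Keep only videos whose captions share no terms with the
        # viewer's muted-keyword list.
        muted = {term.lower() for term in muted}
        return [v for v in videos if not (tokens(v["caption"]) & muted)]

    feed = [
        {"id": 1, "caption": "Weeknight pasta #dinnerideas"},
        {"id": 2, "caption": "Season finale reaction #spoiler"},
    ]
    print(filter_feed(feed, {"spoiler"}))  # keeps only video 1

A production system would presumably also need to handle misspellings, transcripts, and text overlaid on video, but the core idea is the same viewer-controlled check.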


We aim to surpass the high expectations our community and stakeholders rightly have for us so we can continue to serve everyone who creates, connects, and is entertained on TikTok. We've been listening to feedback from researchers, academics, and other experts, and today we're sharing new initiatives to strengthen the transparency and accountability of our platform.

  • Providing API access to research the TikTok platform. Today, researchers lack easy, reliable ways to identify and assess content and trends on TikTok or to run tests of our platform. We're developing a research API to give easier access to public, anonymized data about content and activity on our platform, and we plan to make it available to selected researchers later this year (see the first sketch after this list).
  • Providing API access to research our moderation system. In addition, we've developed a moderation system API, which we plan to make available this fall at our Transparency and Accountability Centers. It will give selected researchers an effective way to evaluate our content moderation systems and examine existing content on our platform. At the centers, researchers will also be able to upload their own content to see how different types of content are permitted, rejected, or passed to moderators for further evaluation (see the second sketch after this list).
  • Deepening information sharing with our Content and Safety Advisory Councils. The independent experts on our US Content Advisory Council and regional Safety Advisory Councils will also be granted API access, along with access to confidential information such as our keyword lists (which help us detect and flag potentially violative content), for deeper analysis. We don't make keyword lists public, to avoid giving a roadmap to bad actors who attempt to subvert our safeguards. While we have dedicated teams regularly stress-testing our processes and tools to ensure they're robust and effective, we know that perspectives and insights from outside experts can strengthen our approach.
  • Expanding our transparency reports with information about countering covert influence operations. Part of what makes TikTok unique is the overwhelming authenticity of our creators and their content. We don't allow activities that may undermine the integrity of our platform or the authenticity of our users. Going forward, in our quarterly Community Guidelines Enforcement Reports, we'll publish insights about the covert influence operations we identify and remove from our platform globally to show how seriously we take attempts to mislead our community.
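
Since the research API isn't available yet, here is a hypothetical sketch of what querying it for public, anonymized data might look like. The endpoint, parameters, and response fields are placeholders of our own invention, not a documented interface:

    import requests

    # Hypothetical placeholder URL: the research API described in the post
    # has not shipped, so nothing here reflects a documented interface.
    API_BASE = "https://research-api.example.com/v1"

    def query_public_videos(token: str, hashtag: str, limit: int = 50) -> list[dict]:
        # Request a sample of public, anonymized video records that
        # carry the given hashtag.
        resp = requests.get(
            f"{API_BASE}/videos",
            headers={"Authorization": f"Bearer {token}"},
            params={"hashtag": hashtag, "limit": limit},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()["videos"]

    # Example use: gauge how much public content a hashtag surfaces.
    videos = query_public_videos("YOUR_ACCESS_TOKEN", "science")
    print(f"{len(videos)} public videos returned")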
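Likewise, here is a hypothetical sketch of the moderation system API's upload-and-evaluate flow described above. The endpoint and field names are again assumptions; only the three possible outcomes come from the post itself:

    import requests

    # Placeholder endpoint: the moderation system API is not public, so
    # the URL and field names below are assumptions.
    API_BASE = "https://moderation-api.example.com/v1"

    def evaluate_content(token: str, video_path: str) -> str:
        # Upload a test video and return the moderation outcome. The post
        # describes three possible results: permitted, rejected, or passed
        # to a human moderator for further evaluation.
        with open(video_path, "rb") as f:
            resp = requests.post(
                f"{API_BASE}/evaluations",
                headers={"Authorization": f"Bearer {token}"},
                files={"video": f},
                timeout=60,
            )
        resp.raise_for_status()
        return resp.json()["decision"]  # "permitted" | "rejected" | "escalated"

    print(evaluate_content("YOUR_ACCESS_TOKEN", "test_clip.mp4"))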


These initiatives are well underway and will launch over the coming months. We'll share updates on our progress as we continue to innovate on transparency and accountability. If you have feedback or ideas you'd like us to consider, please email transparency [at] tiktok [dot] com.