By Michael Beckerman, Vice President, Head of US Public Policy, TikTok

At TikTok, we believe that accountability and transparency are essential to building trust with our community – and we're committed to leading the way when it comes to being transparent in how we operate, moderate and recommend content, and secure our platform. That's why we opened our global Transparency and Accountability Centers for experts and lawmakers to see first-hand how we're working to build a safe and secure platform for our growing and diverse community.

Because of physical constraints during the ongoing COVID-19 pandemic, opening the centers has meant evolving our original plans: we're offering a virtual experience until we can welcome guests for the full tour at our physical locations in Los Angeles and Washington, D.C. To date, nearly two dozen experts and Congressional offices have visited virtually, allowing them to learn about and ask questions on our safety and security practices.

Content moderation

In our virtual experience we provide extensive detail on how our moderators review content and accounts that are escalated via user reports and technology-based flagging. This includes walking visitors through our safety classifiers and deep learning models that work to proactively identify harmful content and our decision engine that ranks potentially violating content to help moderation teams review the most urgent content first. We also demonstrate our object detection models that flag things like hate symbols to our human moderators for further review. At our physical centers, guests will be able to sit in the seat of a content moderator, use our moderation platform, review and label sample content, and experiment with various detection models. 
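To make the idea of a decision engine concrete, here is a minimal illustrative sketch of how flagged content might be ranked so moderators see the most urgent items first. The field names, the `harm_score`/`reach` inputs, and the urgency heuristic are all assumptions for illustration – this is not TikTok's actual system.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class FlaggedItem:
    # Stored with negated urgency so the min-heap pops the most urgent first.
    priority: float
    item_id: str = field(compare=False)

def rank_for_review(flags):
    """Order flagged content for human review, most urgent first.

    `flags` is a list of (item_id, harm_score, reach) tuples, where
    harm_score is an illustrative classifier confidence in [0, 1] and
    reach is an illustrative audience-size estimate.
    """
    heap = []
    for item_id, harm_score, reach in flags:
        # Simple heuristic: likely-harmful content with a large potential
        # audience goes to the front of the review queue.
        urgency = harm_score * (1 + reach)
        heapq.heappush(heap, FlaggedItem(-urgency, item_id))
    return [heapq.heappop(heap).item_id for _ in range(len(heap))]
```

In practice a production system would combine many more signals (violation category, account history, virality trajectory), but the queueing principle is the same: score, then pop in priority order.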


Recommendation system

During the virtual experience, we also share details about how our recommendation system operates and how we weigh user experience and safety when building recommendations. This includes explaining how our systems work to diversify content in the For You feed, along with additional insight into how content is tailored to each user's preferences. Finally, we review some of our efforts to address the common challenges of recommendation engines.
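One common way recommendation engines diversify a feed is to re-rank a preference-ordered candidate list so that similar items don't appear back-to-back. The sketch below shows that general technique with a simple greedy pass; the function, its inputs, and the single-category model are illustrative assumptions, not a description of TikTok's actual algorithm.

```python
def diversify(ranked_items, max_run=1):
    """Greedy re-ranking that avoids repeating a category back-to-back.

    `ranked_items` is a list of (item_id, category) pairs already ordered
    by predicted preference; `max_run` is the longest allowed streak of
    one category. Both names are illustrative assumptions.
    """
    remaining = list(ranked_items)
    out = []
    while remaining:
        recent_categories = [cat for _, cat in out[-max_run:]]
        # Take the highest-ranked item that breaks the current streak;
        # fall back to the top remaining item if every candidate matches.
        pick = next(
            (item for item in remaining if item[1] not in recent_categories),
            remaining[0],
        )
        remaining.remove(pick)
        out.append(pick)
    return out
```

Real systems balance diversity against relevance with richer similarity measures, but the trade-off is the same: occasionally demote the single best-scoring item to keep the overall feed varied.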

Privacy & Security 

Under the leadership of our Chief Security Officer, Roland Cloutier – who has decades of experience in law enforcement and the financial services industry – our teams work to protect our community's information and stay ahead of evolving security challenges. Our virtual experience shows visitors how we store TikTok user data in the US and safeguard that data from hackers and other threats with encryption and the latest technology. We also detail how we work with industry-leading third-party experts – the same ones used by the federal government – to test and validate our security processes. In our physical centers, we'll also give experts a first-hand look at how we keep our platform safe 24 hours a day, seven days a week through a cutting-edge fusion center.

Our virtual Transparency and Accountability Center builds on work we're already doing to increase visibility into how our platform operates, including publishing Transparency Reports, launching a Transparency hub, sharing more about how we recommend content, and more.

We believe all companies should disclose their algorithms, moderation policies, and data flows to regulators, and lawmakers tell us that's what they want, too. We're proud to be the first to take this important step toward an unprecedented level of accountability and transparency.

If you're a lawmaker, or an expert in policy, content safety, or security, looking for more information about our virtual tours, please email transparency [at] tiktok [dot] com.