By Suzy Loftus, Head of USDS Trust and Safety

With more than 2 billion people in over 50 countries expected to go to the polls this year, we are deeply invested in protecting the integrity of elections on TikTok. Over the last four years, we've worked to protect our platform through more than 150 elections around the world. Today, we're sharing an overview of our continued investment in keeping TikTok a creative, safe, and civil place for our community during a historic election year.

Connecting people to trusted information

We partner with electoral commissions and fact-checking organizations to build Election Centers that connect people to trustworthy information about voting. Our local Election Centers reached over 55 million people globally last year. In the coming days, we will launch our US Elections Center in partnership with the nonprofit Democracy Works. The Center will provide our 150M+ US community members with reliable voting information for all 50 states and Washington, DC. We will direct people to the Elections Center through prompts on relevant election content and searches, and we'll continue to add information throughout the year, including election results.

Media literacy is also integral to our elections strategy. Last year, we collaborated with local creators and civil society organizations to promote media literacy best practices, reaching millions more people on TikTok. Throughout 2024, we'll continue to partner with experts and fact-checking organizations around the world to deliver engaging media literacy campaigns on recognizing misinformation, identifying AI-generated content, and more.

In addition, we're continuing to build features that provide additional context about content and accounts on TikTok. For example, blue "verified" checks confirm that notable accounts are who they say they are, and in the U.S., we require government, politician, and political party accounts to be verified. We label content that our fact-checkers determine to be unsubstantiated, and we will add media literacy resources to these labels this year. You can read more about our approach to election integrity at our new Election Integrity Transparency Center.

Moderating content and accounts during elections

Thousands of trust and safety professionals work alongside technology to enforce our Community Guidelines. We are committed to consistently enforcing our rules against misinformation, covert influence operations, and the other harmful content and behaviors that platforms see more of during elections.

  • Countering misinformation: We invest in media literacy as a counter-misinformation strategy, as well as in the technology and people needed to fight misinformation at scale. This includes specialized misinformation moderators with enhanced tools and training, and teams on the ground who partner with experts to prioritize local context and nuance. We partner with 17 global fact-checking organizations, which assess the accuracy of content in over 50 languages so that our moderators can apply our misinformation policies accordingly. We added three new global fact-checking partners in 2023 and will continue to expand the program this year.
  • Deterring covert influence operations: We know that deceptive actors try to target online platforms during elections, and we remain vigilant against covert influence operations. Dedicated experts work to detect, disrupt, and stay ahead of deceptive behaviors, and we report removals of covert influence networks in our quarterly Community Guidelines Enforcement Reports. In the coming months, we'll introduce dedicated covert influence operations reports to further increase transparency, accountability, and information sharing across the industry. We explain how we assess this behavior at our Transparency Center.
  • Tackling misleading AI-generated content: AI-generated content (AIGC) brings new misinformation challenges to our industry, which we've proactively addressed with firm rules and new technologies. We don't allow manipulated content that could be misleading, including AIGC of public figures that depicts them endorsing a political view. We also require creators to label any realistic AIGC, and we launched a first-of-its-kind tool to help people do so. As the technology evolves in 2024, we'll continue to improve our policies and detection capabilities while partnering with experts on media literacy content that helps our community navigate AI responsibly.
  • Tailoring our approach to political and news accounts: For years, TikTok has not allowed paid political advertising, and accounts belonging to politicians or political parties cannot advertise or make money on TikTok. We also recognize that accounts belonging to news organizations, politicians, political parties, and governments play a unique role in civic discourse, so we apply more nuanced account enforcement policies to protect the public interest, as explained in our Community Guidelines.

As our industry prepares for this historic election year, we've consulted with more than 50 experts, including our Safety and Content Advisory Councils. For more on how we combat misinformation, see our newly expanded Transparency Center page.