With more than 2 billion people in over 50 countries expected to go to the polls this year, we are deeply invested in protecting the integrity of elections on TikTok. Over the last four years, we've worked to protect our platform through more than 150 elections around the world.

In the UK, with both Local Elections and a General Election taking place within the next year, we wanted to share an overview of how we're investing to ensure that TikTok continues to be a creative, safe, and civil place for our community in a historic election year.

Connecting people to trusted information

In early April, we are launching our in-app Local Election Centre in collaboration with Logically Facts, our fact-checking partner, which specialises in analysing and fighting disinformation.

The Local Election Centre will provide our community with reliable voting information, including links to resources from the Electoral Commission, the independent body that oversees elections in the UK. On TikTok, people will be directed to the Election Centre through prompts on relevant election content and searches, and we'll continue to update it throughout this period.

While this Centre will be tailored to the UK and the upcoming local elections, we have partnered with electoral commissions and fact-checking organisations for previous and upcoming elections across the globe, and our local Election Centres reached over 55 million people worldwide last year.

Media literacy is also integral to our elections strategy.

We collaborated with Logically Facts to produce educational videos that encourage users to think critically about the information they consume. These videos will be featured in the Election Centre, and throughout 2024 we'll continue to partner with experts and fact-checking organisations around the world to deliver engaging media literacy campaigns about misinformation, identifying AI-generated content, and more.

In addition, we're continuing to build features that provide additional context about content and accounts on TikTok. For example, blue "verified" checks confirm that notable accounts are who they say they are, and we label content that our fact-checking partners determine to be unsubstantiated.

You can read more on our approach to election integrity at our new Election Integrity Transparency Centre.

How we moderate content and accounts during elections
Thousands of trust and safety professionals work alongside technology to enforce our Community Guidelines. We are committed to consistently enforcing our rules against misinformation, covert influence operations, and other content and behaviour that tends to surge across platforms during elections.

  • Tailoring our approach to political and news accounts: For years, TikTok has not allowed paid political advertising, and accounts belonging to politicians or political parties are not able to advertise or make money on TikTok. We also recognise that accounts belonging to news organisations, politicians, political parties and governments play a unique role in civic discourse, and apply more nuanced account enforcement policies to protect the public interest, which we explain in our Community Guidelines.
  • Tackling misleading AI-generated content: AI-generated content (AIGC) brings new challenges to our industry, which we've proactively addressed with firm rules and new technologies. We don't allow manipulated content that could be misleading, including AIGC of public figures if it depicts them endorsing a political view. We also require creators to label any realistic AIGC and launched a first-of-its-kind tool to help people do this. Alongside 20 other leading tech companies, we recently pledged to help prevent deceptive AI content from interfering with this year’s elections through proactive collaboration. As technology evolves in 2024, we'll continue to improve our policies and detection while partnering with experts on media literacy content that helps our community navigate AI responsibly, including working with industry through content provenance partnerships.
  • Countering misinformation: In Q4 2023, 99% of the content we removed for election and civic misinformation was taken down before it was reported to us. We invest in media literacy as a counter-misinformation strategy, as well as in technology and people to fight misinformation at scale. This includes specialised misinformation moderators with enhanced tools and training, and teams on the ground who partner with experts to prioritise local context and nuance. We partner with 18 global fact-checking organisations, including Logically Facts in the UK, who assess the accuracy of content in over 50 languages so that our moderators can apply our misinformation policies accordingly. We added three new global fact-checking partners in 2023, and will continue to expand our fact-checking programme this year.
  • Deterring covert influence operations: We know that deceptive actors try to target online platforms during elections, and we remain vigilant against covert influence operations. We have dedicated experts working to detect, disrupt, and stay ahead of deceptive behaviours. We report the removals of covert influence networks in our quarterly Community Guidelines Enforcement Reports. In the coming months, we'll introduce dedicated covert influence operations reports to further increase transparency, accountability, and sharing with the industry. We provide information about how we assess this behaviour at our Transparency Centre.

We've consulted with more than 50 experts, including our Safety and Content Advisory Councils, as our industry prepares for this historic election year. For more on how we combat misinformation, see our newly expanded Transparency Centre page.