October 21, 2025 | News, Safety

How we protected the safety of our community during Moldova's parliamentary elections

Today, we are sharing insights about our work to protect the integrity of our platform during Moldova's recent parliamentary elections, held on September 28.

Countering deceptive behaviour

In the lead-up to elections, we expect motivated actors who want to influence political outcomes to ramp up attempts to deceive online communities. In Moldova, media and civil society reported additional risks related to deceptive behaviour, in part due to the country's geopolitical context and history.

Our teams worked around the clock to proactively and aggressively counter attempts to deceive our systems and mislead the TikTok community. In the three months leading up to the election, we investigated more than 90 leads from both internal and external sources.

This led to the disruption of five networks, consisting of at least 7,593 accounts targeting the Moldovan audience, during August and September 2025. These networks promoted pro-Russian politicians and attempted to discredit the current administration. In total, so far this year, we have disrupted eight covert influence operations targeting Moldova. We continue to publish new findings via our dedicated Covert Influence Operations Reports in our Transparency Center.

With every disruption, we learn more about how these actors are operating and changing their tactics to try to avoid detection. This helps us to act quickly against attempts to re-establish the same networks or similar ones. As an example, so far this year we have proactively removed more than 40,000 accounts globally associated with previously disrupted covert influence networks.

Protecting our community

Alongside advanced moderation technology, we have thousands of Trust and Safety professionals protecting our platform. This number includes specialised misinformation moderators with expertise in Moldova, who are given enhanced tools and training to detect and remove violative content.

During Moldova's elections, we also established a dedicated Mission Control Centre, which brought together professionals from multiple specialist teams within our Trust and Safety department. This collaboration helped us maximise the effectiveness of our work in the run-up to, and during, the elections themselves.

Here are some of our key results from July 1 to September 28:

  • We removed more than 9,300 pieces of content for breaking our rules on civic and election integrity, harmful misinformation, and AI-generated content. Of this content, 91% was removed proactively, before it was reported to us.
  • We proactively prevented more than 2.9 million fake likes and more than 1.8 million fake follow requests, and blocked more than 268,000 spam accounts from being created in Moldova.
  • We also removed more than 134,000 fake accounts, more than 1.8 million fake likes, and more than 1.8 million fake followers.
  • We removed 1,173 accounts impersonating public officials in Moldova.

Empowering our community

Another key part of our work focused on connecting people to trusted information and providing them with more context about the content they were viewing.

On August 21, we launched an in-app Election Centre in partnership with the Central Electoral Commission of the Republic of Moldova to provide our community with access to reliable information about the elections.

We actively directed users to the Centre through search banners and labels on election-related videos. To date, the Centre has been visited more than 88,000 times.

Partnering with experts

Collaborating with outside experts was another essential component of our strategy. As part of this, we work with more than 20 fact-checking partners globally, including Reuters in Moldova, who support our work by flagging potential misinformation and helping us assess the accuracy of content. We also collaborated with local media literacy partner STOP FALS! to create educational videos featured in our Election Centre.

Our work to keep people on TikTok safe has no finish line. We continue to refine our policies and processes, informed by feedback from outside experts, so that our approach keeps pace with emerging challenges and threats during elections.