Today we're sharing updates about our ongoing work to counter influence attempts on TikTok. We're expanding our policies for state-affiliated media and introducing a new transparency report focused on our work to eliminate covert influence operations. These updates advance our commitment to building a safe and secure platform that remains free from outside manipulation and influence. 


Countering influence operations during a historic election year

We relentlessly pursue and remove accounts that break our deceptive behavior rules to help ensure the content people see and the accounts they follow on TikTok are genuine. To identify these networks, we invest significant resources in highly technical investigations that draw on both open-source and proprietary information. 

We're introducing a dedicated Transparency Report to provide more regular and detailed updates about the covert influence operations we disrupt. It will also include new information about previously removed operations that attempt to return to our platform with new accounts. This report will replace the quarterly disclosures we currently provide in our Community Guidelines Enforcement Reports, and its more frequent, granular reporting will increase transparency about our work to aggressively counter influence attempts.

In the first four months of 2024, we disrupted 15 influence operations and removed 3,001 associated accounts. A majority of these networks were attempting to influence political discourse among their target audience, including in relation to elections. For example, we disrupted a network targeting an Indonesian audience ahead of the country's presidential election earlier this year, and another that artificially amplified narratives about the UK's domestic political discourse. 


Expanding our state-affiliated media policies

We believe that a core part of creating authentic experiences on TikTok is providing people with context about the content they view. For accounts belonging to state-affiliated media, we already work with independent external experts to assess those accounts so we can appropriately label them and their content, including ads, in line with our state-affiliated media policies. These labels give our community important context about the source of the information. 

Now, we're expanding these policies to further address state-affiliated media accounts that attempt to reach communities outside their home country with content about current global events and affairs. When we identify such an account, it will become ineligible for recommendation, which means its content won't appear in the For You feed. In addition, over the coming weeks, these accounts will no longer be allowed to advertise outside the country with which they are primarily affiliated. This is an additional measure to prevent accounts from reaching wider communities on these topics. These changes will apply in all markets where our state-controlled media labels are available. We use a range of information to make this assessment, such as the account's name and the language in which it creates content.
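
To illustrate how this kind of policy could translate into enforcement logic, here is a minimal sketch in Python. It assumes hypothetical field names and a simplified language-overlap heuristic standing in for the range of signals described above, and it is an illustration only, not TikTok's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class StateAffiliatedAccount:
    """Hypothetical account metadata; field names are illustrative only."""
    home_country: str            # country the outlet is primarily affiliated with
    display_name: str            # account name, one signal used in assessment
    content_languages: set[str]  # languages the account posts in
    home_languages: set[str]     # languages commonly used in the home country

def assess_reach(account: StateAffiliatedAccount) -> dict:
    """Sketch of the policy outcomes described above: if a labeled account
    appears to target audiences outside its home country, make it ineligible
    for For You recommendation and limit its ads to the home country."""
    # Simplified signal: the account posts only in languages not used at home.
    targets_foreign_audience = not (account.content_languages & account.home_languages)

    return {
        "eligible_for_for_you_feed": not targets_foreign_audience,
        # None means no extra restriction beyond existing ad policies.
        "allowed_ad_countries": {account.home_country} if targets_foreign_audience else None,
    }

# Example usage with made-up values
outlet = StateAffiliatedAccount(
    home_country="Country A",
    display_name="Outlet Global",
    content_languages={"en"},
    home_languages={"xx"},
)
print(assess_reach(outlet))
# {'eligible_for_for_you_feed': False, 'allowed_ad_countries': {'Country A'}}
```

In practice, such an assessment would combine many signals and expert review rather than a single heuristic; the sketch only shows how the two policy outcomes, recommendation ineligibility and geographically restricted advertising, could hang off one assessment result.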


Today's updates build on our long-standing efforts to foster authenticity on our platform. From enforcing robust policies against harmful misinformation to investing in media literacy for our community through in-app features and educational campaigns, we'll continue to aggressively protect our platform's integrity while empowering our community to create and enjoy authentic content and interactions on TikTok.