TikTok stands against terrorism. We are shocked and appalled by the horrific acts of terror in Israel last week. We are also deeply saddened by the intensifying humanitarian crisis unfolding in Gaza. Our hearts break for everyone who has been affected.

We immediately mobilized significant resources and personnel to help maintain the safety of our community and integrity of our platform. We're committed to transparency as we work to provide a safe and secure space for our global community. We remain focused on supporting free expression, upholding our commitment to human rights, and protecting our platform during the Israel-Hamas war.

Upholding TikTok's Community Guidelines

As part of our crisis management process, our actions to safeguard our community include:

  • Launching a command center that brings together key members of our 40,000-strong global team of safety professionals, representing a range of expertise and regional perspectives, so that we remain agile in how we respond to this fast-evolving crisis.
  • Evolving our proactive automated detection systems in real time as we identify new threats; this enables us to automatically detect and remove graphic and violent content so that neither our moderators nor our community members are exposed to it.
  • Adding more moderators who speak Arabic and Hebrew to review content related to these events. As we continue to focus on moderator care, we're deploying additional well-being resources for frontline moderators during this time.
  • Continuing to enforce our policies against violence, hate, and harmful misinformation by taking action to remove violative content and accounts. For example, we remove content that supports the attacks or mocks victims affected by the violence. If content is posted depicting a person who has been taken hostage, we will do everything we can to protect their dignity and remove content that breaks our rules. We do not tolerate attempts to incite violence or spread hateful ideologies. We have a zero-tolerance policy for content praising violent and hateful organizations and individuals, and those organizations and individuals aren't allowed on our platform. We also block hashtags that promote violence or otherwise break our rules.
  • Adding opt-in screens over content that could be shocking or graphic to help prevent people from unexpectedly viewing it, as we continue to make public interest exceptions for some content. We recognize that some content that may otherwise break our rules can be in the public interest, and we allow this content to remain on the platform for documentary, educational, and counterspeech purposes.
  • Making temporary adjustments to policies that govern TikTok features in an effort to proactively prevent them from being used for hateful or violent behavior in the region. For example, we're adding additional restrictions on LIVE eligibility as a temporary measure given the heightened safety risk in the context of the current hostage situation.
  • Cooperating with law enforcement agencies globally in line with our Law Enforcement Guidelines, which are informed by legal and human rights standards. We are acutely aware of the specific and imminent risks to human life involved in the kidnapping of hostages and are working with law enforcement to ensure the safety of the victims in accordance with our emergency procedures.
  • Engaging with experts across the industry and civil society, such as Tech Against Terrorism and our Advisory Councils, to further safeguard and secure our platform during these difficult times.

Since the brutal attack on October 7, we've continued working diligently to remove content that violates our guidelines. To date, we've removed over 500,000 videos and closed 8,000 livestreams in the impacted region for violating our guidelines.

Preventing the spread of misleading content

Misinformation during times of crisis can make matters worse. That's why we work to identify and remove harmful misinformation. We also remove synthetic media that has been edited, spliced, or combined in a way that could mislead our community about real-world events.

To help us enforce these policies accurately, we work with IFCN-accredited fact-checking organizations that support over 50 languages, including Arabic and Hebrew. Fact-checkers assess content, enabling our moderators to accurately apply our misinformation policies. Out of an abundance of caution, while a video is being fact-checked, we make it ineligible for the For You feed. If fact-checking is inconclusive, we label the content as unverified, don't allow it in For You feeds, and prompt people to reconsider before sharing it.

We continue to proactively look for signs of deceptive behavior on our platform. This includes monitoring for behavior that would indicate a covert influence operation; if we identify one, we disrupt it and ban the accounts that are part of the network.

We will soon be rolling out reminders in Search for certain keywords in Hebrew, Arabic, and English to encourage our community to be aware of potential misinformation and consult authoritative sources, and to remind them of our in-app well-being resources if they need them.

Shaping your TikTok experience

We have a large suite of existing controls and features that we encourage everyone in our community to consider using to tailor the TikTok experience that best suits their preferences. These include:

  • For You feed controls: People can tap 'Not Interested' on content they want to see less of or choose 'Refresh' if they want to restart their feed. When Restricted Mode is enabled, it helps to limit the appearance of content that may not be appropriate for a general audience and filters content with warning labels.
  • Comment controls: People can choose who can comment on their videos, filter keywords from comments, or review comments before they are published. They can also block, delete, and report comments in bulk. We prompt people to reconsider posting unkind comments, too.
  • Screen time controls: We offer a range of tools to help people customize and control their time on our app, such as tools to set a screen time limit, reminders to take a break or log off for bedtime, and more.
  • Family Pairing tools: Through our Family Pairing feature, parents and guardians can link their TikTok account to their teen's account to enable a variety of content settings. For example, they can choose to turn off search, enable Restricted Mode, and customize screen time settings.
  • Reporting: Anyone can report content on TikTok, including comments, accounts, and livestreams. Long press on a video, tap 'Report', and select a reason, such as misinformation.

We also provide our community with helpful resources on a number of issues, including how to recognize hate and report it, and how to safely share stories about their mental health and access help if they need it.

We will continue to adapt our safeguards to protect our community.