By Eric Han, Head of Safety, TikTok US
There has been significant interest and confusion regarding a user's two TikTok accounts and her viral video about the Uighur community in China. In this post we want to clarify the timeline of events, apologize for an error, and explain more about our moderation philosophy and the next steps our team will be taking as part of our continued commitment to our community.
First, a clarification on the timeline of events
November 14, 2019 @ 2:34pm ET – On a previous account (@getmefamousplzsir), a TikTok user posted a video that included the image of Osama bin Laden, resulting in an account ban in line with TikTok's policies against content that includes imagery related to terrorist figures. No China-related content was moderated on this account.
*While we recognize that this video may have been intended as satire, our policies on this front are currently strict. Any such content, when identified, is deemed a violation of our Community Guidelines and Terms of Service, resulting in a permanent ban of the account and associated devices.
November 14, 2019 @ 7:53pm ET – The user posted to a newly created, second TikTok account (@getmefamouspartthree). As of the time of this blog post, that account has 19 videos in total.
November 23, 2019 @ 7:05pm ET – The user posted a video to that second account that talked about the Uighur community in China. As of the time of this blog post, the video has been viewed more than 1.5 million times on the app.
November 25, 2019 @ 3:32am ET – As part of a scheduled platform-wide enforcement, the TikTok moderation team banned 2,406 devices associated with accounts that had been banned for one of three types of violations: (1) terrorism or terrorist imagery, (2) child exploitation, (3) spam or similar malicious content. Because the user's banned account (@getmefamousplzsir) was associated with the same device as her second account (@getmefamouspartthree), this locked her out of her second, active account on that device. However, the account itself remained active and accessible, and its videos continued to receive views.
November 27, 2019 @ 7:06am ET – Due to a human moderation error, the viral video from November 23 was removed. It's important to clarify that nothing in our Community Guidelines precludes content such as this video, and it should not have been removed.
November 27, 2019 @ 7:56am ET – The video went live again on the platform after a senior member of our moderation team identified the error and reinstated it immediately.
In total, the video was offline for 50 minutes.
An apology
We would like to apologize to the user for the error on our part this morning.
In addition, we are reaching out to the user directly to inform her that we've decided to override the device ban in this case. Our moderation approach of banning devices associated with a banned account is designed to protect against the spread of coordinated malicious behavior – and it's clear that this was not the intent here. This user can again access her active account (@getmefamouspartthree) from the device she was using previously.
Our moderation philosophy and commitment to our community
Preserving TikTok as a safe, positive, and welcoming environment for our users, while also protecting their freedom of creative expression to post content that may be serious or uncomfortable, is complex and challenging work.
Like other internet platforms, we have invested enormous resources in technology that can act as a first line of defense against content that is clearly in violation of our Community Guidelines, such as displays of extreme violence, child exploitation, pornography, or spam. The second line of defense consists of human moderators operating on the basis of those Community Guidelines. Those individuals have the incredibly difficult task of reviewing many thousands of videos and making sound judgments where the answer is not always clear.
We acknowledge that at times this process will not be perfect. Humans will sometimes make mistakes, such as the one made today in the case of @getmefamouspartthree's video. When those mistakes happen, our commitment is to quickly address and fix them, undertake training or make changes to reduce the risk of the same mistakes being repeated, and fully own the responsibility for our errors.
To that end, we are reviewing the procedural breakdown in this incident and conducting a broader review of our processes to identify areas where we can improve our practices. We will also be reviewing our policies to allow carve-outs for things like education and satire, as other platforms do. To continue to provide transparency for our users, we will also be releasing our first transparency report as well as a much fuller version of our Community Guidelines, both of which are on track to share with our community within the next two months.
We remain committed to working collaboratively with our community and stakeholders to achieve our common goal of providing a platform that fulfills its core purpose of bringing creativity and joy to its users.