By Cormac Keenan, Head of Trust and Safety, TikTok
People all over the world come to TikTok to express themselves creatively and be entertained. To maintain the trust of our community, we aim to be transparent about the actions we take on their content. With that in mind, today we're releasing our Q1 Community Guidelines Enforcement Report as we continue to bring transparency to the critical work of moderating content to keep TikTok a safe and welcoming place for our community.
Since 2019, we've released Transparency Reports, increasing the information we provide with each report. This includes adding insights related to the actions we take to protect the safety and integrity of our platform, such as reporting the nature of content removed for violating our policies and the number of violative ads rejected. We've heard from our community, civil society organizations, and policymakers that this information is helpful in understanding how TikTok operates and moderates content. To share these insights more frequently, we're now publishing quarterly Community Guidelines Enforcement Reports as part of our broader transparency reporting efforts. Because of the nature of processing and responding to legal requests, we'll continue to publish that data twice a year.
Here are some of the key insights from our Q1 2021 report, which you can read in full here.
- 61,951,327 videos were removed for violating our Community Guidelines or Terms of Service, which is less than 1% of all videos uploaded on TikTok.
- 82% of these videos were removed before they received any views, 91% before any user reports, and 93% within 24 hours of being posted.
- 1,921,900 ads were rejected for violating advertising policies and guidelines.
- 11,149,514 accounts were removed for violating our Community Guidelines or Terms of Service, of which 7,263,952 were removed for potentially belonging to a person under the age of 13. This is less than 1% of all accounts on TikTok.
- 71,470,161 accounts were blocked from being created through automated means.
As we've evolved our report, we've aimed to help the industry push forward on transparency and accountability around user safety. To bring more visibility to the actions we take to protect minors, in this report we added the number of accounts removed for potentially belonging to an underage person. This builds upon our previous work to strengthen default privacy settings for teens, offer tools to empower parents and families, and limit features like direct messaging and livestreams to those age 16 and over. To continue strengthening our approach to keeping TikTok a place for people 13 and over, we aim to explore new technologies to help with the industry-wide challenge of age assurance.
This report also includes our first security overview of our global bug bounty program, which helps us proactively identify and resolve security vulnerabilities. This program strengthens our overall security maturity by encouraging security researchers around the world to identify and responsibly disclose bugs to our teams so we can resolve them before attackers exploit them. As the report details, in the first quarter of 2021, TikTok received 33 valid submissions and resolved 29 of them. We also received and published 8 public disclosure requests. With regard to response efficiency, TikTok had an average first response time of 8 hours, an average resolution time of 30 days, and an average bounty payout time of 3 days.
In the future, we'll publish this data at our online transparency center, which we're working to overhaul into a home for our transparency reporting and other information about our efforts to protect the safety and integrity of our platform.