The Infocomm Media Development Authority (IMDA) recently released the inaugural Online Safety Assessment Report on Designated Social Media Services (DSMSs)*. We're proud to be recognised as the industry leader for the work we do to keep TikTok safe, especially for teens.

[Image. Source: IMDA]

[Image: Key findings from TikTok's Online Safety Report]

You may also find TikTok's Online Safety Report here.
We keep our younger community members safe on the platform in several ways. Only individuals aged 13 and older are permitted to register for a TikTok account, and we ban accounts created by anyone under 13. Users can also report accounts they believe belong to someone under the minimum age. Accounts belonging to youths aged 13 to 17 come with age-appropriate settings, such as restrictions on direct messaging and account visibility. More of these settings are outlined in the table below.

Every video posted on TikTok is initially reviewed by automated moderation technology. To support fair and consistent review of potentially violative content, human moderators work alongside these automated systems, taking into account additional context and nuance that technology may not always pick up.
We use a combination of safety approaches to strike the right balance between enabling creative expression and preventing harm:
- Remove content that we do not allow
Everyone who joins TikTok has the ability to freely share content on the platform. However, we remove content — whether posted publicly or privately — when we find that it violates our rules.
- Restrict content that is not suitable for youth
We allow a range of content on our platform, but also recognise that not all of it may be suitable for younger audiences. We restrict such content so that it can only be viewed by adults (18 years and older). A summary of restricted content categories can be found here.
- Make content that does not meet our recommendation standards ineligible for the For You Feed (FYF)
The FYF is an opportunity to discover new content and reach new audiences, but not all content is guaranteed to be recommended. Content that does not meet our recommendation standards is ineligible for the FYF. These standards can be found here.
- Empower our community with information, tools, and resources
We want to make sure users have the right information to manage their experience on TikTok, and to that end we may add labels, “opt-in” screens, or warnings to provide more context. Our safety toolkit helps filter out content with specific hashtags or comments that members of our community are not comfortable seeing, and we also offer account controls and in-app features with safety resources. In Singapore, we've launched several in-app initiatives, including the Wellness Hub, which gives users easy access to mental health resources created jointly with partners and creators, and the Wellness Hub (Scam Prevention Edition), which drives scam awareness and prevention.
Other ways we protect our community include easy-to-use in-app and online reporting tools that let users flag any content or account they feel violates our Community Guidelines. The vast majority of removed content is identified proactively, before it receives any views or is reported to us.
Partnering for success
We collaborate with a variety of experts and groups specialising in areas such as family safety, wellness, digital literacy, fact-checking, and misinformation. In Singapore, we partner with the Institute of Mental Health, the Samaritans of Singapore, TOUCH Mental Wellness, SG Her Empowerment (SHE), and others.
We also have regional Safety Advisory Councils that bring together independent online safety experts to help us develop forward-looking Community Guidelines and features that not only address the challenges of today but also prepare us for future industry issues. Members include Dr Natalie Pang, Head and Associate Professor at the Communications and New Media Department (CNM) and concurrently University Librarian at the National University of Singapore (NUS).
In addition, our global Community Partner Channel gives selected organisations an additional route for reporting content that they believe breaks our Community Guidelines so that it can be reviewed by our teams. To date, more than 400 organisations specialising in a range of safety issues use the channel, including several in Singapore.
At TikTok, the safety and wellbeing of our community are of utmost importance, and our commitment to safety has no finish line. We remain focused on creating a safe, welcoming, and enjoyable space for our community in Singapore by improving our product and policies and by deepening our partnerships with industry, regulators, and civil society.
Read about the 8 ways we protect our younger community members on TikTok here.
* This report was based on the first annual Online Safety Report, submitted in 2024 and covering the period 1 April 2023 to 31 March 2024.