by Arjun Narayan, Director - Trust & Safety, APAC, TikTok

In today's world, there's no question that digital platforms play an extremely influential role in inspiring creativity, enabling freedom of expression, and building strong communities. While the internet provides us with many opportunities to freely exchange ideas and connect with others, the principle of freedom of expression is under intense scrutiny as platforms work to remain inclusive and safe spaces for their users. The industry faces a growing responsibility to ensure that the right voices and content are spread and heard, and it's not one that should fall on content platforms alone.

Leading digital platforms have begun creating advisory councils of experts who can advise on policy and content moderation, developing forward-looking policies that not only address the challenges of today but also help plan for the next set of issues the industry will face. TikTok, for instance, has created an Asia Pacific Safety Advisory Council, bringing together leading legal, regulatory, and academic experts to provide subject matter expertise and advice on TikTok's content moderation policies and practices, helping to shape regional and global guidelines.

Policies Related to Free Speech and Censorship

While today's leading digital platforms each take a different approach to democratizing content, making it easier to develop, share, and consume, not all online content is appropriate or safe. For this reason, platforms must establish clear community guidelines and create forward-looking policies that mitigate the spread of harmful content.

Most platforms agree that dangerous individuals and organizations should not be allowed to spread hateful ideologies or promote illegal activities, and that violent and graphic content, content related to self-harm and dangerous acts, hate speech, harassment, and sexually explicit or misleading content have no place online. However, addressing these existing and emerging issues can be difficult, as platforms face scrutiny over their moderation guidelines.

To provide more transparency into how they keep users safe through moderation practices, platforms like TikTok have begun publishing Transparency Reports that offer insight into how they responsibly respond to data requests and protect intellectual property. TikTok's Community Guidelines also provide general guidance on what is and is not allowed on the platform.

The Council's mission moving forward is to help shape TikTok's approach to policies that protect the safety of its community members across the APAC region, while maintaining full transparency with its users.

Policies Related to Online Safety

The most important commitment the industry faces is to keep its community members safe. This is a challenging but critically important area for the industry to get right, and platforms should look to approach the protection and safety of their users through policies, product, people, and partners.

From a policy perspective, platforms should be steadfast in their commitment to immediately remove harmful content, terminate offending accounts, and report serious cases to law enforcement as appropriate. They should also build strong safety controls, invest heavily in human and machine-based moderation tools, and work with third parties to identify and remove hateful content.

Our Safety Advisory Council members' primary focus is to identify challenges related to online safety, child safety, digital literacy, mental health, and human rights, and to provide guidance on resolving them. They are a diverse group of experts with backgrounds in IT, digital safety and literacy, intellectual property, and internet law, as well as advocates for child safety, women, and other marginalised groups, all committed to addressing these challenges.

Community Effort to Make Digital Platforms a Safe Space for All

Policymakers, regulators, platforms, and their users all have a stake in making digital platforms a safe space for everyone. Though we come from different cultural and professional backgrounds, and may hold differing opinions on how to keep the community safe, we will work together to spot gaps in content moderation policies and advise on the best path forward.

The road ahead won't be an easy one, but it will be worthwhile as we work together to tackle industry-wide issues. Our Council members have long been committed to serving the online community in their individual capacities. Now, we look forward to uniting with them in this endeavor to make the internet safer for users across the Asia Pacific region, taking diverse cultural, religious, and other social nuances into account.