By Vanessa Pappas, General Manager, TikTok US

At TikTok, everything we do is rooted in our desire to create a rewarding experience for our users. Our diverse, expressive community makes spending time on TikTok feel lighthearted, real, heartwarming, and truly fun. TikTok is a place where everyone's creative side is welcome. As a company, we're motivated by a deep sense of responsibility to this community – the backbone of our platform – just as we're driven to build a product deserving of their creativity. This is summed up in our mission: to inspire creativity and bring joy.

Mindful of our responsibility to our users, last year we shared our intention to form an external council of leading experts to advise TikTok on content moderation policies covering a wide range of topics. We've made great progress as a company since taking the first step of partnering with global law firm K&L Gates LLP and their team, including former Congressmen Bart Gordon and Jeff Denham. Earlier this year we updated our Community Guidelines to give our users greater clarity, and last week we announced our Transparency Center, which will give outside experts insight into our content practices and code.

Today, we're pleased to introduce the important gathering of technology and safety experts who will help shape our policies as the founding members of TikTok's Content Advisory Council. This Council brings together thought leaders who can help us develop forward-looking policies that not only address the challenges of today, but also plan ahead for the next set of issues that our industry will face.

Commenting on the TikTok initiative, former Rep. Bart Gordon (D-Tenn.) said: "We are proud of the work TikTok's US leadership has put in, alongside our team, to form a group of experts with a diverse array of experiences and perspectives."

In praise of the initial group of experts, former Rep. Jeff Denham (R-Calif.) stated: "We firmly believe that this Council is more than capable of providing candid and productive advice as TikTok continues to strengthen its content moderation policies."

The Council members we've assembled represent a diverse array of backgrounds and perspectives, and have spent much of their careers researching and analyzing issues relevant to TikTok and the space we operate in, such as child safety, hate speech, misinformation, and bullying. We will call on our Council to provide unvarnished views and advice on TikTok's policies and practices as we continually work to improve in the challenging area of content moderation. That's why we're excited to have members who bring legal, regulatory, and academic expertise, as well as the needs and perspectives of our diverse community.

The initial Council members draw from the technology, policy, and health and wellness industries and will be led by Dawn Nunziato as chair. A professor at George Washington University Law School and co-director of the Global Internet Freedom Project, Dawn specializes in issues involving free speech and content regulation, and brings a wealth of knowledge to help steer the Council on the key issues affecting both TikTok and the broader technology landscape. 

On joining TikTok's Content Advisory Council, Dawn Nunziato of George Washington University Law School said: "A company willing to open its doors to outside experts to help shape upcoming policy shows organizational maturity and humility. I am working with TikTok because they've shown that they take content moderation seriously, are open to feedback, and understand the importance of this area both for their community and for the future of healthy public discourse."

We look forward to working alongside these leaders in their fields as we continue to strengthen our content moderation policies and platform practices, and we are grateful for their advice and contributions to making TikTok a place where joy and creativity can thrive. Rounding out the initial membership of TikTok's Content Advisory Council, which will grow to around a dozen experts, are:

  • Rob Atkinson, Information Technology and Innovation Foundation, brings academic, private sector, and government experience as well as knowledge of technology policy that can advise our approach to innovation
  • Hany Farid, University of California, Berkeley Electrical Engineering & Computer Sciences and School of Information, is a renowned expert on digital image and video forensics, computer vision, deep fakes, and robust hashing
  • Mary Anne Franks, University of Miami Law School, focuses on the intersection of law and technology and will provide valuable insight into industry challenges including discrimination, safety, and online identity
  • Vicki Harrison, Stanford Psychiatry Center for Youth Mental Health and Wellbeing, is a social worker at the intersection of social media and mental health who understands child safety issues and holistic youth needs
  • Dawn Nunziato, chair, George Washington University Law School, is an internationally recognized expert in free speech and content regulation
  • David Ryan Polgar, All Tech Is Human, is a leading voice in tech ethics, digital citizenship, and navigating the complex challenge of aligning societal interests with technological priorities
  • Dan Schnur, USC Annenberg Center on Communication and UC Berkeley Institute of Governmental Studies, brings valuable experience and insight on political communications and voter information

The Council will meet alongside our US leaders to discuss areas of importance to the company and our users. Our first Content Advisory Council meeting, at the end of this month, will focus on critical topics around platform integrity, including policies against misinformation and election interference. Ensuring integrity and promoting authenticity around civic engagement are important challenges for content platforms, and we are cognizant of the need to build responsibly in this area. In addition to the many protections we've already enacted – including banning political ads, working to detect and remove deepfakes, enabling users to report misleading information, and partnering with third-party fact-checking and media literacy organizations – we want to surround ourselves with experts who can both evaluate the actions we've taken and provide guidance on additional measures we could pursue.

All of our actions, including the creation of this Council, advance our focus on creating an entertaining, genuine experience for our community by staying true to what makes users love TikTok. As our company grows, we are focused on making reflection and learning part of our culture, and we are committed to sharing our progress transparently with our users and stakeholders. Our hope is that through thought-provoking conversations and candid feedback, we will find productive ways to support platform integrity, counter potential misuse, and protect the interests of all who use our platform.