As an official partner of the UEFA EURO 2020 tournament, we could not be more excited for that first whistle on 11 June to kick off an amazing summer of sport. As TikTok fast becomes a home for football culture and fans, we cannot wait to watch our community create their own TikTok football moments, reactions and celebrations during the UEFA EURO 2020 tournament.
Like all fans, we're also all too aware that racism and other hate speech in football are still far from being eradicated. At TikTok, we are determined to protect the experience of enjoying football on our platform from those who seek to spread hate and division.
To show our commitment, millions of people across Europe will see our firm position against hate of any kind when they open TikTok for the first time today, as we launch our #SwipeOutHate campaign, encouraging our community to stand together against hate in football and to make the most of our TikTok safety tools. In the coming weeks, we'll also provide educational safety content and in-app Effects, and make further commitments as part of our partnership with UEFA.
Nothing matters more to us than keeping our whole community safe. Here's a look at some of the work we do to foster an inclusive environment that enables everyone's creative expression to flourish on our platform.
Tackling hate speech
Hate has no place on TikTok. Our Community Guidelines make clear that we do not tolerate hate speech, bullying or harassment. We use a combination of technologies and moderation teams to detect and review such content and behaviours, with potential actions including removing videos and comments, and banning accounts.
Hate speech is complex and ever-evolving, and we don't pretend that we have always got it right. That's why our teams are always looking for ways to do better. For example, last October we strengthened our enforcement against hate speech to better capture the evolving landscape, language and terminology of hateful behaviours.
It's also why we invest in regular training for our moderation teams to better detect hateful behaviour, symbols, terms, and offensive stereotypes. That training also helps ensure that our teams can properly identify and protect counter speech on TikTok, as language previously used to exclude and demean groups of people is reclaimed by those very same communities.
Partners are critical to our progress. We consult academics and experts from across the globe to keep abreast of evolving trends and to help us regularly evaluate and improve our policies and enforcement processes. We're particularly proud to work with community partners like Galop, Glitch, Stonewall and Tell MAMA.
Empowering our community
While we continue to invest in cutting-edge technologies and industry-leading teams to counter hate on TikTok, we also want people to feel in control of their TikTok experience and help us foster a supportive and positive environment. Here are just a few of our TikTok safety tools:
- No unsolicited direct messages: Only friends - users who follow each other - can send direct messages to one another. Direct messaging is restricted to those aged 16 and over, and we don't allow images or videos to be shared via DM.
- In control of comments: Our community can restrict who comments on their videos to no one, just friends or everyone (for those aged under 16, we've removed the everyone setting). Users can choose to filter all comments or only those containing specific keywords they have chosen. By default, spam and offensive comments are hidden from users when we detect them.
- Promote kindness: A prompt asks people to reconsider posting a comment that may be inappropriate or unkind and reminds users to review our Community Guidelines.
- Report inappropriate content and behaviour: Reporting is fast, easy and confidential on TikTok. If a user sees something - whether it's a video, a comment, a direct message or an account - they don't think should be on TikTok, they can use the in-app reporting button to let us know. We will review it against our Community Guidelines and take appropriate action.
- Decide who can see and interact with your content: On TikTok, people can choose whether no one, just friends or everyone can view their content, as well as who, if anyone, can Duet or Stitch with their content (for those aged under 16, Duet and Stitch are restricted to friends only). These settings can be applied to the account as a whole or to individual videos, and they can be adjusted even after posting. Users aged under 16 have their accounts set to private by default. In addition, people can block users to prevent them from viewing their content or engaging with them via direct messages, comments, follows or likes.
Information on all the settings and features available for our community can be found in the TikTok Safety Centre.
Creating the best LIVE experience
Only users aged 16 and over can go LIVE on TikTok. As well as using technology and moderation teams to help ensure LIVEs do not violate our Community Guidelines or Terms of Service, we also give creators a number of features to help them create the best LIVE experience. When going LIVE, creators can:
- Nominate someone they know to help moderate their live-stream.
- Use our comment filter feature to automatically prevent certain keywords from appearing in the comment section.
- Block and mute viewers during the live stream.
- In addition to the above, LIVE viewers can report other viewers via the comment section, or report the host by tapping the host's image in the top-left corner and then selecting report from the pop-up.
When it comes to safety, there is no final whistle. We know there is always more we can and must do to improve our policies, processes and products to help keep TikTok a safe home for everyone, no matter who they are. Our goal is to eliminate hate, and we're committed to that goal for as long as it takes.