The safety of our community, especially our younger users, is a top priority for TikTok, and something we take incredibly seriously. This year, we expect to invest more than $2 billion USD in our trust and safety work globally. This investment supports tens of thousands of trust and safety experts who work around the clock, across the globe, to keep our community safe. As part of this work, between April and June 2024 alone, we removed more than 20 million suspected underage accounts globally.
At TikTok, we know there is no finish line when it comes to improving safety for our community. We will continue to invest in our people and systems. For the Australian TikTok community to thrive, its members need a safe and authentic digital experience. That experience looks different for users of different ages, including through considered product design. As well as proactively moderating content at scale, we empower families and young people with the tools they need to manage their own experience every step of the way. We encourage all parents and caregivers to look at our Guardian's Guide for more information. TikTok's recent submission to the Joint Select Committee on Social Media and Australian Society also considered related safety matters. Of note for this Committee are our extensive, industry-leading protections and policies that are specifically designed to keep our younger users safe with age-appropriate experiences.
The Online Safety Amendment (Social Media Minimum Age) Bill 2024
We have significant concerns with the process that has culminated in the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (the Bill). Unfortunately, this Committee has not been given sufficient scope to address the many complex questions associated with this piece of legislation, let alone broader matters related to protecting children online.
Even through our time-limited review, we see a range of serious, unresolved problems that the Government must address to assure the Parliament that the Bill will not cause unintended consequences for all Australians. In particular, we encourage the Parliament to undertake proper and detailed engagement with experts, platforms, mental health organisations, young people and their families. Where novel policy is put forward, it's important that legislation is drafted in a thorough and considered way, to ensure it is able to achieve its stated intention. This has not been the case with respect to this Bill.
Issues with the Bill’s definitions
This Bill turns on an important, but broad and unclear, definition that determines which businesses and services could potentially be included. As it is drafted, almost every online service could fall within the definition of 'age-restricted social media platform', including fitness apps and music streaming services. This lack of clarity effectively scopes in thousands of businesses and digital services that Australians rely on every day.
Questions the Government must provide clarity on include:
- In section 63C, does "end-users" include only those with registered accounts, or does it also include those without registered accounts? If a platform did not allow for any users to have registered accounts, would that fall under the definition of "age-restricted social media platform"?
- In section 63C(1)(a)(i): What is meant by "enable online social interaction between 2 or more end-users", other than expressly including the sharing of material for social purposes and expressly excluding business purposes? What is meant by "significant purpose"? This phrase is not defined or used elsewhere in the Online Safety Act, or in the Industry Codes or Standards.
- In section 63C(1)(a)(ii), what is meant by "allows end-users to link to some or all of the other end-users"? Does "link" equate to an individual subscribing to, or following, another person's account, simply being able to contact another individual, or something else?
- In section 63C(1)(b): Is the intention here that the Minister would be able to expressly designate, and single out, specific platforms under the legislative rules, similar to the rules under the News Media Bargaining Code? Or would a service unilaterally decide whether the legislative rules capture it or not? Is it envisaged that the Minister would have the ability to specify a service under a legislative rule that does not fall under the conditions set out in section 63C(1)(a), for instance a service whose sole or significant purpose is not enabling online social interaction between 2 or more end-users?
- In section 63C(2) what is meant by "social purposes"?
- In section 63C(4) what is the nature of the harm being referred to in the context of the Minister being able to make legislative rules?
The Bill's definitions need considerable work to ensure they are clear, enforceable and applied fairly and as expressly intended by the legislation.
The Bill creates a 'licence to be online', and hinges its enforcement on an age assurance trial
There are extensive references in the Explanatory Memorandum to the outcome of the Government's age assurance trial, including that "it will be instructive for regulated entities, and will form the basis of regulatory guidance issued by the Commissioner, in the first instance". The following exchange at Budget Estimates, on 5 November 2024, gives the Committee insight into how the age verification trial will work:
- Senator Shoebridge: If you are testing to see if someone is 13, 14, 15 or 16, you are also testing to see, by definition, if they're 16-plus. If there is going to be age verification, everybody will have to go through an age verification process, won't they?
- Mr Chisholm: Yes.
- Senator Shoebridge: So this isn't just about privacy or collecting data about kids. This is literally everybody accessing social media. That's how it has to work, isn't it?
- Mr Irwin: Yes.
As the Government's admissions in Budget Estimates make clear, age-restricted social media platforms will need to undertake age assurance for each and every Australian user in order to remove age-restricted users from their services. This effectively creates a mechanism whereby Australians need a 'licence to be online'.
It appears that this Bill hinges on an uncompleted trial, the outcome of which will likely require all Australians to be vetted through a yet-to-be-determined age assurance system. Many questions about the trial itself, let alone its possible outcomes and conclusions, have yet to be answered. Given the impact of mandating that all Australians who wish to use social media platforms be subject to such an age assurance system, we urge Parliament to consider the broader implications of legislating such an outcome without knowing any details of the system itself.
This also raises one of many significant, outstanding questions that impact the privacy of Australians online.
'Privacy' provisions could undermine both privacy and safety
The Bill's stated intention is to improve safety. However, there are a range of inconsistencies in this Bill, including in relation to privacy, some of which conflict with its purported safety objectives. Certain other proposed changes in the Bill seek to cover ground already addressed in the Privacy Act, creating confusion in how conflicting provisions are intended to operate. The Government should provide the Committee with guidance on these issues, and some examples of these tensions within the Bill are outlined below.
- Consent Mechanism Scope: The consent mechanism in section 63F of the Bill is ill-considered and fails to account for broader uses of some types of personal information beyond age assurance (e.g. using an individual's age to ensure they have an appropriate, relevant and safe experience, which is currently permitted by the Privacy Act provided certain conditions are met). Many platforms collect age information not only to ensure someone is old enough to be on a platform, but also because that information can be materially relevant to ensuring that a user has a wider, relevant and age-appropriate experience in respect of particular features, functions and content. Removing platforms' ability to use and retain age information for these other legitimate safety purposes could make users less safe.
- Conflict with current Privacy Law: It's possible that some of the stricter requirements for consent (voluntary, informed, specific, unambiguous) and data deletion in the Bill conflict with existing Privacy Act provisions. This could lead to complications if future amendments to the Privacy Act are passed. We note in particular that some of these proposed amendments have only been accepted in principle by the Government, but are being legislated into the Online Safety Act regardless.
- Conflict with Australian Privacy Principle 9: It's important to understand how the restriction on further use of personal information "that was collected for the purpose of, or for purposes including the purpose of, taking reasonable steps to prevent age-restricted users having accounts with an age-restricted social media platform" is intended to operate with APP 9.2, which specifically regulates the use of government-related identifiers. If such "personal information" includes a government-related identifier, is the restriction in section 63F(1)(a) intended to supersede the rules set out in APP 9.2?
With respect to section 63F(3)(a), it is unclear from the drafting of this deletion requirement whether platforms would be permitted to retain information about the individual's age itself (e.g. to retain the information that the user associated with an account/email address/phone number is 21 years old), or whether platforms are required to collect an individual's age each and every time they use the platform. In practice, this seems at odds with privacy principles and efficient data collection. Some examples are provided below:
- An individual tries to register a platform account with their email address and provides a date of birth that indicates the user is under 16, and is therefore not permitted to open a platform account. Under the current drafting of section 63F, the platform must delete the information about that individual's self-declared age as it's no longer required given the platform has used it to ascertain the user's age. When the individual tries again to use that same email address but provides a different (older) date of birth, the platform does not have the historical information available to question the accuracy of the information being provided by this individual.
- An individual registers a platform account and provides their age information, which details that they are 21 years old. After using the age information to establish that the individual may open an account, the platform deletes their age information. The same individual then tries to access features of the platform that are age-restricted (according to the platform's safety rules), for example hosting a livestream, which the platform only allows users over 18 to do. The individual has to provide their age information again, because the platform has been unable to hold and/or maintain the individual's age details.
The above are just some of the challenges this Bill presents, noting the short timeframe provided to scrutinise the legislation. However, it is clear that the Bill will impact all Australians and that its rushed passage poses a serious risk of further unintended consequences. As countless online safety experts and mental health organisations have highlighted, there remain many unanswered questions and unresolved concerns regarding this legislation, with its poorly drafted and unworkable definitions, unclear privacy safeguards, and dependence on the Government's yet-to-be-completed age assurance trial ranking highest among them.
We urge the Committee and the Parliament to treat this legislation as it would any other significant reform proposal, and to take the time to listen to the experts, many of whom have clearly and repeatedly expressed their concerns with this legislation.