Published Date: 9/9/2025
Australia is entering uncharted regulatory waters as it seeks to implement world-first social media age restrictions. Age assurance systems, including biometric age estimation, facial age analysis, identity document verification, and parental control measures, are an effective way to protect young people from age-inappropriate content online, according to the Australian government’s final report of the independent Age Assurance Technology Trial (AATT).
With the publication of the report, Communications Minister Anika Wells promised that the Albanese Government will push ahead with its plans to introduce age restrictions for accessing social media and other platforms. The eSafety Commissioner has said the age restrictions are “likely” to apply to Facebook, Instagram, Snapchat, TikTok, X (formerly Twitter), and YouTube, among other platforms, and has given more specific guidance on when the age restrictions will apply.
X Executive Chairman and CTO Elon Musk complained last year that age checks for social media are “a backdoor way to control access to the Internet for all Australians.” YouTube objected to its inclusion in the category, arguing that it is more like television than social media, and even drafted in the Wiggles to make the case. But the government believes it fits the criteria.
Generally, the age restriction will apply to social media platforms that meet three criteria: the sole purpose, or a significant purpose, of the service is to enable online social interaction between two or more end-users; the service allows end-users to link to, or interact with, some or all of the other end-users; and the service allows end-users to post material on the service.
Under legislative rules set out by the Minister for Communications in July, online gaming and standalone messaging apps are among the services excluded from the restriction. However, messaging platforms that incorporate social media-like functions, such as enabling user interactions beyond simple messaging, could fall within scope, as could messaging tools embedded within social media services that are themselves subject to age restrictions.
The Commissioner has stressed that this is not a social media ban but a delay to having accounts: under-16s who access an age-restricted social media platform will face no penalties, and nor will their parents or guardians. But the platforms themselves may face penalties if they don’t take “reasonable steps” to prevent under-16s from having accounts, including court-imposed fines of up to 150,000 penalty units for corporations, which currently equates to 49.5 million Australian dollars (about US$32.6 million).
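For readers who want to check the arithmetic, the maximum fine is simply the penalty-unit cap multiplied by the current value of a Commonwealth penalty unit. The short sketch below reproduces the quoted figures; the per-unit value of AU$330 and the exchange rate are assumptions inferred from those figures, not official constants.

```python
# Illustrative arithmetic only. The per-unit value and exchange rate below are
# assumptions inferred from the figures quoted in the article, not official values.
PENALTY_UNITS = 150_000           # statutory cap for corporations
AUD_PER_PENALTY_UNIT = 330        # assumed Commonwealth penalty unit value (AU$)
AUD_TO_USD = 0.659                # assumed exchange rate implied by the quoted US$ figure

max_fine_aud = PENALTY_UNITS * AUD_PER_PENALTY_UNIT
print(f"Maximum fine: A${max_fine_aud:,}")                   # A$49,500,000
print(f"Approx. USD:  US${max_fine_aud * AUD_TO_USD:,.0f}")  # US$32,620,500, i.e. about US$32.6 million
```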
The eSafety Commissioner says the restrictions aim to protect young Australians from “pressures and risks” that users can be exposed to while logged in to social media accounts, which it says stem from design features that encourage more time on screens while serving content that can harm young people’s health and wellbeing.
Australia has set December 10, 2025 as the date when minimum age obligations commence, and the results of public consultations have now been released. The consultation focused on the eSafety Commissioner’s implementation of the 2024 Online Safety Amendment (Social Media Minimum Age) Act, rather than on the contents of the legislation itself. The summary document’s key themes touch on the need for age assurance technology that prioritizes privacy and user consent. Systems should be robust, accurate, and fair. Flexibility and scalability are noted as key advantages, as is a multi-layered approach integrated throughout the user journey. Respondents see potential in zero-knowledge proofs (ZKPs) and tokenization. Some experts and educators have reservations about biometrics, notably around privacy and the potential for bias. Privacy legislation and standards are seen as providing a baseline for age assurance practice.
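To make the tokenization idea concrete, the sketch below shows one data-minimising pattern: a hypothetical age-assurance provider checks a user’s age once and issues a signed token asserting only that the 16-year threshold is met, so the platform never receives the birthdate itself. This is an illustrative assumption, not a design from the report or the consultation; the function names (issue_age_token, verify_age_token) and the HMAC-based signing are invented for the example, and a production system would more likely use asymmetric signatures or zero-knowledge proofs.

```python
import base64
import hashlib
import hmac
import json
import time
from datetime import date

# Hypothetical shared secret between the age-assurance provider and the platform.
# HMAC keeps this sketch self-contained; real deployments would favour asymmetric
# signatures or zero-knowledge proofs so that no shared secret is needed.
SECRET_KEY = b"demo-secret-not-for-production"

def issue_age_token(birthdate, min_age=16):
    """Provider side: verify the age once, then emit a token carrying only the claim."""
    today = date.today()
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    if age < min_age:
        return None  # no token; the birthdate is never forwarded to the platform
    payload = base64.urlsafe_b64encode(
        json.dumps({"claim": f"age>={min_age}", "issued": int(time.time())}).encode()
    )
    sig = base64.urlsafe_b64encode(hmac.new(SECRET_KEY, payload, hashlib.sha256).digest())
    return (payload + b"." + sig).decode()

def verify_age_token(token, min_age=16):
    """Platform side: check the signature and the claim, learning nothing else about the user."""
    payload, _, sig = token.encode().partition(b".")
    expected = base64.urlsafe_b64encode(hmac.new(SECRET_KEY, payload, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return False
    return json.loads(base64.urlsafe_b64decode(payload))["claim"] == f"age>={min_age}"

token = issue_age_token(date(2005, 3, 1))   # an adult user in this example
print(verify_age_token(token))              # True: account creation can proceed
```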
The responses from eSafety’s National Online Safety Education Council and online safety educators show “strong support for shifting greater responsibility to platforms rather than parents and carers.”
Q: What are the main criteria for social media platforms to fall under the age restriction?
A: The main criteria for social media platforms to fall under the age restriction are: the sole purpose, or a significant purpose, of the service is to enable online social interaction between two or more end-users; the service allows end-users to link to, or interact with, some or all of the other end-users; the service allows end-users to post material on the service.
Q: What are the penalties for social media platforms that do not comply with the age restrictions?
A: Social media platforms that do not take “reasonable steps” to prevent under-16s from having accounts may face court-imposed fines of up to 150,000 penalty units for corporations, which currently equates to 49.5 million Australian dollars (about US$32.6 million).
Q: Are online gaming and standalone messaging apps excluded from the age restrictions?
A: Yes, online gaming and standalone messaging apps are excluded from the age restrictions. However, messaging platforms that incorporate social media-like functions, like enabling user interactions beyond simple messaging, could fall under the scope of age restrictions.
Q: What is the main goal of these age restrictions?
A: The main goal of these age restrictions is to protect young Australians from 'pressures and risks' that users can be exposed to while logged in to social media accounts, including design features that encourage spending more time on screens and showing content that can harm their health and wellbeing.
Q: When will the minimum age obligations commence in Australia?
A: The minimum age obligations will commence on December 10, 2025, as set by the Australian government.