Published Date: 9/5/2025
The eSafety Commissioner in Australia has released documents summarizing the results of public consultations on how best to implement social media age restrictions. These documents also provide guidance to help companies assess if they are likely to be covered under the new restrictions.
The publication follows the release of the final report from the Australian Government’s Age Assurance Technology Trial. According to a post on eSafety’s LinkedIn account, this report will go “hand-in-hand” with ongoing formal engagement with firms leading up to December 10, when minimum age obligations commence.
“These resources will inform our compliance and enforcement activities, including the development of regulatory guidance to be released shortly,” the eSafety Commissioner stated.
Fairness, Flexibility, Interoperability, and Inclusivity Deemed Important
The consultation process engaged a diverse range of stakeholders, including online service providers, age assurance vendors, subject matter experts on the technical application of age assurance measures, international regulators and government representatives, and representatives of civil society, academia, and the education sector. Groups representing children, young people, parents, and carers based in Australia were given priority.
The consultation focused on eSafety’s implementation of the Online Safety Amendment (Social Media Minimum Age) Act 2024, rather than on the contents of the legislation itself. The summary document lays out findings by group, and while there is plenty of information to digest, key themes recur.
The need for age assurance technology that prioritizes privacy and user consent recurs across groups, along with calls for systems that are robust, accurate, and fair. Flexibility and scalability are cited as key advantages, as is a multi-layered approach integrated throughout the user journey. Interoperability and inclusivity are also emphasized, and several respondents see potential in zero-knowledge proofs (ZKPs) and tokenization. Some experts and educators voice reservations about biometrics, notably around privacy and the potential for bias. Privacy legislation and standards are seen as providing a baseline on which to build age assurance practices, and respondents broadly agree that friction should be minimized.
The responses from eSafety’s National Online Safety Education Council and online safety educators show strong support for shifting greater responsibility to platforms rather than parents and carers. However, they also stress the importance of digital literacy.
Participants widely acknowledged that workarounds such as VPNs exist. Some suggested that evaluation should therefore consider whether age assurance is implemented in ways that uphold privacy, equity, and children’s rights, rather than simply counting the number of age checks that block access to harmful content.
AVPA Identifies Three Options for Enforcement
The Age Verification Providers Association (AVPA) participated in the consultation, and a post on its LinkedIn page outlines the paths now available to eSafety Commissioner Julie Inman-Grant in enforcing the law.
AVPA says Inman-Grant has three broad options, ranging from light to demanding. A light touch would leave it to platforms to use their existing marketing and data collection tools to flag and remove users who are likely to be underage.
The middle ground would “require specific age assurance processes when accounts are opened, but allow for a margin of error which would permit the most accurate estimation options to be relied upon,” acknowledging the need to set a buffer age for facial age estimation products.
What AVPA calls the “highly effective” approach is to “ensure the minimum age is applied exactly, so there is no access until the day you turn 16.” Again, in the case of estimation systems, that could mean setting a threshold of up to 21, to allow a buffer; the exact number, AVPA says, “will vary with the certified accuracy of the algorithm chosen.” Alternate methods, such as document checks, must be available for those who claim an error in estimation. “We also argue that platforms must offer age assurance of last resort – professional vouching to ensure inclusivity and accessibility.”
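The buffer logic AVPA describes can be sketched in code. In this illustrative example, the buffer value, function name, and fallback steps are hypothetical; in practice the buffer would be derived from the certified accuracy of the chosen estimation algorithm, and the fallback chain would include document checks and, as a last resort, professional vouching.

```python
# Illustrative sketch of AVPA's "buffer age" idea for facial age
# estimation. The buffer value and function names are hypothetical,
# not a specification of any real platform's process.

MINIMUM_AGE = 16       # legal minimum under the Act
ESTIMATION_BUFFER = 5  # would vary with the algorithm's certified accuracy

def access_decision(estimated_age: float) -> str:
    """Decide the next sign-up step from an estimated age alone."""
    if estimated_age >= MINIMUM_AGE + ESTIMATION_BUFFER:
        # Comfortably above the buffered threshold: estimation suffices.
        return "allow"
    # Below the buffered threshold the estimate is not decisive: the user
    # may be 16+ but within the margin of error, or may dispute the
    # estimate, so an alternate method (e.g. a document check, or
    # vouching as a last resort) is offered instead of a hard block.
    return "offer_document_check"

print(access_decision(24.0))  # allow
print(access_decision(17.5))  # offer_document_check
```

The design choice here mirrors AVPA's point: the higher the threshold relative to the minimum age, the fewer underage users slip through, but the more legitimate 16- and 17-year-olds must fall back to slower verification methods.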
The association also notes the importance of independent testing and certification against international standards, and suggests the current moment is “a great opportunity to introduce an effective, interoperable, double-blind tokenized ecosystem so age checks can be performed once and then used across multiple sites seamlessly”; it points to euCONSENT as an example.
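The "check once, use everywhere" idea can be sketched as follows: an age assurance provider issues a signed token asserting only that a check was passed, and a relying site verifies the signature without learning the user's identity or contacting the provider. This is a simplified illustration, not the euCONSENT design; all names are invented, and a real ecosystem would use asymmetric signatures and certified providers rather than the shared HMAC key used here for brevity.

```python
# Minimal sketch of a tokenized age-check flow. The token format, key
# handling, and function names are illustrative assumptions only; real
# schemes use PKI so that relying sites cannot forge tokens.
import base64
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key-shared-with-verifiers"  # stand-in for real PKI

def issue_token(check_passed: bool, min_age: int) -> str:
    """Provider side: sign a bare assertion; no identity is included."""
    claim = json.dumps({"over_age": min_age, "passed": check_passed})
    sig = hmac.new(SHARED_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return base64.b64encode(claim.encode()).decode() + "." + sig

def verify_token(token: str, min_age: int) -> bool:
    """Relying-site side: check signature and asserted age, nothing else."""
    claim_b64, sig = token.rsplit(".", 1)
    claim = base64.b64decode(claim_b64)
    expected = hmac.new(SHARED_KEY, claim, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered or forged token
    data = json.loads(claim)
    return data["passed"] and data["over_age"] >= min_age

token = issue_token(True, 16)   # one age check...
print(verify_token(token, 16))  # ...reusable across sites: True
```

The "double-blind" property the AVPA describes would mean the provider does not learn which sites the token is used on, and the sites do not learn who performed the check, which this signature-only flow gestures at.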
Q: What is the purpose of the age assurance technology trial in Australia?
A: The purpose of the age assurance technology trial in Australia is to test and evaluate different methods of ensuring that social media platforms are not accessed by users under the minimum age of 16, with the aim of enhancing online safety.
Q: What are the key themes that emerged from the public consultations?
A: The key themes that emerged from the public consultations include the need for fairness, flexibility, interoperability, inclusivity, and a focus on privacy and user consent in age assurance technology.
Q: What are the three enforcement options identified by the AVPA?
A: The three enforcement options identified by the AVPA are: a light touch approach, a middle ground approach requiring specific age assurance processes with a margin of error, and a highly effective approach ensuring exact age application.
Q: Why is digital literacy important in the context of age assurance?
A: Digital literacy is important in the context of age assurance because it helps users, especially children and young people, understand and navigate the online environment safely. Educators support shifting greater responsibility to platforms, but stress that this does not remove the need to equip users with digital literacy skills.
Q: What role do international standards play in age assurance practices?
A: International standards play a crucial role in age assurance practices by providing a baseline for privacy, accuracy, and fairness, which helps ensure that age verification systems are robust and reliable.