Social media age checks must be accessible

[Image: A group of teenagers sitting on the floor. Between them lie several smartphones, a tablet, and an open laptop.]


Children and young people under 16 will soon be unable to hold accounts on social media platforms like Facebook, Instagram, YouTube, and TikTok.

That’s because the Australian Government’s world-first social media age restriction legislation will officially kick in from 10 December.

From that date, social media companies will be required to introduce age restriction measures or face fines of up to $50 million for non-compliance.

What exactly is expected of them was outlined for the first time yesterday, when the eSafety Commissioner released new regulatory guidelines for impacted platforms.

What do the guidelines say?

The new guidelines set out that any age assurance systems used must be “reliable, accurate, robust, and effective”. Importantly, the guidance is principles-based, meaning it doesn’t prescribe any specific technologies or methods.

Instead, each social media platform is encouraged to take a “layered approach”, combining systems, processes, and clear communication to meet its obligations.

“Our principles-based guidance recognises that there is no one-size-fits-all solution for industry, given the diversity of platforms and technology,” said eSafety Commissioner Julie Inman Grant.

Platforms will be expected to:

  • Remove existing underage accounts, with clear communication to users
  • Prevent re-registration by underage users
  • Offer review mechanisms for users who believe they’ve been wrongly flagged
  • Minimise reliance on self-declaration
  • Monitor and continuously improve systems over time

The guidance also makes clear what platforms do not have to do.

They are not required to age-verify every user, must not force anyone to use a government ID, and do not have to store personal data from individual checks.

Accessibility, inclusivity, and fairness

One of the most important parts of the guidelines, overlooked in media coverage to date, is Section 2.3.3 on accessibility, inclusivity, and fairness.

Here, the guidelines clearly state: “Providers should ensure age assurance methods and surrounding systems and processes are accessible, inclusive, and fair.”

In other words, providers need to consider users “with diversity in appearance, abilities, and capacities” and use systems and safeguards that work for all. 

According to the guidelines, age checks in practice should:

  • Not unfairly block or exclude people
  • Offer different accessible options beyond ID documents
  • Be tested across diverse communities, including people with disability
  • Provide information in plain language, accessible formats, and multiple languages

At CYDA, we believe any age assurance measures adopted by social media platforms, whether ID-based or biometric, must be genuinely accessible. 

That means they need to be developed with and tested by people with disability, as well as digital accessibility specialists.

Disabled people are already excluded from jobs, training, and civic life on a daily basis because existing ID systems are inaccessible. Extending these flawed methods to social media risks deepening digital exclusion.

But while the Australian Government’s guidelines set out key principles and obligations around accessibility, it will be social media companies themselves who decide how they meet them. 

It is therefore essential that each platform is adequately held to account for how it prioritises accessibility and inclusion.

CYDA’s position on the social media ban

For many children and young people with disability in Australia, social media provides an accessible way to connect with peers, participate in community or advocacy, and access vital health and wellbeing information. 

While CYDA previously opposed the ban, we are now focused on how the legislation is put into action. It is vital the rights of disabled children and young people are not compromised in the process.

The Australian Government and social media companies must ensure that under-16s receive adequate supports, transition periods, and inclusive digital literacy, and that age verification systems are accessible for all ages.

Responsibility for creating safe and inclusive platforms for children and young people must ultimately rest with social media companies.

Further resources and opportunities

For more information about the social media ban and how it will be put into action, check out the following resources:

AYAC and PROJECT ROCKIT are also running several workshops on preparing for the social media ban. Register by clicking on the links below: