Australia’s social media ban for under-16s
16 September 2025 · 1 min read

Australia’s social media ban for under-16s will take effect in December 2025. Here, we break down how the policy will work, and summarise the regulatory guidance released by the eSafety Commissioner.
Key Facts
- The social media ban for under-16s will come into effect on 10 December 2025.
- Platforms must take “reasonable steps” to remove underage accounts; fines of up to $49.5 million for breaches will apply.
- Polling in late 2024 showed 77 per cent of Australians support the ban.
Policy Landscape
How did this policy come about?
In May 2024, the Federal Government convened a Joint Parliamentary Select Committee to examine social media’s effects. The Committee’s final report, handed down in November 2024, illuminated the dangers of social media for young Australians. It provided 12 recommendations, but did not include any recommendation for an age ban.
However, public sentiment, fuelled by high-profile media and advocacy campaigns, pushed the issue to the centre of political debate. These campaigns linked teenage mental health crises to social media exposure and garnered over 127,000 petition signatures.
On 29 November 2024, with bi-partisan support, the Australian Government amended the Online Safety Act to ban children younger than 16 years of age from having social media accounts.
Polling at the time showed that this was a widely popular policy, with 77 per cent of Australians supporting it.
Breaking Down the Ban
Who is affected?
Children under 16 who are ordinarily resident in Australia will be banned from holding accounts on “age-restricted social media platforms”.
What are “age-restricted social media platforms”?
Platforms whose significant purpose is to enable online social interaction between two or more end-users. This means platforms including TikTok, Snapchat, X, Instagram, Facebook and YouTube will be required to take “reasonable steps” to prevent children younger than 16 from creating or holding an account.
What are “reasonable steps”? How will platforms enforce the ban?
Platforms must use age assurance methods (not just self-declaration) to identify underage users. This might include age estimation through facial analysis, age inference through behaviour and activity patterns, or age verification by checking date of birth against trusted sources.
If an underage account is removed, platforms must prevent the user from immediately creating a new account (e.g. by blocking their device, email, or phone number).
Also, platforms must not market to under-16s or make their services easily discoverable by them.
Compliance
The eSafety Commissioner will monitor and enforce compliance.
Platforms must keep records of their processes and be able to demonstrate they took reasonable steps.
Penalties for non-compliance can be up to $49.5 million for social media platforms.

Challenges
1. Vagueness
The social media ban hinges on an undefined measure of “reasonable steps”.
The eSafety Commissioner’s Social Media Minimum Age Regulatory Guidance highlights that: “there is no one-size fits all approach for what constitutes the taking of reasonable steps… providers are required to make their own determination of what steps to take, and, if asked, to demonstrate to eSafety that those steps were reasonable in the circumstances”.
This flexibility may help accommodate different platforms, but it also creates ambiguity.
Companies do not know the minimum baseline for compliance and could take divergent approaches, leading to uneven enforcement and possible legal disputes.
The lack of a prescriptive standard also makes it easier for platforms to do the bare minimum, potentially without delivering substantive cultural change.
This vagueness also gives the eSafety Commissioner considerable discretion. The regulator may be forced to make judgement calls on whether a company’s approach amounts to compliance. That, in turn, creates a risk that companies will focus more on paperwork than performance, producing detailed compliance reports to prove they have processes in place rather than ensuring those processes actually work.
2. How do you measure success?
Another major challenge lies in the difficulty of measuring whether the law is actually working.
It is unknown how many children will access social media through a VPN (a virtual private network), for example, or use an older person’s account. Young people are already adept at bypassing online restrictions, so the official number of underage accounts removed may say little about the actual level of underage social media use.
This raises broader questions about how success will be defined. The Government has framed the ban as a way of improving children’s mental health, but it will be extremely difficult to establish a causal link between the ban and improvements in wellbeing.
Screen time may remain relatively constant if children simply shift from social media to other online activities, such as online gaming. Ultimately, this is a challenge for the Albanese Government – how will it articulate that its policy is working?
3. Unintended consequences of the ban
While the ban is intended to protect young people from harms such as cyberbullying, sexual solicitation, and exposure to harmful content, some academics, mental health groups and young person advocacy groups oppose the ban, arguing that social media is also a critical space for positive social interaction, help-seeking, and access to information.
By restricting access, the social media ban risks removing a genuine tool for help-seeking and information gathering. For example, teenagers experiencing mental health challenges may use social media to connect with support groups, access resources, or reach out to peers facing similar issues. Without alternatives, these children may be left without accessible, safe avenues to seek help independently.
At the same time, restricting official social media channels may drive children towards less-regulated corners of the internet. Research from the Oxford Internet Institute and other studies suggest that restrictive measures often displace users rather than eliminate risk. Children may turn to unmoderated platforms, gaming sites with chat functions, or international social media services where harmful content is more prevalent, and protections are weaker.
YPG’s Thoughts
Internationally, Prime Minister Albanese has been praised for introducing the Social Media Minimum Age. He is set to outline how the ban will work at the UN General Assembly in September 2025, urging other countries to adopt similar laws to protect children and warning that social media has a corrosive impact on youth globally.
The Federal Government is of course aware this is a multifaceted issue and there is no single solution when it comes to protecting the mental health of children and young people.
This is, however, the most significant policy reform in this space in decades and, arguably, is better than inaction.