
No likes before sixteen: Australia’s social media minimum age restriction

16 December 2025
Adrian Chotar, Partner, Sydney | Dudley Kneller, Partner, Melbourne | Sinead Lynch, Partner, Sydney | Michael Owens, Partner, Brisbane | Antoine Pace, Partner, Melbourne | Mitchell Wright, Partner, Canberra

On 10 December 2025, Australia became the first country to enforce a minimum age requirement for social media accounts. This landmark reform introduces the Social Media Minimum Age obligation (SMMA) under Part 4A of the Online Safety Act 2021 (Cth) (OSA), requiring platforms to take reasonable steps to prevent users under 16 from creating or maintaining accounts.[1] The reform is significant not only for its strict age limit but also for the substantial civil penalties it imposes on social media providers that fail to comply. The SMMA has drawn considerable attention both domestically and internationally, and the move is widely regarded as a global test case, with overseas regulators watching closely.[2]

The Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Bill) states that the objective of the new Part 4A in the OSA is to reduce the risk of harm to age-restricted users from certain kinds of social media platforms.[3] This article explains the scope of the law, compliance obligations, penalties, operational challenges, and why overseas regulators are watching closely.

1.     Breaking down the social media minimum age obligation

The SMMA requires social media platforms to take reasonable steps to ensure that users under 16 cannot create or maintain accounts. The obligation applies to services that meet the definition of an ‘age-restricted social media platform’ under the new section 63C, being a service where[4]:

  • the sole purpose, or a significant purpose, of the service is to enable online social interaction between two or more end-users;
  • end-users are able to link to, or interact with, other end-users;
  • end-users are able to post material on the service; and
  • material on the service is accessible to, or delivered to, end-users in Australia.

Importantly, the SMMA applies extraterritorially, and the OSA is drafted to capture providers of age-restricted social media platforms regardless of where they are based, so long as their services are accessible in Australia.

The definition is intentionally broad, but the Online Safety (Age-Restricted Social Media Platforms) Rules 2025 (Cth) allow for exclusions.[5] Excluded platforms include messaging-only services and gaming platforms, such as Discord, Messenger, Pinterest, YouTube Kids, Roblox, Steam and WhatsApp.[6] The eSafety Commissioner has also released guidance identifying platforms that currently fall within scope, but the list is non-exhaustive and focuses on services with the largest number of Australian users under 16.[7]

The onus is on social media platforms, not parents or young people, to determine whether the obligation applies and to ensure protections are in place. Providers should undertake a self-assessment to determine applicability.

2.     Penalties for non-compliance with SMMA

 

| Section of OSA[8] | Requirement | Penalty |
| --- | --- | --- |
| Section 63D | A provider of an age-restricted social media platform must take reasonable steps to prevent age-restricted users having accounts with the platform. | 30,000 penalty units for an individual (AUD 9.9 million); 150,000 penalty units for a body corporate (AUD 49.5 million) |
| Section 63DA | A provider of an age-restricted social media platform must not collect information for the purpose of complying with section 63D (or for purposes that include complying with section 63D) if the information is of a kind specified in the legislative rules. | 30,000 penalty units for an individual (AUD 9.9 million); 150,000 penalty units for a body corporate (AUD 49.5 million) |
| Section 63DB | A provider of an age-restricted social media platform must not collect government-issued identification material, or use an accredited service (within the meaning of the Digital ID Act 2024), for the purpose of complying with section 63D (or for purposes that include complying with section 63D). | 30,000 penalty units for an individual (AUD 9.9 million); 150,000 penalty units for a body corporate (AUD 49.5 million) |
| Section 63H | A person must comply with a requirement under section 63G. | 500 penalty units for an individual (AUD 165,000); 2,500 penalty units for a body corporate (AUD 825,000) |

Under section 63G, if the eSafety Commissioner believes on reasonable grounds that a person is a provider of an age-restricted social media platform and has information relevant to compliance with the SMMA, the Commissioner may issue a written notice requiring that person to provide specified information within a set period, in the manner and form required.
Enforcement powers include compliance notices, directions, and court orders under the Regulatory Powers (Standard Provisions) Act 2014 (Cth). The penalty amounts are significant, reflecting the seriousness of the harms the SMMA is intended to guard against.[9]

Importantly, there are no penalties for under-16s who access an age-restricted platform, nor for their parents or carers. eSafety has outlined that “this is about protecting young people, not punishing or isolating them.”[10] This is because eSafety does not view this legislative change as a ban but as a delay to having accounts.[11]

3.     Operational hurdles

For social media providers, the SMMA presents considerable operational challenges. Under-16s may attempt to circumvent restrictions using VPNs, fake IDs, or by migrating to unregulated spaces such as gaming platforms.[12] Platforms must ensure they have age-assurance signals in place, such as tracking account activity patterns, monitoring interaction with child-targeted content, analysing use of language, and visual checks using facial age estimation.[13] However, these measures also raise further privacy concerns.

The SMMA addresses privacy by requiring that information collected for age assurance be used only for that purpose and destroyed after use.[14] No Australian will be compelled to use government identification for age assurance, and platforms must allow use of a reasonable alternative.

4.     Global ripple effect: How other countries are responding

Australia’s SMMA is already influencing international regulatory debates. Since the announcement:

  • Malaysia has announced an under-16 social media ban, to take effect in 2026, explicitly citing Australia’s model as an influence.[15]
  • New Zealand has also announced it’s preparing similar legislation to enforce age verification measures for social media platforms.[16]
  • On 25 July 2025, new rules came into force in the United Kingdom to strengthen protections for children using social media platforms.[17] Platforms must use more reliable age verification technologies and periodically re-check users’ ages, and Ofcom oversees compliance and can investigate and fine non-compliant platforms. However, the minimum age for social media accounts in the UK remains 13.
  • The EU’s Parliamentary Committee on Culture and Education is also examining the regulation of “kidfluencers” and the commercial exploitation of minors.[18] On 26 November 2025, the European Parliament voted in favour of implementing age assurance to block people under 16 from accessing social media.[19]
  • Several European countries have introduced restrictions on minors’ use of social media, including Italy, which requires parental consent for users under 14, and France, which imposes the same requirement for those under 15.
  • In the United States, Virginia’s SB854[20] restricts under-16s’ daily social media usage time and requires verified parental consent to increase or decrease the daily limit. However, constitutional limitations remain a challenge for broader implementation in the US.

As more countries begin to follow suit, Australia’s approach is increasingly seen as a potential “blueprint” for compliance. The eSafety Commissioner has signed a joint pledge with the UK’s Ofcom and the EU’s DG CNECT to share age assurance technologies and coordinate action on children’s online safety.[21] If Australia’s implementation and enforcement of the SMMA prove effective, it could set the benchmark for regulating access to social media globally.

For overseas firms advising on social media platform usage under Australia’s OSA, the key is to implement control measures. Platforms need to map their services against the SMMA definitions and document the reasonable steps they take to comply, including any age assurance processes.[22]

No doubt it will take several months to assess the effectiveness of the SMMA and its impact on users and affected platforms alike. There is already no shortage of tech-savvy under-age users ready to circumvent the laws once implemented, and the risk of not taking reasonable steps to restrict access to social media platforms is significant. Platforms face increasing compliance costs, operational challenges, and substantial penalties if they fail to act or respond. The SMMA will have a lasting impact on the industry and on regulation in Australia and beyond. There is no better time to review your compliance strategies and services in light of these reforms.



Authored by:

Dudley Kneller, Partner
Megan Grimshaw, Graduate

 

[1] eSafety Commissioner, Social media age restrictions | eSafety Commissioner

[2] Byron Kaye and Praveen Menon, ‘Australia passes social media ban for children under 16’, Reuters (Online, 29 November 2024).

[3] Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Cth), Part 4A, Social media minimum age

[4] Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Cth), s 63C(1)(a)(iii)

[5] Including messaging, email, voice calling or video calling services; online games; services that primarily function to enable information about products or services; professional networking and professional development services; and education and health services (Online Safety (Age-Restricted Social Media Platforms) Rules 2025, r 5 (Rules)).

[6] eSafety Commissioner, Social media ‘ban’ or delay FAQ

[7] Ibid.

[8] Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Cth), Part 4A: Social media minimum age.

[9] eSafety Commissioner, Social Media Minimum Age – Fact Sheet.

[10] eSafety Commissioner, Social media ‘ban’ or delay FAQ.

[11] eSafety Commissioner, Social media age restrictions

[12] Sam Buckingham-Jones, ‘Teenagers find loopholes in Albanese’s youth social media ban’, Australian Financial Review (Online, 14 October 2025); Clare Armstrong, ‘Lemon8 and Yope put on notice as social media age ban targets’, ABC News (Online, 2 December 2025).

[13] eSafety Commissioner, Social media ‘ban’ or delay FAQ.

[14] Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Cth), s 63F

[15] Malaysia says it plans to ban social media for under-16s from 2026, Reuters (Online, 24 November 2025)

[16] Eva Corlett, New Zealand’s prime minister proposes social media ban for under-16s, The Guardian (Online, 6 May 2025).

[17] GOV.UK, What’s changing for children on social media from 25 July 2025, press release.

[18] European Parliament, REPORT on the protection of minors online (A10-0213/2025), [58].

[19] European Parliament News, ‘Children should be at least 16 to access social media, say MEPs’ (26 November 2025).

[20] SB854 Bill 2025, Virginia, USA: Consumer Data Protection Act; social media platforms, responsibilities and prohibitions to minors.

[21] eSafety Commissioner, eSafety joins forces with the European Commission’s DG CNECT and the UK’s Ofcom to strengthen global cooperation on child online safety (Media Release, 2025).

[22] Explanatory Memorandum, Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Cth)

This update does not constitute legal advice and should not be relied upon as such. It is intended only to provide a summary and general overview on matters of interest and it is not intended to be comprehensive. You should seek legal or other professional advice before acting or relying on any of the content.
