Fake news? – ACMA to receive new powers to hold digital platforms to account

28 June 2023
Sinead Lynch, Partner, Sydney; Adam Walker, Partner, Melbourne

The Australian Government has set its sights on combatting the growing trend of misinformation and disinformation on digital platforms, including social media channels, with the release of the draft Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2023 (Draft Bill) last Sunday. If implemented as proposed, the Australian Communications and Media Authority (ACMA) would be granted new and enhanced powers to combat harmful misinformation and disinformation online, and to impose significant financial penalties for non-compliance with the new restrictions.

The Draft Bill follows the Australian Competition and Consumer Commission's (ACCC) Digital Platforms Inquiry in 2019, which highlighted the significant risks posed by the ‘infodemic’ of misinformation and disinformation shared on digital platforms, and the subsequent introduction of the Government-requested voluntary code of conduct for disinformation and news quality by certain industry players.

Defining Digital Platforms

The Draft Bill proposes substantial amendments to the Broadcasting Services Act 1992 (Cth) – legislation administered by ACMA – to insert a new schedule of provisions relating to digital platform services.

The Draft Bill takes a three-layered approach to defining the ‘digital platform services’ that fall within the scope of its powers, leaving room for potential future legislative reform relating to digital platforms.

These services are broadly defined, referring to digital services that:[1]

  1. collate and present content from a range of online sources (content aggregation services);
  2. enable online interaction between multiple end-users (connective media services);
  3. provide audio-visual or moving visual content to end-users (media sharing services); and
  4. are otherwise specified by the Minister.

As a result, a significant category of broad digital platforms and digital platform service providers will be in scope – ranging from social media channels to search engine sites and peer-to-peer marketplaces. However, digital services will not be captured to the extent that they are internet, SMS or MMS service providers.[2]

Defining Misinformation and Disinformation

Although the terms misinformation and disinformation are often used interchangeably, the Draft Bill clarifies that ‘intent’ is key:

  • ‘misinformation’ is online content that is false, misleading or deceptive, that is shared or created without an intent to deceive, but that is reasonably likely to cause or contribute to serious harm;[3] and
  • ‘disinformation’ is a subset of misinformation that is disseminated deliberately or with an intent to deceive or cause serious harm.[4]

In seeking to balance freedom of expression with the need to address online harm, the Draft Bill outlines a number of exceptions, including:

  1. content produced in good faith for entertainment, parody or satire;
  2. professional news content and authorised electoral content;
  3. content authorised by the Commonwealth, State or Territory governments; and
  4. content produced by or for accredited education providers.

A shift from self-regulation

Digital platforms currently self-manage approaches to misinformation or disinformation. This may take the form of collective industry codes of practice, such as the DIGI disinformation code of practice – a code that major technology companies such as Microsoft, Twitter, TikTok and Google have signed up to in recognition of their ‘role as important actors within the Australian information ecosystem’.

The Draft Bill gives ACMA the power to step in where industry-led self-regulation is determined to be inadequate, or where it fails to curb the growing trend of misinformation and disinformation. Specifically, the proposed powers would enable ACMA to:

  1. gather information from digital platform providers, or require them to keep records, about matters regarding misinformation and disinformation;[5]
  2. publish information on its website relating to misinformation or disinformation regulation, measures to combat this issue, and the prevalence of such content – both on individual platforms and at an industry level;[6]
  3. request the industry develop ‘misinformation codes’ – codes of practice covering measures to combat misinformation and disinformation (that ACMA could then register and enforce); and[7]
  4. create and enforce ‘misinformation standards’ – industry standards (a stronger form of regulation than a code of practice) where ACMA deems a code of practice is ineffective.[8]

Notably, ACMA will not have the power to request specific content or posts to be removed from digital platforms or have a role in determining what is considered truthful. Digital platforms continue to be responsible for the content they host and promote to users.

This seeks to recognise the challenging balance between the desire to ensure free speech online, the role of digital platform providers in determining, and being responsible for, the quality and nature of content on their own platforms, and the safety risks posed by certain forms of online content – such as those already regulated by the eSafety Commissioner. Further, ACMA will not be able to use its powers in relation to private messages[9] – save for, perhaps, its information gathering powers to ensure service providers collect information about key risks.

Nonetheless, it is clear that the proposed legislation is designed to reserve the ability for ACMA to force platforms into line where self-regulatory codes and practices have failed.

Serious harm

For misinformation to be covered by the Draft Bill, it must be ‘reasonably likely that it would cause or contribute to serious harm’. For harm to be serious, it is intended that it must have severe and wide-reaching impacts on Australians. Examples provided include inciting hatred, vandalism of critical communications infrastructure, serious financial or economic harm, and serious harm to the health of Australians. The provisions appear to be focused on misinformation shared socially rather than professionally – for example, conspiracy theories – as opposed to information that is accidentally incorrect despite a publisher’s best intentions.


Digital platforms that do not comply will face substantial penalties – up to the greater of AUD$6.88 million or 5% of global turnover for corporations (in recognition of the size of digital service providers), and up to AUD$1.38 million for individuals.[10] This is in addition to warnings, remedial directions and other ‘softer’ remedies available at ACMA’s discretion.

What’s Next?

The Draft Bill is currently open for consultation until 6 August 2023, with the Government seeking industry feedback on the scope and complexity of the amendments.

We expect further information regarding these amendments to be released following the consultation period, but note they will likely fit within a suite of broader regulatory interventions regarding large digital service providers, including the News Media and Digital Platforms Mandatory Bargaining Code and the ACCC’s ongoing Digital Platform Services Inquiry. Please do get in touch if you would like to know more or to contribute to the consultation.


Authored by:

Sinead Lynch, Partner
Adam Walker, Partner
Chris Girardi, Lawyer

[1] Sch 9, s 4(1).

[2] Sch 9, s 4(1)(e)-(f).

[3] Sch 9, s 7(1).

[4] Sch 9, s 7(2).

[5] Sch 9, ss 14-14.

[6] Sch 9, ss 25-28.

[7] Sch 9, ss 29-44.

[8] Sch 9, ss 45-56.

[9] Sch 9, ss 14(3), 19(4), 34.

[10] Proposed s 205F(5H), inserted into the Broadcasting Services Act 1992 (Cth).

This update does not constitute legal advice and should not be relied upon as such. It is intended only to provide a summary and general overview on matters of interest and it is not intended to be comprehensive. You should seek legal or other professional advice before acting or relying on any of the content.
