Denmark to let you control the use of your voice and likeness. What about Australia?

21 July 2025
Antoine Pace, Partner, Melbourne

If your social media is dipping into uncanny valley territory, you’ve likely found a deepfake.

Elegantly described by the eSafety Commissioner, deepfakes are videos, images, or audio showing a real person “doing or saying something that they did not actually do or say”. However, unlike the fakes of yesteryear, new AI technologies are not only making it easier for people to create deepfakes (sometimes automatically), they’re making them more convincingly real as well. This has greatly expanded the potential for deepfake-powered scams, mis- and disinformation, image-based abuse, exploitation, defamation, and political interference.

Recognising these risks, the Danish government has announced that it is preparing to amend Danish copyright law to grant individuals copyright-like protections over their own face, body, and voice.  This is a substantial departure from the general global understanding that copyright subsists in some kind of ‘work’ (be it written or otherwise) and may help Danes breathe a sigh of relief that their likeness is protected.

In this article, we reflect on whether Australian copyright law is equipped to handle deepfakes and consider whether other avenues can help affected individuals and businesses seek relief.

Copyright-ing to safety

We previously looked at whether Australian copyright law could protect voices, after OpenAI found itself in hot water over similarities between its ‘Sky’ voice and that of one Scarlett Johansson – spoiler: it can’t.

Unlike the newly proposed Danish model, and as with many copyright regimes globally, Australian copyright law protects the original expression of ideas and information in a ‘material’ form.  This means that an existing video, image, or audio recording of someone’s face, body, or voice may be protected under Australian copyright law.

And so, copyright law would only prevent ‘new’ deepfake content where the deepfake infringes the copyright in an existing video, image, or audio recording. More alarmingly still, the deepfake itself could arguably amount to an ‘original expression’ in a ‘material’ form – and therefore attract copyright protection of its own, given that someone, somewhere, created it in the first place. We won’t go into the ins and outs of whether AI-created content is actually a copyright work or, if so, who owns it. Look out for an article on that sometime soon! That said, this protection might be helpful where deepfakes are put to more ‘positive’ uses – such as enhancing visual effects and storytelling in film and TV, or remembering loved ones.

The fact that copyright offers no such protection becomes immediately problematic where a deepfake is put to malicious use – be it to harm or defame the individual depicted, or to mislead others.

In any case, copyright law protects the work itself; the individuals depicted in that work gain no control over the use of their likeness.

Privacy levers

Under Australia’s Privacy Act, audio recordings, images, and videos of individuals are considered “personal information” if the individual is identified or reasonably identifiable.[1] This means that the Australian Privacy Principles (APPs) can offer a person’s likeness some protection, so long as the entity collecting, using, or disclosing that information is required to comply with the Privacy Act.

However, this may not go far enough in the context of deepfakes.  Specifically:

  1. individuals and businesses with an annual turnover below AUD 3 million are exempt from complying with the Privacy Act.[2] As a result, it’s unlikely that cyber criminals or AI hobbyists could be compelled to comply with the APPs in relation to any deepfakes they make (assuming they could be identified in the first place);
  2. people depicted in deepfakes cannot enforce the Privacy Act directly, even if they can find an appropriate defendant. This is because regulation of privacy is handled by the Office of the Australian Information Commissioner, and individuals lack a direct right of action to enforce the APPs; and
  3. other exceptions apply as well – for instance, registered political parties are not required to comply with the Privacy Act (and experience during the recent Federal election underscored that politicians often exempt themselves from other compliance requirements, such as the Spam Act).

As a result, while the Privacy Act may offer some protection against the misuse of personal information, it does little to empower people to restrict the use of their likeness – particularly given how difficult it can be to track down a perpetrator, and given the broad exemptions that also apply.

A statutory tort

There is some hope that Australia’s new statutory tort for serious interferences with privacy could empower people to take direct action against deepfake creators (or, potentially, publishers).

At the risk of oversimplifying the application of this tort, recourse here would primarily depend on:

  1. whether the deepfake invaded the individual’s privacy (such as by ‘misusing’ information relating to them – for example, their likeness) in a serious way;
  2. whether the individual reasonably expected privacy (meaning, for example, the scales may shift for public figures); and
  3. the applicability or otherwise of defences (including in respect of a ‘public interest’ test, and certain use-cases like journalism or law enforcement).

Perhaps the most important question, however, is whether someone could identify who created the deepfake in the first place in order to bring a claim against them. For example, while some deepfake content might contain metadata connecting it to someone’s IP address, this traceability can easily be obscured.

Moreover, once deepfakes are widely disseminated, it becomes increasingly difficult to identify the individual ultimately responsible for their creation – the “patient zero”, if you will. This raises the as-yet-untested question of whether a person who republishes a deepfake could be found liable under this new statutory tort, in much the same way as they could be under Australian defamation law.

Misleading or deceptive conduct

Largely beating the same drum, the Australian Consumer Law (ACL) might provide some additional relief, but only where deepfakes are used in certain commercial contexts – not in respect of the general use of an individual’s likeness.

The core protection here is the prohibition on misleading or deceptive conduct in section 18 of the ACL. While this would ordinarily prevent someone from creating a deepfake to mislead or deceive, the prohibition only applies if the making or dissemination of the deepfake was done in trade or commerce. While this is not necessarily a high hurdle, it is a hurdle nonetheless.

This may mean relief is available if, for instance, someone attempts to commercialise their deepfake-creation skills or uses deepfakes to run misleading ads or promotions. Indeed, the ACL also prohibits false or misleading endorsements and sponsorships (such as by influencers or celebrities) and false or misleading descriptions of a product’s benefits or performance.[3] However, this would not extend to the private use or distribution of deepfakes and does not grant individuals a general ability to ensure their likeness is not misused.

What’s next?

While it’s unclear whether Australia will follow Denmark’s innovative approach to protecting individuals’ likenesses, it is certainly clear that more needs to be done to ensure our regulatory regime empowers Australians to control how they are represented online.

In saying this, care should be taken to ensure any future regulation leaves room for innovation while still protecting individuals from the significant harms that can flow from the misuse of their likeness.

If you found this insight article useful and you would like to subscribe to Gadens’ updates, click here.


Authored by:
Antoine Pace, Partner
Chris Girardi, Associate
Claudia Campi, Seasonal Clerk
Charlotte Cheng, Seasonal Clerk


[1] Privacy Act 1988 (Cth) s 6.
[2] Privacy Act 1988 (Cth) s 6D.
[3] Competition and Consumer Act 2010 (Cth) Sch 2 s 29(1)(g).

This update does not constitute legal advice and should not be relied upon as such. It is intended only to provide a summary and general overview on matters of interest and it is not intended to be comprehensive. You should seek legal or other professional advice before acting or relying on any of the content.
