Artificial Intelligence “Stealing” Artists’ Voices. The Death Knell of an Industry?

This article explores what actors and singers can do if they find their voice has been cloned and used without their permission, and the legal basis for such an action.

Background

OpenAI recently launched Sky, a voice for its AI chatbot, ChatGPT. The voice sounded remarkably similar to that of Scarlett Johansson, who had previously voiced an artificial intelligence (“AI”) assistant in the award-winning film “Her”.

OpenAI approached Johansson several times before Sky was launched, but she refused to give permission for her voice to be used. OpenAI nevertheless released Sky with a voice sounding very like the actor’s. It was so similar, in fact, that friends and family of Johansson, as well as OpenAI users, could not tell the difference between them.

OpenAI has denied using samples of Johansson’s voice to produce the voice for Sky; however, it has refused to release the identity of the actor it says was used, “…to protect their privacy”.

Johansson has not yet initiated proceedings. Other actors have had their voices mimicked by producers (using AI) for commercial products they have no connection with and, in some cases, no knowledge of.

The Sky case will be dealt with under US law, but there are cases in England and Wales where actors and voice-over artists have had their voices mimicked or cloned by AI and then exploited for commercial purposes. This new threat is highly distressing to voice-over actors, who have already found AI being deployed by the likes of the BBC to replace them entirely.

Does the law offer protection to artists?

Passing Off

Currently, there is no statute in England and Wales prohibiting someone from replicating your brand or image. The law has been developed over many years through case law, so applying it to facts where an AI tool has cloned an artist’s voice will require some creativity. While the US has introduced legislation to protect artists’ voices and likenesses from unauthorized use by AI, the UK has not. Notably, AI-generated work can itself attract copyright protection: section 178 of the Copyright, Designs and Patents Act 1988 defines a “computer-generated” work as one generated by computer in circumstances such “…that there is no human author of the work”.

The tort of passing off is likely to be the strongest cause of action in England and Wales to prevent companies from cloning artists’ voices (on the basis that the sound recordings used to train the generative AI will probably be owned by a third party). Passing off is a civil wrong that occurs when a trader misrepresents their goods/services as those of another. In the case of voice actors, the voice is the goods/services; the defendant-trader would likely be a production company or videogame developer; and the misrepresentation would be the production company’s claim that the voice is the artist’s, despite no consent or authorization having been obtained and the voice having been cloned with AI.

There are three elements to a claim (and all must be established):

1. Goodwill

As an actor, you must have goodwill as a voice artist, that is, a reputation for having a distinct voice that brings value to a production. Take Stephen Fry as an example. He recently had his voice replicated from his narration of the Harry Potter audiobooks. His voice is well known, instantly recognizable, and will bring an audience to a production. Goodwill can be the hardest element of a passing off claim to establish, but given Fry’s fame it would be relatively easy to substantiate here.

2. Misrepresentation

This usually occurs when a defendant-trader represents that their goods/services are those of the claimant-trader. As set out above, in the case of voice actors, the voice is the “goods/services”.

The misrepresentation can be express or implied. For actors with particularly distinctive voices, use of the voice alone could amount to an implied misrepresentation. If the production company misusing the actor’s voice credits that actor in the film, and advertises that the production includes the actor’s narration, this would be an express misrepresentation.

3. Damage

This can be damage to the artist’s reputation (goodwill), or loss of income as a consequence of the cloning (lost sales or contracts). The damage must flow from the misrepresentation.

Returning to the Stephen Fry example: if the AI-cloned narration were shoddy and associated with a poor-quality production, then viewers unaware that the voice-over was cloned might think less of Fry as a performer, with further implications for his career.

Remedies

• Damages for harm to reputation and/or loss of profit.
• An account of the defendant’s profits, so you may receive a share of the ill-gotten gains made from using your voice without permission.
• An order for the delivery up or destruction of the infringing articles or products.
• Injunctive relief.

An injunction is a court order prohibiting a person from taking a particular action, or requiring them to take one. You can also apply for an interim injunction, before proceedings are issued, if swift action is needed. In a voice-cloning matter, the relief sought could be the removal of the media featuring your voice. Alternatively, it could be that your voice is replaced immediately. Helpfully, in video games this can be achieved via software updates, as most online games are regularly serviced by programmers to keep them free of coding errors.

The Videogame Industry

Voice-over artists in the videogame sector, including artists credited in popular games such as Baldur’s Gate 3 and Starfield, have found their voices cloned and uploaded to multiple websites by third parties, without their consent. While some of these websites have removed the cloned voices when challenged, others have refused.

Some artists will be protected from such abuse by their unions, but lesser-known actors without representation will find that the only way to assert their rights and prevent further abuse is to engage lawyers.

Costs

If there is a claim, and pre-action correspondence has failed to recover any losses or secure any relief (including the removal of the offending works), then the claim will be heard in the Intellectual Property Enterprise Court (part of the High Court). The court issue fee will be based on the amount being claimed:

Claim amount up to        Court fee
£300                      £35
£500                      £50
£1,000                    £70
£1,500                    £80
£3,000                    £115
£5,000                    £205
£10,000                   £455
£200,000                  5% of the amount claimed
£500,000                  £10,000

 

If the amount claimed is less than £10,000, the claim will be allocated to the small claims track, and new legislation has made mediation mandatory. This will incur fees for a mediator, as well as for any legal representation present.
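As an illustration of how the fee bands above work in practice, here is a minimal sketch in Python that estimates the issue fee for a given claim amount. The figures and thresholds are as quoted in the table and should always be checked against the current HM Courts & Tribunals Service fee schedule before being relied upon; the function name and structure are illustrative only.

```python
# Estimate the court issue fee for a money claim, using the fee bands
# quoted in the table above. Illustrative only; check the current
# HM Courts & Tribunals Service fee schedule before relying on figures.

FEE_BANDS = [
    (300, 35),
    (500, 50),
    (1_000, 70),
    (1_500, 80),
    (3_000, 115),
    (5_000, 205),
    (10_000, 455),
]

def issue_fee(claim_amount: float) -> float:
    """Return the estimated issue fee in pounds for a given claim amount."""
    for upper_limit, fee in FEE_BANDS:
        if claim_amount <= upper_limit:
            return float(fee)
    if claim_amount <= 200_000:
        return claim_amount * 0.05   # 5% of the amount claimed
    return 10_000.0                  # flat fee for larger claims

# Example: a £50,000 claim attracts an estimated issue fee of £2,500
# (5% of the amount claimed).
print(issue_fee(50_000))  # 2500.0
```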

 

Jurisdiction

If you are a voice actor/artist, and your voice has been cloned and used for commercial purposes by a company (such as a production company/videogame developer), it is worth confirming where the company is based.

If the company is based in America, you may have more clearly defined causes of action. America has the right of publicity, copyright protections and, in certain states, biometric privacy rights. You should seek independent legal advice from an attorney qualified in the relevant state.

What you can do

The use of AI can help companies produce content at lower cost (at the risk of not owning the copyright/IP); however, it is also being abused to deprive artists of their moral rights and potential earnings. In the long term, if actors want their rights protected from the developments of AI, they must litigate. If you believe your voice has been stolen and you require assistance enforcing your rights and recovering compensation, seek legal guidance.

 

Griffin Law is a dispute resolution firm comprising innovative, proactive, tenacious and commercially-minded lawyers. We pride ourselves on our close client relationships, which are uniquely enhanced by our transparent fee guarantee and a commitment to share the risks of litigation.  For more details of our services please email justice@griffin.law or call 01732 52 59 23.

GRIFFIN LAW – TRANSPARENT FEES. TENACIOUS LAWYERS. TRUSTED PARTNERS.

Nothing in this document constitutes any form of legal advice upon which any person can place any form of reliance of any kind whatsoever. We expressly disclaim, and you hereby irrevocably agree to waive, all or any liability of any kind whatsoever, whether in contract, tort or otherwise, to you or any other person who may read or otherwise come to learn of anything covered or referred to in this document. In the event that you wish to take any action in connection with the subject matter of this document, you should obtain legal advice before doing so.

 
