The Briefing by the IP Law Blog
Deep Dive into the NO FAKES Act
A group of senators introduced an update to the ‘No Fakes Act,’ which protects the voice and visual likeness of individuals from unauthorized AI-generated recreations. Scott Hervey and James Kachmar discuss the changes to this act on this episode of The Briefing.
Watch this episode on the Weintraub YouTube channel here.
Show Notes:
Scott:
Senators Chris Coons, Marsha Blackburn, Amy Klobuchar, and Thom Tillis introduced an update to the ‘Nurture Originals, Foster Art, and Keep Entertainment Safe Act,’ or the ‘No Fakes Act,’ which the four senators previously released last October. I’m Scott Hervey, and I’m joined today by James Kachmar, and we’re going to talk about the update to the ‘No Fakes Act’ on this installment of The Briefing.
James, welcome back to The Briefing. It’s been a while.
James:
Good to see you, Scott. Thanks for having me.
Scott:
We have a fun one today, the ‘No Fakes Act.’ The purpose and intent of the ‘No Fakes Act’ is to prevent the creation and use of a digital replica of an individual without that person’s consent. Let’s dive into how this proposed act accomplishes this and what the liabilities are for violations of the act. First and foremost, the act creates a new federal property right to authorize the use of a person’s voice or visual likeness in what’s called a digital replica. Now, a digital replica is defined in the act as a newly created, computer-generated, highly realistic electronic representation that is readily identifiable as the voice or visual likeness of an individual. Now, this right, the right to control a digital replica or to grant rights in a digital replica, survives postmortem and is transferable, licensable, and exclusive to the individual and the executors, heirs, licensees, or devisees of that individual for an initial ten years, renewable for rolling five-year periods with a cap of 70 years postmortem. As I said, this right is licensable. Interestingly, the act says that a license can only have a term of ten years.
James:
Okay, Scott, why don’t we look at what the essence of the act is? It basically creates liability for, one, the production of a digital replica without the consent of the applicable right holder, and, two, publishing, reproducing, displaying, distributing, transmitting, or otherwise making available to the public a digital replica without the consent of the applicable right holder, where such acts affect interstate commerce. It is not a defense to liability that the defendant displayed or publicly communicated a disclaimer stating that the digital replica was unauthorized or generated through artificial intelligence. Liability requires actual knowledge, acquired either through receipt of a notice from the right holder, a person authorized to act on behalf of the right holder, or an eligible plaintiff, or through the willful avoidance of actual knowledge that the material is an unauthorized digital replica.
Scott:
Now, the act allows for a private right of action by the rights holder, and it also allows for a private right of action by any other person that controls, including through a license, the right to use the rights holder’s voice or likeness. The act is not clear whether this license needs to be exclusive in order to sue for a violation of the act, as under copyright law, or whether it can be non-exclusive and still carry the right to sue.
James:
The act also allows for a private right of action for record labels in the case of a digital replica involving either a sound recording artist who has entered into a contract for their exclusive sound recording artist services, or any artist who has entered into an exclusive license to distribute or transmit one or more of their albums or works that capture their performance. This is similar to what you and I talked about some time ago regarding Tennessee’s ELVIS Act.
Scott:
Right, it is.
James:
There’s also a three-year statute of limitations for bringing lawsuits for violations of the act. The act provides for monetary relief, injunctive relief, punitive damages for a willful violation, as well as attorney’s fees.
Scott:
The act also establishes separate liability for online service providers that participate in the making of a digital replica, so think generative AI service providers, or that make a digital replica available on an online service, unless the online service has taken reasonable steps to remove or disable access to the unauthorized digital replica as soon as is technically and practically feasible after acquiring actual knowledge that the material is an unauthorized digital replica. So, similar to the DMCA, the Digital Millennium Copyright Act, the No Fakes Act establishes a notice and takedown process. Also similar to the DMCA, there is liability for false takedown notices under the No Fakes Act.
James:
That’s right, Scott. And there are some exclusions that are built into the act, which are based on recognized First Amendment protections. For instance, it’s not a violation of the act when a digital replica is used in a bona fide news, public affairs, or sports broadcast or account, if the digital replica is the subject of, or is materially relevant to the subject of, the broadcast or account.
Scott:
And similarly, it will not be a violation of the act where a digital replica is of an individual who is portrayed in a documentary or in a historical or biographical manner. Now, this can include some degree of fictionalization, unless the production or use is intended to and does, in fact, create the false impression that the work is an authentic work and that the person portrayed by the digital replica actually participated in the work, whether on camera or through a sound recording.
James:
Right. Similar to what we see in the copyright field, it is also not a violation of the act where the digital replica is used in a manner consistent with the public interest, either in bona fide commentary, criticism, scholarship, or satire and parody, or if it is used in a fleeting or negligible manner. Lastly, it will not be a violation of the act if the digital replica is used in an advertisement or commercial announcement for any of the above examples.
Scott:
That’s interesting, right? That’s a mix of right-of-publicity statutes and copyright. The bill also includes a safe harbor from liability for AI technology companies that create a technology product that creates digital replicas, unless such product is, one, primarily designed to produce one or more unauthorized digital replicas, two, has only limited commercially significant purposes or uses other than to produce an unauthorized digital replica, or, three, is marketed, advertised, or otherwise promoted by that person, or by another acting in concert with that person with that person’s knowledge, for use in producing an unauthorized digital replica. It’s going to be interesting to see if the Senate will pass this bill, given where we are now in the election cycle. Now, the Copyright Office has recently issued a report cautioning that there is an urgent need for new federal legislation to address the proliferation of deepfakes created through the use of artificial intelligence. It’s a great report, and anybody who’s interested in the subject should read it. The report analyzed the existing legal frameworks through which digital replicas can be addressed and pointed out their shortcomings. Perhaps the Senate will take this bill under consideration sooner rather than later.
James:
Right, Scott. We’ll just have to wait and see what the Senate does with the No Fakes Act update, and maybe you and I can get back together for another podcast here soon on this topic.
Scott:
That’s all for today’s episode of The Briefing. Thank you, James, for joining us, and thank you to our listeners and viewers for tuning in. We hope you found this episode to be informative and enjoyable. If you did, please remember to subscribe, leave us a review, and share this episode with your friends and colleagues. If you have any questions about the topics we covered today, please leave us a comment.