A bipartisan bill seeks to create a federal law to protect actors, musicians, and other performers from unauthorized digital replicas of their faces or voices.
The Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2023 — or the No Fakes Act — standardizes rules around using a person’s face, name, and voice. Sens. Chris Coons (D-DE), Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC) sponsored the bill.
It would prohibit the “production of a digital replica without consent of the applicable individual or rights holder” unless the replica is part of a news, public affairs, or sports broadcast, a documentary, or a biographical work. The rights would apply throughout a person’s lifetime and, for their estate, for 70 years after their death.
The bill includes an exception for using digital replicas in parodies, satire, and criticism. It also exempts commercial uses such as advertisements, as long as the ad is for a news program, documentary, or parody.
Individuals, as well as entities like a deceased person’s estate or a record label, could bring a civil action under the proposed rules. The bill also explicitly says that a disclaimer noting the digital replica was unauthorized won’t count as an effective defense.
The No Fakes Act essentially federalizes likeness laws, which vary from state to state. (Some states have no ground rules around the right of publicity at all.) New York is one of the few states that explicitly mentions digital replicas, prohibiting the use of a deceased person’s computer-generated replica in scripted works or live performances without prior authorization.
The proliferation of generative AI tools that mimic voices or create photos featuring famous people has brought new attention to likeness laws. Earlier this year, a song purporting to feature Drake and The Weeknd went viral on TikTok and then on YouTube. It turned out the track used AI-generated versions of both artists’ voices without their permission.
Some music industry insiders see likeness rules as a good way to address musicians’ concerns that their voices could be used to release AI-generated songs without their consent. However, the fragmentation of likeness laws makes protecting artists’ right of publicity across state lines difficult.
AI duplicates also became a hot-button issue after SAG-AFTRA revealed Hollywood studios proposed using digital scans of actors.
The Recording Industry Association of America (RIAA), which recently called on the US government to include AI voice cloning websites as part of its list of online piracy markets, said it welcomes the bill. “Our industry has long embraced technology and innovation, including AI, but many of the recent generative AI models infringe on rights — essentially instruments of theft rather than constructive tools aiding human creativity,” the RIAA said in an emailed statement to reporters.
Another group, the Human Artistry Campaign, said in a statement that while AI can provide tools that unlock human creativity, it can also steal copyrighted material and use artists’ names and likenesses without permission, which the group called “incredibly harmful to society.”
However, others worry the No Fakes Act only dresses current laws in new clothes. Jeremy Elman, a partner at law firm Duane Morris, said the proposed bill “does not appear to offer protections beyond existing copyright or right of publicity law, and could pose thorny issues regarding those well-established rights down the road.”
“Regulating AI is certainly at the top of the list for lawmakers these days, but they should be careful not to rush into creating a new federal IP right that may conflict with long-standing balances in the IP system,” he said.