Taylor Swift and the estate of the late George Carlin are each at the center of separate battles in which artificial intelligence was used to recreate their likenesses.
Earlier this week, pornographic deepfake photos of Swift created by AI were widely distributed, reaching 47 million views on X and spreading to numerous other social media sites. NBC News reported that sites like X have been known to be slow to take down sexually explicit deepfakes, but in this case, the images were eventually removed because of a mass-reporting campaign spearheaded by Swift's fans.
In a statement, X said, “Our teams are actively removing all identified images and taking appropriate actions against the accounts responsible for posting them.”
Rep. Yvette Clarke said on X, “What’s happened to Taylor Swift is nothing new. For yrs, women have been targets of deepfakes [without] their consent. And [with] advancements in AI, creating deepfakes is easier & cheaper. This is an issue both sides of the aisle & even Swifties should be able to come together to solve.”
While individual states like California, Texas and Virginia have adopted laws criminalizing deepfakes, there is a growing push in Congress to establish a federal law. Last year, Rep. Joseph Morelle proposed the Preventing Deepfakes of Intimate Images Act, which would criminalize sharing deepfake pornography without consent, The Guardian reported.
Morelle said the deepfakes “can cause irrevocable emotional, financial, and reputational harm – and unfortunately, women are disproportionately impacted."
Another lawmaker, Rep. Tom Kean Jr., is introducing his own AI Labeling Act. If enacted, the proposal would require all AI-generated content to be labeled as such.
However, this is not the first time this has happened to a celebrity, and it certainly won't be the last as the unregulated technology continues to rapidly develop. Any person whose image is accessible online could become the victim of those who want to create pornographic deepfakes of them. Xochitl Gomez, the 17-year-old star of "Doctor Strange in the Multiverse of Madness," said she has been unable to remove explicit fakes of herself from the internet. Even social media influencers and online personalities have deepfake nudes circulating. And while these people may not have the clout to effect change, the collective power of Swifties has brought the issue to light.
Reviving the dead through AI
Even people who are no longer alive have been subject to AI manipulation. The estate of George Carlin, who died in 2008, is now in a legal battle against the creators of the YouTube comedy special "George Carlin: I’m Glad I’m Dead," which was released on Jan. 9. Carlin's estate is suing the creators for using artificial intelligence to copy the deceased comedian’s voice and style of humor, The Hollywood Reporter said.
The lawsuit, filed on Thursday, claims that the creators of the comedy special used Carlin's entire body of work to train an AI chatbot without consent or compensation. The suit also takes issue with the use of the comedian's voice and likeness to promote the work.
The suit also asks for the immediate removal of the special, as well as damages. Carlin's estate is one of the first to file a lawsuit on behalf of a deceased celebrity over the unlicensed use of their work and likeness to manufacture a new, AI-generated creation. Nonconsensual use of AI was one of the defining issues of the SAG-AFTRA strike in 2023.
Carlin's daughter, Kelly Carlin, said on X after the release of the special that it “stepped over a line in the world of comedy today that will surely affect dead artists and their estates now.”
Currently, there are no federal laws in the U.S. to protect the likeness of a person from being copied by AI.