AI-generated porn, including celebrity fake nudes, persists on Etsy as deepfake laws 'lag behind'

Etsy, the online retailer known for providing a platform to sell hand-made and vintage products, continues to host sellers of “deepfake” pornographic images of celebrities and random women despite the company’s efforts to clean up the site.

The proliferation of sexually explicit images generated by artificial intelligence (AI) — including depictions of celebrities — on an otherwise innocuous marketplace comes as a shock to many experts. The problem has persisted on the platform for months. 

“That sounds like a total innocuous platform for people to do this. Usually we find a lot of explicit content on Twitter, or some other particular portals for that kind of materials,” Siwei Lyu, a computer scientist and expert on machine learning and the detection of deepfakes, told Fox News Digital. “But polluting the space on such a platform [like Etsy] is quite hard to believe.”

The explicit content isn’t hidden, either. Fox News Digital found that simple searches for “ai nude” and “deepfake porn” on Etsy each yielded more than 1,000 results, while a search for “porn” returned zero results. Even without a narrowed search, some explicit AI-generated items appeared in the “You may also like” recommendations while searching for other, unrelated items.

While not all of the results showed pornographic content, many of the nude images were of entirely fabricated women created by AI. One shop even sold an e-book guide on how to create X-rated AI content.

Ninety-five photos of Margot Robbie were also being sold for $10.95 by a shop with the description “Margot Robbie Nude Photorealistic AI Celebrity Nude Art – | NSFW Ai Girl | AI Girl | Celebrity Nude” as of Thursday.

Several of the display photos were actual images of the “Wolf of Wall Street” actress, while others were fully nude AI-generated photos of her. One of the AI-generated photos of Margot Robbie depicted the actress in a sexual act. Robbie’s team did not immediately respond to Fox News Digital’s requests for comment.

Another block of AI photos for sale was labeled “Ariana Grande Photorealistic AI Nude Art – Seductive and Sensual Digital Artwork | NSFW Ai Girl | AI Girl | Cosplay.” The stack of preview photos featured suggestive images of the singer, but none showed her in the nude. Fox News Digital also reached out to Grande’s team.

After Fox News Digital requested comment from Etsy, the listings were removed for violating policies. But some of the photos had already been bought and downloaded. 

“AI, especially generative AI technologies, these days are powerful enough to create very realistic images, and a lot of people fall for that, because they do not understand this,” Lyu said. 

Lyu explained that generative AI, deep learning software that improves its output as it learns from more data, makes it “technically challenging” to identify fake images.

And now that this software has become widely available for personal use, it is more difficult for laws to catch up with potentially nefarious activity, such as the creation of celebrity deepfakes or AI-generated pornography. Lyu said the current laws “lag behind” the rapid development of AI.

Blake Klinkner, an assistant law professor at the University of North Dakota teaching cybersecurity law, has been studying the emerging intersection of law and AI. He said the First Amendment covers a host of creative liberties, and federal laws haven’t yet weeded out potentially criminal AI-generated images.

“It’s very accessible for individuals to take really any image or video or audio and run it through a software program to create a deepfake,” Klinkner told Fox News Digital.  

He added that photos of celebrities are run through AI software “which learns how to map the person’s face, or even videos, and then create these obscene, pornographic images.”

“So you’re seeing these images sold on Etsy online, and a relatively simple search can, unfortunately, lead to a lot of results of individuals selling these images,” Klinkner said. “And again, because AI is relatively affordably accessible these days, there’s not a whole lot stopping individuals from making these images and even trying to sell them.”

Klinkner added that when these cases are brought forth, judges are often unfamiliar with deepfakes and are hesitant to apply old laws to a fairly modern problem.

“They can’t really wrap their heads around a computer taking an image and creating images that look like you that really aren’t you,” he said.

Users who create such images are hard to track down, too, as many use aliases and fake photos on their profiles. Klinkner said it creates “a Whack-A-Mole situation” where if one account gets taken down, a new account can just be made.

In December, Etsy conducted a sweep of its marketplace after a Forbes report exposed 16 accounts selling explicit celebrity deepfakes. Among the victims were actresses Olivia Munn and Jenna Ortega. Etsy removed the accounts.

“We are deeply committed to the safety of our marketplace and community. While some mature content is allowed on Etsy, we have long prohibited pornography and we closely monitor our marketplace to identify and remove content that violates our policies,” Alice Wu, head of trust and safety at Etsy, told Fox News Digital in a statement. “Evolving our policies and expanding our enforcement efforts related to mature content — including in emerging areas like AI deepfakes — is a key priority for Etsy this year.”

But thousands of pornographic images — accessible on the website by anyone regardless of age — remain easily available on the platform. 

According to Etsy’s rules, pornography “of any sort” is prohibited, while “mature content is restricted.” The company claims that it takes a “liberal approach” to what constitutes pornography. Etsy allows some nude images for artistic purposes. 

“Although pornography can be difficult to define, an item generally qualifies as pornography when it contains printed or visual material that explicitly describes or displays sex acts, sex organs, or other erotic behavior for the purpose of sexual arousal or stimulation,” the rules state.

Last month, singer Taylor Swift made headlines when deepfake images of her made the rounds on social media. A few days later, lawmakers in Congress introduced legislation to crack down on the creation of such images that appear identical to actual celebrities.

The No AI FRAUD Act, introduced by Rep. Maria Salazar, R-Fla., last month, would impose penalties on users who create generative AI images to harm individuals, whether they are public figures or not.

Only a few states have laws on the books penalizing nonconsensual deepfake porn: Georgia, Hawaii, Texas and Virginia. In California and Illinois, victims can sue perpetrators who create explicit images in their likeness. 

Andrew W. Torrance, a distinguished professor at the University of Kansas Law School, told Fox News Digital in an interview that while there is protection under the First Amendment for creating images or videos of others, particularly public figures, there are limitations. Parody is protected, but if the depiction is too realistic, it may infringe upon rights such as privacy or the right of publicity, especially for celebrities, or amount to libel or fraud.

“A celebrity such as Margot Robbie, in California . . . and some other states, has special rights to her public image, because she’s a celebrity,” Torrance told Fox News Digital. “And when you meddle with that public image, when you start to make her look the way that she doesn’t want to look or sound, or espouse opinions that she doesn’t actually have, you can start to infringe on this right of publicity, and she might have a right to sue to prevent you from doing that and to take it down.”

“I think this is why things are coming to a head — because we’ve reached a point where it truly is difficult to tell between the generated and the real, and the amount of damage that is possible by fooling a large number of people is tremendous,” he said. 

by FOXNews