It’s too easy to trick Lensa AI into creating NSFW images • TechCrunch

Lensa has climbed the App Store hit lists with its avatar-generating AI that has artists waving the red flag. Now, there’s another reason to fly the flag, as it turns out that it’s possible – and far too easy – to use the platform to generate non-consensual soft porn.

TechCrunch has seen photo sets generated with the Lensa app that include images with clearly visible breasts and nipples, attached to recognizable people’s faces. It seemed like the kind of thing that shouldn’t have been possible, so we decided to try it ourselves. To verify that Lensa will create the images it arguably shouldn’t be creating, we generated two sets of Lensa avatars:

  • One set based on 15 photos of a well-known actor.
  • Another set based on the same 15 photos, plus an additional five photos of the same actor’s face photoshopped onto topless models.

The first set of images was consistent with the AI avatars we’ve seen Lensa generate in the past. The second set, however, was much spicier than expected. It turns out the AI takes those photoshopped images as permission to go wild, and it appears to disable an NSFW filter. Of the set of 100 images, 11 were topless photos of higher quality (or, at least, higher stylistic consistency) than the poorly edited topless photos the AI received as input.

Generating naughty images of celebrities is one thing, and as the source images we were able to find illustrate, people on the internet have long been willing to paste images together in Photoshop. Just because it’s common doesn’t mean it’s right — celebrities absolutely deserve their privacy and certainly shouldn’t be victimized by non-consensual sexualized portrayals. Until now, though, making such images look realistic required serious skill with photo editing tools, along with hours, if not days, of work.

The big turning point, and ethical nightmare, is how easily you can create hundreds of near-photorealistic AI-generated art images with no tools other than a smartphone, an app, and a few bucks.

The ease with which you can create images of anyone you can imagine (or, at least, anyone you have a handful of photos of) is terrifying. Add NSFW content into the mix, and we veer into some pretty murky territory very quickly: your friends, or some random person you met at a bar and swapped Facebook friend status with, may not have given their consent to someone generating softcore pornography of them.

It seems that if you have 10-15 “real” photos of a person and are willing to take the time to photoshop a handful of fakes, Lensa will be happy to produce a number of problematic images.

AI art generators are already churning out pornographic images by the thousands, as exemplified by Unstable Diffusion and others. These platforms, along with the unfettered proliferation of other so-called “deepfake” tools, are turning into an ethical nightmare, prompting the U.K. government to push for laws criminalizing the dissemination of non-consensual nude photos. It sounds like a good idea, but the internet is a tough place to govern at the best of times, and we collectively face a wall of legal, moral, and ethical dilemmas.

We’ve reached out to Prisma Labs, the maker of Lensa AI, for comment, and will update the story when we hear back.
