- The Lensa app creates face-changing effects using machine learning and photos uploaded by users.
- Some users have received AI-generated edits that portray them in the nude, even though they uploaded no nude photos.
The trending Lensa app — currently the top photo app in the Apple and Google Play stores — generates artistic edits based on user-uploaded reference photos, but its machine-learning technology appears to be creating unintentional nudes of its users.
"Ok so I put my hottest 20 pics into lensa instead of just the first 20 selfies I could find & it came back with a bunch of ai-generated nudes," one user wrote on Twitter. "To be clear, NONE of the photos I submitted included nudity, which the app specifically prohibits!"
That sentiment was echoed by dozens of others, mostly women, who said the app had automatically generated sexualized or outright nude photos of them even though they had avoided not-safe-for-work reference photos in their uploads.
While Andrey Usoltsev, CEO and co-founder of Lensa's parent company, Prisma Labs, told TechCrunch such images "can't be produced accidentally" by the app, he said it could be provoked into creating nude images through "intentional misconduct," such as uploading nudes in violation of the terms of service (which prohibit uploading content that is "obscene, pornographic, indecent, lewd, suggestive" or otherwise sexualized).
Though it is unclear how often the app generates nude imagery without prompting, multiple users report this was the case for them.
"Strange thing is I didn't submit any nudes since it would go against this Lensa app's policy yet it ended up generating nudes anyway???" another user posted on Twitter.
Some users are particularly concerned about whether the app somehow accessed photos from internal storage that they never uploaded, and whether its privacy policy allows data generated by the app to be used by third parties such as Google Cloud Platform and Amazon Web Services.
"Lensa users: Did you receive a highly sexualized image in your avatar package?" one troubled user wrote on Twitter. "I received a topless, full-frontal nudity image in my package, and I'm concerned. I'm worried about whether the app accessed other images on my phone and about the rights to that image."
Usoltsev told TechCrunch the tech being used to generate the photo edits is learning as it goes and — though it has some content moderation practices — can still be outsmarted by users or act in unpredictable ways, resulting in the output of nude edits.
"We specify that the product is not intended for minors and warn users about the potential content. We also abstain from using such images in our promotional materials," Usoltsev told TechCrunch. "To enhance the work of Lensa, we are in the process of building the NSFW filter. It will effectively blur any images detected as such. It will remain at the user's sole discretion if they wish to open or save such imagery."
Representatives for Prisma Labs did not immediately respond to Insider's request for comment.