The good and the bad of Lensa’s AI portraits

Lensa is an AI-powered photo editing app that has been downloaded over 500,000 times and has risen to the top of app stores around the globe. Although it has been around since 2018, its Magic Avatars feature is what made it a global social media hit. If you’ve been on Twitter or Instagram in the past few weeks, you’ve probably seen its AI-generated portraits in a variety of art styles.

To create its Magic Avatars, Lensa relies on Stable Diffusion (which we’ve covered before). Users upload between 10 and 20 headshots through the iOS or Android app, and Lensa fine-tunes a customized version of Stable Diffusion’s image generation model for each of them. That personalized model can then churn out dozens of images, and it does so just well enough to be impressive. Magic Avatars can only be purchased in packs of 50, 100, or 200, priced between $3.99 and $5.99.
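Lensa hasn’t published its pipeline, but the general recipe described above—fine-tune a copy of Stable Diffusion on a user’s headshots, then sample it with stylized prompts—can be sketched with Hugging Face’s diffusers library. Everything below is an assumption for illustration: the local model path stands in for the output of a DreamBooth-style fine-tuning run, and “sks person” is a common (not Lensa-specific) trigger token for the fine-tuned subject.

```python
# A minimal sketch, assuming a Stable Diffusion checkpoint already
# fine-tuned on one user's 10-20 headshots (e.g. via a DreamBooth-style
# script). The model path, trigger token, and prompts are illustrative,
# not Lensa's actual implementation.
import torch
from diffusers import StableDiffusionPipeline

# Load the hypothetical per-user fine-tuned weights.
pipe = StableDiffusionPipeline.from_pretrained(
    "./finetuned-user-model",  # assumed output of a fine-tuning run
    torch_dtype=torch.float16,
).to("cuda")

# Sample a few stylized portraits of the fine-tuned subject.
styles = ["fantasy oil painting", "cyberpunk illustration", "anime portrait"]
for style in styles:
    images = pipe(
        prompt=f"a portrait of sks person, {style}, highly detailed",
        num_images_per_prompt=4,
        guidance_scale=7.5,
    ).images
    for i, img in enumerate(images):
        img.save(f"avatar_{style.replace(' ', '_')}_{i}.png")
```

A 50- or 200-image avatar pack would presumably just be this sampling loop scaled up across more styles and random seeds.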

Lensa’s Magic Avatars don’t come without artifacts, though. The AI model can produce images that look more like abstract art, or monsters, than a person. Small details, like the shapes of eyes or fingers, are more likely to come out wrong than the position of a person’s nose or mouth.

And like most AI generators, Lensa’s creations are not free of biases around gender, race, and other factors. In an article for The Cut, Mia Mercado, who is half Filipina and half white, asked: “Why do all my AI avatars have huge boobs?”

[Related: OpenAI and Shutterstock have found a solution to the AI art ownership problem.]

Writing for MIT Technology Review, Melissa Heikkilä, who is also of Asian heritage, calls her avatars “cartoonishly pornified.” She generated 100 portraits, 16 of which were topless and 14 of which put her “in extremely skimpy clothing and overtly sexualized poses.” Other AI image generators built on Stable Diffusion have likewise produced some incredibly questionable images of people of color.

This issue is so common that Prisma Labs, the company behind Lensa, addresses it in an FAQ on its website, under the question: “Why do female users tend towards a more sexualized look?” The short answer: “Occasional sexualization can be observed across all genders, though in different ways.”

According to the FAQ, the problem can be traced back to the initial dataset Stable Diffusion was trained on: LAION-5B, a collection of nearly 6 billion unfiltered image/text pairs gathered from all over the internet. Stable Diffusion’s maker, Stability AI, has openly acknowledged that the model can reproduce certain societal biases and produce unsafe content. That includes sexualized images of women as well as generic, stereotypical, or racist images of people of color.

Both Prisma Labs and Stability AI claim they have taken steps to reduce the number of NSFW outputs their respective systems produce, but these AI models are black boxes: even their programmers may not fully understand all the associations the models make. Short of training an AI model on an image database that is somehow free of bias, there will always be some societal biases in the outputs of AI generators.
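Neither company details its mitigations, but one concrete, public mechanism ships with the open-source Stable Diffusion release: the diffusers implementation runs every generated image through a post-hoc safety checker and blacks out anything it classifies as NSFW. Here is a minimal sketch of how that surfaces in code; the model ID and prompt are just examples.

```python
# A minimal sketch of the post-hoc NSFW filter bundled with the
# open-source Stable Diffusion release in Hugging Face diffusers.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

result = pipe("a portrait photo of a person, studio lighting")

# nsfw_content_detected holds one boolean per generated image; flagged
# outputs have already been replaced with black frames by the checker.
for i, (img, flagged) in enumerate(
    zip(result.images, result.nsfw_content_detected)
):
    if flagged:
        print(f"image {i} was flagged NSFW and blacked out")
    else:
        img.save(f"portrait_{i}.png")
```

Notably, this filter acts on the model’s outputs rather than its learned associations—which is exactly the black-box problem described above: the biases baked in during training remain untouched.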

And that is assuming everyone is acting in good faith. TechCrunch was able to use Lensa to create NSFW images of a well-known actor by uploading a mix of authentic SFW images of the actor and photoshopped images of his face on a topless model. The resulting 100 images included “topless photos with higher quality (or at least with a higher stylistic consistency) than the poorly edited topless photos the AI was given as input,” even though this is against Lensa’s terms of service.

The most promising thing about these AI generators is that they are improving at an incredible rate. Marginalized groups are clearly still seeing societal biases in the outputs, but if these models keep evolving and their developers remain open to feedback, there is reason for optimism that they can do more than just reflect the worst of the internet.