Women’s bodies, the first target for users of generative AI

2024-03-08 17:03:00

Hyper-sexualized virtual influencers flooding Instagram, 13-year-old girls "undressed" by their classmates, content creators repeatedly victimized by pornographic montages… Generative AI, capable among other things of creating images from a simple text prompt, has industrialized the objectification of women's bodies like never before.

With a handful of free or inexpensive tools requiring little technical knowledge, anyone can create and disseminate degrading images. This gives a multitude of individuals and companies the opportunity to turn the degradation of women's bodies into a lucrative business.

"Undress women for free," reads the pitch of the ClothOff app, launched by a Belarusian brother and sister, as The Guardian recently revealed. The app made headlines last fall in Almendralejo, in southern Spain, where around ten 14-year-old schoolgirls were "undressed" by the boys in their class using the application.

Using an algorithm, ClothOff generates a naked body from a photo of a clothed woman or girl. The body is in no way her own, but the illusion works. The mother of one of the girls sounded the alarm on social media, sparking panic among parents. Similar incidents have occurred in American high schools.

Hundreds of thousands of porn deepfakes on the web

The episode raised awareness that this form of gender-based cyberviolence can target any woman. While the phenomenon affected actresses as early as 2018 (Scarlett Johansson was among the first to raise concerns; more recently, Taylor Swift fell victim to this type of photomontage), it has since become widespread.

So much so that in the United Kingdom, distributing such images without consent is now a criminal offense. The decision followed the publication of numerous testimonies from young women who were victims of "deepfake porn", faked photos and videos of them of a sexual nature created with AI tools. All describe the same feelings of disgust, violation, shame and bewilderment. In France, an amendment to the SREN bill (aimed at securing and regulating the digital space), added at the end of 2023, made "bringing to the attention of the public or a third party" a sexual deepfake a criminal offense, punishable by three years' imprisonment and a fine of 75,000 euros.

Generative AI safeguards are of little use

According to an investigation by the daily Le Monde, their number has exploded. The newspaper cites figures from statistician Geneviève O., a specialist on the subject, who counted 276,149 deepfake porn videos accessible on the open web in the last quarter of 2023. The sites hosting them include MrDeepfake as well as mainstream porn sites such as XVideos and XNXX. The Mind group (Pornhub) claims to ban this type of video. These images also circulate on Discord servers and in Telegram conversations, sometimes in exchange for money, with payments often made in cryptocurrency.

In France, several public figures, including the journalist Salomé Saqué, the YouTuber Juju Fitcats and the influencer Lena Situations, have called attention to this phenomenon, of which they themselves have been victims. "Thank you, AIs. Not only is harassment and humiliation not decreasing on social networks, it is intensifying," wrote Salomé Saqué on X a few months ago.

Women as virtual objects

In a lengthy investigation, the American site 404 Media detailed the "supply chain" of this new industry. It often begins on CivitAI, a site where AI models can be downloaded. Some of these models are tuned to create pornographic content and are used to generate fake images of real people, even though the site prohibits this in principle. These models are also trained on images of sex workers, whose bodies are thus used without their consent.

"Sex workers and women are already dehumanized. Promoting a non-human archetype of a woman that replaces their work and satisfies a representation of what women should be for men? That only further fuels the argument that women are not human," summarizes Fiona Mae, a content creator on the OnlyFans platform.

Alongside this phenomenon, another is emerging that also contributes to the commodification of women's bodies: the rise on social networks (mainly TikTok and Instagram) of hyper-sexualized virtual influencers, themselves created with AI tools. Far from the "body positivity" found elsewhere on these networks, these imaginary women endlessly reproduce the same stereotypes: huge breasts, thin waists, skin that is smooth and usually white, and long hair. Their posts consist mainly of suggestive photos, known on social networks as "thirst traps" (photos meant to stir up desire).

New unattainable beauty standards

Behind these profiles are generally men. Their creations, aimed mainly at other men, are a source of income: beyond the freely accessible profiles, these accounts often offer paid content, notably on Fanvue, a platform similar to OnlyFans that hosts pornographic content. Some virtual influencers also offer chat services, a sort of Minitel Rose 3.0, automatically powered by large language models.

To grow these profiles, web entrepreneurs trade advice on YouTube and Discord, where countless tutorials explain how to create and monetize an AI influencer. To produce more realistic videos, some hire models as freelancers through platforms, or lift footage directly from the accounts of real influencers, then blend this base video with the image of their cyber-doll.

Beyond exploiting other women's work to further objectify the female body, these images convey unattainable beauty standards. In the United States, 12,000 parents signed a petition last January asking TikTok to label virtual influencers' content more clearly, arguing that these perfect bodies and faces perpetuate unrealistic standards and fuel body dysmorphia in children and adolescents.