A Bot Has Turned Photos of Thousands of Women into Fake Nudes

Buzz Staff

Fake nude images of more than 100,000 women have been shared online, according to a report by the intelligence company Sensity.

The women's clothes were digitally removed from photos taken from their social media accounts using artificial intelligence (AI), and the resulting images were then circulated on the messaging app Telegram, according to a report by the BBC.

According to Sensity, the technology used is a deepfake bot, and many of the women whose photos have been targeted are underage. For those unaware, "deepfakes" are computer-generated images and videos in which a real person's face is superimposed onto existing photos and footage. They are usually extremely realistic, which makes them all the more perilous.

Sensity's chief executive, Giorgio Patrini, said that while deepfakes have often been used to create pornographic videos, using private images of unsuspecting individuals for fake nudes is a relatively new concept. He also warned that anyone with a public social media account, that is, anyone whose photos are publicly visible, can be an easy target.

This is how the AI-powered bot works: users send it a photo of a woman, and within a few minutes it returns a fake nude in which her clothes appear to have been removed. The BBC said it even tried out the bot, with the consent of those whose photos were used, and found that the results weren't remotely realistic.

The administrator running the service, however, has denied that the photos could be used for violence or blackmail, arguing that they are too unrealistic. The bot has been advertised on the Russian social media site VK, and most of its users are reportedly from Russia or other former Soviet countries.

In 2019, Vice reported on a similarly disturbing app named DeepNude, which could "undress" women in seconds. Users could submit a photo, and for $50 the app would return a seemingly naked image of the person. Unlike the Telegram bot's output, these images were highly realistic, though the app did not work on photos of men. Vice's article caused such outrage online that the app's makers eventually took it down.