
A shadowy AI service has transformed thousands of women’s photos into fake nudes: ‘Make fantasy a reality’

More than 100,000 photos of women have had their clothing removed by the software, including photos of girls younger than 18

October 20, 2020 at 10:28 a.m. EDT

An artificial intelligence service freely available on the Web has been used to transform more than 100,000 women’s images into nude photos without the women’s knowledge or consent, triggering fears of a new wave of damaging “deepfakes” that could be used for harassment or blackmail.

Users of the automated service can anonymously submit a photo of a clothed woman and receive an altered version with the clothing removed. The AI technology, trained on large databases of actual nude photographs, can generate fakes with seemingly lifelike accuracy, matching skin tone and swapping in breasts and genitalia where clothes once were.

The women’s faces remain clearly visible, and no labels are appended to the images to mark them as fake. Some of the original images show girls younger than 18.

The service, which allows people to place new orders through an automated “chatbot” on the encrypted messaging app Telegram, was first discovered by researchers at Sensity, an Amsterdam-based cybersecurity start-up that shared its findings with The Washington Post.

The chatbot and several other affiliated channels have been used by more than 100,000 members worldwide, the researchers found. In an internal poll, the bot’s users said roughly 63 percent of the people they wanted to undress were girls or women they knew from real life.

AI-generated videos that show a person’s face on another’s body are called “deepfakes.” They’re becoming easier to make and weaponize against women. (Video: Drew Harwell, Jhaan Elker/The Washington Post)

Giorgio Patrini, Sensity’s chief executive, said the chatbot signals a dark shift in how the technology is used, from faking images of celebrities and well-known figures to targeting unsuspecting women far from the public eye.

“The fact is that now every one of us, just by having a social media account and posting photos of ourselves and our lives publicly, we are under threat,” Patrini said in an interview. “Simply having an online persona makes us vulnerable to this kind of attack.”

The chatbot’s growth signals just how quickly the technology behind fake imagery has become ubiquitous.

Ten years ago, creating a similarly convincing fake would have taken advanced photo-editing tools and considerable skill. Even a few years ago, creating a lifelike fake nude using AI technology — such as the “deepfake” porn videos in which female celebrities, journalists and other women have been superimposed into sex scenes — required large amounts of image data and computing resources.


But with the chatbot, creating a nude rendering of someone’s body is as easy as sending an image from your phone. The service also assembles all of those newly generated fake nudes into photo galleries that are updated daily; more than 25,000 accounts have already subscribed for daily updates.

The bot’s biggest user base is in Russia, according to internal surveys, though members also come from the United States and across Europe, Asia and South America.

New users can request some of their first fake nudes for free but are encouraged to pay for further use. A beginners’ rate offers new users 100 fake photos over seven days at a price of 100 Russian rubles, or about $1.29. “Paid premium” members can request fake nude photos be created without a watermark and hidden from the public channel.

The chatbot’s administrator, whom The Post interviewed Monday through messages on Telegram, declined to give their name but defended the tool as a harmless form of sexual voyeurism and said its operators take no responsibility for the women targeted by its user base. As an allusion to its boys-will-be-boys posture, the service’s logos feature a smiling man and a woman being ogled by X-ray glasses.

But technology and legal experts argue that the software is weaponizing women’s own photographs against them, sexualizing women for a faceless group of strangers and presaging a new age of fabricated revenge porn.

Some tech giants have taken a stand against deepfakes and other “manipulated media.” But because the system’s source code has already been widely shared by online copycats, the experts see no clear way to stop similar software from creating, hosting and sharing fake nude images across the unregulated Web.

Some of the targeted women are popular entertainers or social media influencers with sizable audiences. But many of those seen in the publicly available photos produced by the bot are everyday workers, college students and other women, whose images were often taken from selfies or social media accounts on sites such as TikTok and Instagram.

Danielle Citron, a Boston University law professor who researches the online erosion of “intimate privacy,” said she has interviewed dozens of women about the experience of having real or manufactured nude images shared online. Many said they felt deep anguish over how their images had been seen and saved by online strangers — and, potentially, their co-workers and classmates.

“You’ve taken my identity and you’ve turned it into porn . . . That feels so visceral, harmful, wrong,” Citron said. “Your body is being taken and undressed without your permission, and there’s documentary evidence of it. . . . Intellectually, [you] know it hasn’t happened. But when [you] see it, it feels as if it has, and you know others won’t always know” it’s fake.

“The vulnerability that creates in how you feel about your safety in the world: Once you rip that from somebody, it’s very hard to take back,” she added.

The bot gives users advice on submitting requests, recommending that the original photos be centered at the women’s breasts and show them in underwear or a swimsuit for best results. But many of the images show women in unrevealing school attire or everyday clothes, like a T-shirt and jeans. At least one woman was pictured in a wedding dress.

One young woman had multiple photos of her submitted to the service, some of which included a fake bikini top crudely inserted on top of her normal clothes — likely an attempt to improve the bot’s performance.

The automated service, however, works only on women: Submit an image of a man — or an inanimate object — and it will be transformed to include breasts and female genitalia. (In one submitted image of a cat’s face, its eyes were replaced with what appeared to be nipples.)

The bot’s administrator, speaking in Russian, told The Post in a private chat on Monday that they didn’t take responsibility for how requesters used the software, which they argued was freely available anyway. “If a person wants to poison another, he’ll do this without us, and he’ll be the one responsible for his actions,” the administrator wrote.

The Sensity researchers counted more than 104,000 images of women altered to appear nude and shared in public channels. A website for the service suggests that number is far higher, with 687,322 “girls nuded” and 83,364 “men enjoyed.” But the administrator said that number was random and used only for advertising, because they do not keep statistics of processed photos.

The bot’s rules say it does not allow nudes to be made of underage girls. But the service’s publicly visible collections feature teenage girls, including a popular TikTok personality who is 16 years old.

The administrator said the system was designed merely to fulfill users’ fantasies and that everyone who would see the images would realize they were fakes.

“You greatly exaggerate the realness,” the administrator said. “Each photo shows a lot of pixels when zoomed-in. All it allows you to do is to make fantasy a reality, visualize and understand that it’s not real.”

The administrator also said the service had not “received a single complaint from a girl during the entire period of our work,” and attempted to shift the blame onto victims of the fakes for posting their images online.

"To work with the neural network, you need a photo in a swimsuit or with a minimum amount of clothing. A girl who puts a photo in a swimsuit on the Internet for everyone to see — for what purpose does (she do) this?” the administrator wrote. “90% of these girls post such photos in order to attract attention, focusing on sexuality.”

Following questions from a Post reporter, however, the administrator said they had disabled the bot’s chat and gallery features because of a “lot of complaints about the content.” The service for creating new images remained online, as did previously generated images.

Representatives for Telegram, which offers end-to-end encryption and private chat functions, did not respond Monday to requests for comment.


Britt Paris, an assistant professor at Rutgers University who has researched deepfakes, said manipulators have often characterized their work as experimenting with new technology in a lighthearted way. But that defense, she said, conveniently ignores how misogynistic and devastating the images can be.

“These amateur communities online always talk about it in terms of: ‘We’re just . . . playing around with images of naked chicks for fun,’ ” Paris said. “But that glosses over this whole problem that, for the people who are targeted with this, it can disrupt their lives in a lot of really damaging ways.”

The bot was built on open-source “image-to-image translation” software, known as pix2pix, first unveiled in 2017 by AI researchers at the University of California at Berkeley. Fed a huge number of real images, the system learns visual patterns and can, in turn, create its own fakes, transforming photos of landscapes from daytime to night, or from black-and-white into full color.

The software relies on an AI breakthrough known as generative adversarial networks, or GANs, which have exploded in popularity in recent years for their ability to process mounds of data and generate lifelike videos, images and passages of text.
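To make the mechanics concrete, here is a minimal sketch of the pix2pix idea in Python with PyTorch. The tiny networks and the random tensors standing in for a paired-image dataset are illustrative assumptions, not the published pix2pix architecture (which pairs a much deeper U-Net generator with a PatchGAN discriminator); what matters is the structure of the contest between the two networks.

```python
# Illustrative toy of pix2pix-style image-to-image translation with a
# conditional GAN. Layer sizes and the random stand-in "dataset" are
# assumptions for this sketch, not the published pix2pix model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Translates a source image into a target-domain image (e.g., day -> night)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1),             # 64x64 -> 32x32
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),           # 32x32 -> 16x16
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1),    # 32x32 -> 64x64
            nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores (source, candidate) pairs as real or fake; this is the 'adversary'."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1),   # 6 channels: source + candidate
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),   # grid of real/fake logits
        )

    def forward(self, src, candidate):
        return self.net(torch.cat([src, candidate], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

# Random tensors stand in for one batch of paired training images.
src = torch.randn(8, 3, 64, 64)   # e.g., daytime photos
tgt = torch.randn(8, 3, 64, 64)   # the same scenes at night

# Discriminator step: learn to tell real pairs from generated pairs.
fake = G(src).detach()
real_score, fake_score = D(src, tgt), D(src, fake)
d_loss = bce(real_score, torch.ones_like(real_score)) + \
         bce(fake_score, torch.zeros_like(fake_score))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: fool the discriminator while staying close to the paired
# target (the L1 term, which pix2pix weights heavily, here by 100).
fake = G(src)
fake_score = D(src, fake)
g_loss = bce(fake_score, torch.ones_like(fake_score)) + 100 * F.l1_loss(fake, tgt)
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Trained over many such steps on real paired photos, the generator learns whatever mapping the training data encodes: day to night, black-and-white to color or, in the abusive case described here, clothed to unclothed. Nothing in the method constrains what it is used for; the same few dozen lines apply to any paired image data.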


The researchers behind pix2pix celebrated its potential benefits for artists and visual creators. But last year, an anonymous programmer trained the underlying software on thousands of photos of naked women, effectively teaching the system to transform women from clothed to nude.

After the tech blog Motherboard wrote last year about the app, called DeepNude, the developer responded to the online backlash by taking the free-to-download app offline, saying, “The probability that people will misuse it is too high.”

The deep-learning pioneer Andrew Ng last year called DeepNude “one of the most disgusting applications of AI,” adding: “To the AI Community: You have superpowers, and what you build matters. Please use your powers on worthy projects that move the world forward.”

But the existence of the chatbot shows how it will be virtually impossible to eradicate the software outright. The original app’s source code has been saved and widely distributed online, including in for-profit websites that offer to generate images in exchange for a small fee.


Hany Farid, a computer scientist at the University of California at Berkeley who specializes in digital-image forensics and was not involved in the original pix2pix research, said the fake-nude system also highlights how the male homogeneity of AI research has often left women to deal with its darker side.

AI researchers, he said, have long embraced a naive techno-utopian worldview that is hard to justify anymore, openly publishing unregulated tools without considering how they could be misused in the real world.

“It’s just another way people have found to weaponize technology against women. Once this stuff gets online, that’s it. Every potential boyfriend or girlfriend, your employer, your family, may end up seeing it,” Farid said. “It’s awful, and women are getting the brunt of it.

“Would a lab not dominated by men have been so cavalier and so careless about the risks?” he added. “Would [AI researchers] be so cavalier if that bad [stuff] was happening to them, as opposed to some woman down the street?”

That problem is already a reality for many women around the world. One woman targeted by the bot, an art student in Russia who asked to remain anonymous because she did not want to get involved with these “stupid people,” had a photo of herself in a tank top taken from her Instagram account and transformed into a fake nude.

In an interview, she compared the fake to someone smearing her name but said she was grateful that enough people knew her to realize it probably wasn’t real.

“The scammers who do this kind of filth will not succeed,” she said. “I believe in karma, and what comes around for them won’t be any cleaner than their own actions.”

Isabelle Khurshudyan and Will Englund contributed to this report.