
Recruiting partners is not the only way the website has sustained itself. Hundreds of links to the website’s referral program—where people receive free image-generation tokens every time someone clicks—are also being shared on Twitter, YouTube, Telegram, and specialized pornographic deepfake forums.
Since the site was launched, its creator—whose identity is unknown and who did not respond to a request for comment—claims to have updated its algorithm multiple times. The website says it is currently running on version 2.0. The site’s developer claims a third version, apparently due to be released at the start of 2022, will improve “prediction” on photographs taken from the “side or back.” The creator claims that future versions will allow people to “manipulate the attribute of target such as breast size, pubic hair.”
The website’s startup-like growth tactics signal a new maturity in abusive “nudifying” deepfake technologies, which overwhelmingly target and harm women. Since the first AI-generated fake porn was created by a Redditor at the end of 2017, these systems have grown steadily more sophisticated. The technology was turned into its first app, dubbed DeepNude, in 2019; although its creator took the app down, its code still circulates. Since then, this kind of technology has become as easy to use as selecting a photo and clicking upload. Recent, equally horrifying developments include deepfake video generation that is just as easy to use.
With the increased ease of use, targets of harassment have moved beyond high-profile celebrities and influencers to members of the public. The expansion of this site and its partnerships commoditizes those intrusions even further. “The quality is much higher,” says Henry Ajder, an adviser on deepfakes and head of policy and partnerships at synthetic media company Metaphysic. “The people behind it have done something which hasn’t really been done since the original DeepNude tool … that’s trying to build a strong community around it.”
The web of partnerships and payment services spanning the website and its two partner sites indicates that this kind of technology is at a tipping point, says Sophie Maddocks, a researcher at the University of Pennsylvania’s Annenberg School for Communication who specializes in studying online gender-based violence. “This harm is going to become part of the sex industry and is going to become profitable; it’s going to become normalized,” Maddocks says. Society, technology companies, and law enforcement need to take a “zero tolerance” approach to these deepfakes, she adds.
The websites are raking in money for their creators. All three charge people for processing the images, with prices ranging from $10 for 100 photos to $260 for 2,000. They offer a limited number of free images, billed as trials of the technology, but visitors are pushed toward payment. At various points in their existence, the sites have accepted bank transfers, PayPal, Patreon, and multiple cryptocurrencies. Many of these providers, including Coinbase, cut ties after previous media reports, but all three sites still accept various cryptocurrencies for payment.
Ivan Bravo, the creator of the spinoff website that claims to have more than 3,000 paying customers, says it is “not correct” morally that he makes money from a service that harms people. But he continues to do so. “It generates good income,” he said in an email when asked why he operates the website. He declined to say how much he has earned through sales but says “it has been more than enough to support a family in a decent house here in México.”