
Instagram Ads Send This Nudify Site 90 Percent of Its Traffic

A service for creating AI-generated nude images of real people is running circles around Meta’s moderation efforts.
Photo by Eyestetix Studio / Unsplash

An AI app for creating nonconsensual nude images of anyone is getting the vast majority of its traffic directly from Meta platforms, where the app is buying thousands of explicit ads featuring nonconsensual nudity of celebrities and influencers. The blatant and repeated violation of Meta’s policies over the course of months is making a mockery of the company’s ability or willingness to moderate a known bad actor that at the moment appears to get the majority of its users by paying Meta directly for ads.

The app, known as Crushmate or Crush AI, has been buying ads on Facebook, Instagram, and other Meta platforms since at least early September. As Alexios Mantzarlis first reported in his Faked Up newsletter, internet traffic analysis firm Similarweb found that three of the domains Crush uses had around 240,000 visitors combined, with 90 percent of that traffic coming from Facebook or Instagram.

I’ve seen Meta remove some of these ads since September, but at the time of writing the same three domains that were advertised on Meta platforms and redirected to Crushmate’s services had around 350 active ads and more than 5,000 ads overall. 

Most of the recent ads use the same format. They take a video a woman posted to Instagram or TikTok and show how a user can pause the video on any frame and create a nude image of her. Many of the ads, which are still active, do this to videos of the extremely popular OnlyFans creator Sophie Rain, who made headlines recently for making $43 million in one year on OnlyFans. As Mantzarlis points out, one ad nudifies Mikayla Demaiter, a model with 3.2 million followers on Instagram. Rain and Demaiter did not respond to a request for comment.

Other ads feature real women I wasn’t able to identify, as well as AI-generated women whose clothes are “erased” by the app.

In early September, a 404 Media reader also tipped me that Crushmate was advertising its services on Facebook Marketplace.

A marketplace ad for Crushmate

I’ve confirmed that all these ads lead to the same Crushmate service that will create nonconsensual nude images and offers some of its services via a subscription plan. 

Promotional copy from Crushmate's site.

I’ve recently reported on Meta running ads that feature explicit nudity, including dozens of ads that are just close-up images of vaginas. I’ve also reported repeatedly on “nudify” apps buying ads on Meta platforms. When we’ve flagged these ads to Meta in the past, the company removed them. Meta has also removed the associated Facebook pages buying the ads, but Crushmate has found an easy workaround that is clearly paying off: It creates multiple Facebook pages with AI-generated profile images that look like normal people, then buys ads promoting new, different URLs that redirect to Crushmate.

Meta did not respond to specific questions about why it’s not detecting and removing the offending ads for featuring nonconsensual nudity. As I reported last week, extensive testing by AI Forensics, a European non-profit that investigates influential and opaque algorithms, found that nudity uploaded to Instagram and Facebook as a normal user was promptly removed for violating Meta’s Community Standards. The same exact visuals were not removed when they were uploaded as ads, showing that Meta has a different standard for enforcement when it’s getting paid to push images in front of users. 

“Meta prohibits ads that promote adult sexual exploitation. We have removed the violating content, enforced against violating urls, and have taken action against the associated accounts and users,” a Facebook spokesperson told me in a statement. “This is a highly adversarial space and bad actors are constantly evolving their tactics to avoid enforcement, which is why we continue to invest in the best tools and technology to help identify and remove violating content.” 

💡
Do you know anything else about Crushmate? I would love to hear from you. Using a non-work device, you can message me securely on Signal at emanuel.404. Otherwise, send me an email at emanuel@404media.co.

Meta removed the ads promoting the three Crushmate domains after Mantzarlis flagged them to the company. Around 230 of the same ads promoting a fourth Crushmate domain Mantzarlis found after reaching out for comment are still live on Meta’s platforms. 

As we’ve previously reported, these nudify apps are some of the most harmful applications of generative AI because they make it so easy to create nonconsensual images of anyone. In the last two years, we’ve seen several examples of these apps being used by minors to create images of other minors. Last year, a survey found that 1 in 10 minors reported that their friends or classmates have used AI tools to generate nudes of other kids. As the Crushmate ads show, minors don’t need to go to the dark corners of the web in search of these tools. Meta is getting paid to popularize them.
