Senator Dick Durbin, Ranking Member of the Senate Judiciary Committee, has sent a letter to Meta CEO Mark Zuckerberg asking about his company’s role in directing traffic to apps that use artificial intelligence to generate nonconsensual nude images. Specifically, Durbin’s letter cites 404 Media’s reporting about Crushmate, a so-called “nudify” app that has repeatedly advertised its services on Meta platforms, often with nonconsensual nude images of women. According to traffic analysis firm Similarweb, Meta has sent Crushmate 90 percent of its traffic. Alexios Mantzarlis was first to track the traffic Meta was sending Crush in his Faked Up newsletter.
“Tech companies should not assist malevolent actors who seek to take advantage of women and children,” Durbin wrote in his letter to Zuckerberg. “I am gravely concerned with Meta’s failure to prevent this perverse abuse of its platforms and I refuse to accept Meta’s facilitation of these crimes. I therefore urge Meta to join us in combatting this threat.”
Durbin asked that Zuckerberg respond to the following questions no later than March 11, 2025:
- What safeguards has Meta put in place to prevent advertisements on its platforms for Crush and similar apps that encourage users to create nonconsensual deepfake intimate imagery?
- How does Meta ensure that advertiser profiles are legitimate?
- What safeguards does Meta have in place to identify and remove fake advertiser profiles on its platforms?
- What steps does Meta take to ensure advertisements on its platforms do not redirect users to otherwise prohibited products or advertisers?
- What is Meta doing to educate the public, and youth in particular, about the harms of nonconsensual deepfake intimate imagery?
As our previous reporting has shown, and as Durbin’s letter points out, the Crushmate ads are in clear violation of Meta’s own policies. We’ve seen Meta take action against some of these ads before, but Crushmate and other nudify advertisers have found seemingly easy strategies to bypass Meta’s enforcement even when it does act. When Meta detects one of these ads, it sometimes removes not only the ad but also the account that bought it, and attempts to block other ads promoting the same URL. In response, Crushmate simply creates new accounts that promote different URLs, which redirect traffic to Crushmate.
“This is a highly adversarial space and bad actors are constantly evolving their tactics to avoid enforcement, which is why we continue to invest in the best tools and technology to help identify and remove violating content,” Meta told 404 Media in response to our article about Crushmate advertising on its platform.
We don’t know why Meta hasn’t removed all of these ads based on the content of the ads themselves, which is not subtle. The thousands of ads we’ve spotted on Facebook, Instagram, Threads, and even Facebook Marketplace often feature explicit, nonconsensual nudity of hugely popular Instagram models. In January, extensive testing by AI Forensics, a European non-profit that investigates influential and opaque algorithms, found that nudity uploaded to Instagram and Facebook by a normal user was promptly removed for violating Meta’s Community Standards. The same exact visuals were not removed when they were uploaded as ads, showing that Meta applies a different standard of enforcement when it’s getting paid to push images in front of users.
As our reporting has previously shown, and as Durbin’s letter states, these nudify ads are among the most harmful applications of AI that currently exist.
“Because this easily used software is now so readily accessible through platforms like Facebook, Instagram, and Threads, middle schools and high schools around the country are grappling with shocking acts of image-based abuse committed by students on other students,” Durbin wrote in his letter. “These images may be used to harass victims and damage their employment, education, or reputation, or to further criminal activity such as extortion and stalking. In the worst cases, they drive victims to suicide.”