Deepfake YouTube Ads of Celebrities Promise to Get You ‘Rock Hard’

Deepfakes of Arnold Schwarzenegger, Sylvester Stallone, Mike Tyson, and Terry Crews are selling erectile dysfunction supplements on YouTube.

YouTube is running hundreds of ads featuring deepfaked celebrities like Arnold Schwarzenegger and Sylvester Stallone hawking supplements that promise to help men with erectile dysfunction. 

The ads, which were discovered by Alexios Mantzarlis of the Faked Up newsletter, have been running since at least November 12 and have around 300 variations, according to Google’s Ad Transparency Center. All the ads use existing videos that have been modified with an AI-generated voice and lip-synced to match what that voice is saying. Many of the ads feature non-celebrity women who talk about how their “husbands went wild” after “trying a secret simple mix” to treat their erectile dysfunction, but some feature deepfakes of celebrities including Arnold Schwarzenegger, Sylvester Stallone, Mike Tyson, and Terry Crews. 

“Have you heard about the salt trick that is making me stay hard for hours in bed?” an AI-generated Schwarzenegger asks in his instantly recognizable Austrian accent. “Top adult actors have been using this for the last five years to stay rock hard. I mean, you didn’t think they last that long without a little hack, right?”


Video ads of Stallone, Tyson, and Crews repeat the exact same script, which suggests that whoever wrote the ads copied and pasted it into an AI voice generator.

The ads lead users to a page on “thrivewithcuriosity.com,” where, after confirming they are “40+” years old, they are shown a meandering and very explicit 40-minute-long presentation about the miracle drug that is getting men including celebrities, strippers, and adult performers “rock hard.” 

That video opens with a real Today Show interview Stallone did with his wife and three daughters to promote their reality show “The Family Stallone,” but it’s been very uncomfortably edited with AI-generated audio and lip sync to make it seem as if he’s talking about how hard he can get now to satisfy his wife thanks to the miracle drug. 

Have you seen other deepfake celebrity ads on YouTube? I would love to hear from you. Send me an email at emanuel@404media.co.

The video takes viewers on a bizarre journey from a strip club in Texas to a fake Harvard urologist’s office to an abandoned church in Thailand where scientists discovered a species of bat with abnormally large and long-lasting erections. Along the way, deepfake videos of everyone from Tom Hanks and Denzel Washington to adult entertainment star Johnny Sins are made to say they have been quietly using this secret formula to last longer in bed. The video eventually concludes by offering viewers the opportunity to buy six bottles, or 180 days’ worth, of Prolong Power at $49 per bottle. 

That link sends users to a page on digistore24.com where they can enter their credit card information to purchase Prolong Power, but I was able to find the supplement for sale in many other places online. Prolong Power is also offered by many sellers on Amazon, where it has mixed reviews, with some users saying “This product is a scam,” “don’t bother,” and “fake.” According to its label, Prolong Power is made up of a “proprietary blend” of oat bran powder, fennel seed, cascara sagrada bark powder, and other common ingredients that, according to the National Library of Medicine, are mostly helpful with constipation. Notably, the ingredients do not include “midnight beetle powder,” which the long video pitching Prolong Power explains is the secret ingredient that gave the church bats their magnificent erections. 


Prolongpowers.com, which calls the supplement the “#1 Natural Male Enhancement Supplement,” claims it now offers a “new version” called Primor Dial Vigor X, and features testimonials from three customers who made “verified purchases.” However, a spokesperson for the deepfake detection company Reality Defender said that, according to its platform, the headshots attached to those testimonials were 99 percent likely to be AI-generated.

Back in January, YouTube deleted around 1,000 similar ads in which deepfaked celebrities unwittingly pitched scams.

“We are constantly working to enhance our enforcement systems in order to stay ahead of the latest trends and scam tactics, and ensure that we can respond to emerging threats quickly,” Google said at the time after deleting the ads. But obviously it still doesn’t have this problem fully under control.

"We prohibit ads that falsely claim a celebrity endorsement in order to scam people," a Google spokesperson told 404 Media in response to this story. "When we identify an advertiser engaging in this deceptive practice, we permanently suspend their account and remove all their ads from our platforms, as we have done in this case. We continue to heavily invest in our ability to detect and remove these kinds of scam ads and the bad actors behind them.”

Google said it removed the deepfake supplement ads and permanently suspended the account that paid for them after I reached out for comment.
