
Apple’s Huge “Dual Use” Face Swap App Problem Is Not Going Away

Maybe Apple should ban face swapping apps entirely.
Photo by Sumudu Mohottige / Unsplash

Apple continues to struggle with the “dual use” nature of face swapping apps, which appear harmless on its App Store but advertise their ability to create AI-generated nonconsensual intimate images (NCII) on platforms Apple doesn’t control. Apple does not appear to have any plan to deal with this problem despite the widely reported harms these apps cause, mostly to young women, and has repeatedly declined to answer questions about it.

Last week, I was scrolling Reddit when I stumbled upon an ad for a face swapping app. It seemed innocent enough at first, but I know enough about this space to know that the kind of face swapping apps that spend money to promote themselves often have a more insidious purpose: they allow for the creation of nonconsensual pornographic deepfakes, and they charge a high price for that service in order to make the ad spend worth it.

The video ad showed young women dancing and a user selecting videos and images from their camera roll in order to create face swaps. It also highlighted the app’s ability to pull videos directly from a variety of online sources, including YouTube. The ad didn’t mention Pornhub by name, but it said users can use videos “even from your favorite website” and showed a site that looked exactly like Pornhub, including its iconic black and orange logo, redesigned to say “Web site” instead of “Pornhub.”

I tested the app and found that, for users willing to pay a subscription fee of $20 a month, it makes it incredibly easy to generate nonconsensual deepfake porn of anyone. All I had to do was provide one image of the person I wanted to deepfake and use an in-app internet browser to navigate to the video I wanted them to appear in. As the ad I saw on Reddit suggested, when I navigated to a specific Pornhub video, the app automatically pulled the video and created deepfake porn of the person in the image I provided. The entire process, from the moment I saw the ad on one of the most popular websites in the world to the completed deepfake video, took about five minutes and produced a highly convincing result.

It’s a process that I and other reporters are familiar with by now, and what happened next was also predictable. 

First, I contacted Reddit, which removed the ad. 

“The ad violates our policies and has been removed,” a Reddit spokesperson told me in an email, and directed me to Reddit’s ads policy page, which prohibits “Sexually explicit content, products, or services.”

Then, I reached out to Apple, saying I had found another face swapping app advertising its ability to generate NCII. Apple asked for a link to the app, meaning it wasn’t aware of its existence and was not able to find it on its own. After I provided the link, Apple removed it.

Apple told me, on background, meaning it didn’t want to be quoted directly, that it created the App Store to be a safe and trusted place for its customers to get apps. It also noted that, as with all apps on the App Store, apps with AI and generative AI features must comply with its App Store Review Guidelines, which forbid content that is “offensive, insensitive, upsetting,” “defamatory,” or “overtly sexual or pornographic.”

Despite multiple requests, Apple did not engage with my questions about the inherent “dual use” problem with these face swapping apps. 

This is a well-established loophole in the mobile app stores (Google has this problem as well) that Sam and I first reported on in 2022. These apps make no mention of adult content on their App Store pages or on their sites because it’s against the app stores’ policies, but they actively promote their ability to create nonconsensual deepfake porn on porn tube sites and other platforms. In April, I reported on the exact same thing happening with explicit ads on Instagram, which resulted in Instagram removing the ads and Apple removing the specific apps we found.

An app like this may have an innocuous-sounding name like “Face App,” and on its App Store page it seems like a quirky, innocent app intended for face swapping social media photos with friends. But on porn sites and social media sites, the app advertises itself as being able to produce nonconsensual porn, which of course it can, because there’s no technical difference between generating an innocent face swap video and a pornographic one. The app I found on Reddit makes it extremely easy to make pornographic deepfakes by allowing users to pull videos directly from Pornhub. It also charges $20 a month for this ability via an in-app purchase, of which Apple takes a cut.

🍏
Do you know anything about how Apple screens for nonconsensual deepfake apps? I would love to hear from you. Using a non-work device, you can message me securely on Signal at (609) 678-3204. Otherwise, send me an email at emanuel@404media.co.

But the problem is obviously not going to stop. There are too many apps on the Apple App Store and too many ads running through ad networks on Instagram, Reddit, porn tube sites, and elsewhere for these platforms to catch them all. The business model for these platforms works at scale, and that scale doesn’t leave time for humans to manually review, investigate, and approve every ad or app.

In the past, we have seen face swapping apps that refused to generate adult videos, but clearly that is not a guardrail Apple requires of face swapping apps before adding them to the App Store. There are other red flags that should help Apple find these apps, like the fact that some of them charge a lot of money for face swaps that can easily be made for free elsewhere online, but it’s not acting on those clues either. All Apple requires is that apps not violate its policies on their App Store pages and websites, which is easy for these apps to do, meaning Apple doesn’t know a violation is happening off-platform until reporters reach out for comment.

Hany Farid, a professor at UC Berkeley and one of the world’s leading experts on digitally manipulated images, told me he thinks Apple is always going to have this issue as long as it allows face swapping apps on the App Store, and that Apple should “probably” ban them entirely.

“If we look at the use cases of face-swap deepfakes, the vast majority of use cases is to create nonconsensual sexual imagery,” Farid told me in an email. However, he also noted that there are things Apple could try to do to prevent harm caused by face swapping apps.

“An app could mitigate this harmful application if, for example, they only allowed you to create a deepfake of your own face where the app forces you to take a live photo of yourself, and verifies this liveness,” he said. “I’ll point out that Apple bans any apps associated with adult content, so this would not be too far of a deviation from their long-standing policies [...] they [Apple] certainly can’t rely on the creators to simply not explicitly market their apps in this way.”

Earlier this week I reported that a recent survey found that one in 10 minors ages 9 to 17 say their friends or classmates have used AI tools to generate nudes of other minors. The report goes into some detail about which internet platforms minors are using, where they are sharing nudes, and where they say they are most likely to encounter child sexual abuse images, but it neglects to mention that the high rate of minors creating AI-generated nude images of their peers does not originate in the dark corners of the internet. They are getting served ads on the most popular internet platforms in the world (Instagram, Reddit, TikTok), which direct them to the most popular app store in the world for the stated purpose of creating NCII. As Jason and I reported in February, the students who rocked a Washington State high school by creating nude images of their classmates and teachers found the app on TikTok.

Apple will remove these apps once we flag them, but it refuses to have a substantive discussion about the problem in general and what it plans to do about it.

Other tools are accessible on other parts of the internet, and it’s extremely unlikely that all of these tools can be disappeared from the internet, but those corners of the web are not where the harm is coming from. It’s coming from huge internet platforms, including Apple, and none of that is going to change until Apple changes how it tackles this problem.
