News

Apple’s Huge “Dual Use” Face Swap App Problem Is Not Going Away

Maybe Apple should ban face swapping apps entirely.
Photo by Sumudu Mohottige / Unsplash

Apple continues to struggle with the “dual use” nature of face swapping apps, which appear harmless on its App Store but advertise their ability to generate nonconsensual intimate images (NCII) on platforms Apple doesn’t control. Despite the widely reported harms these apps cause, mostly to young women, Apple does not appear to have any plan to deal with the problem and has repeatedly declined to answer questions about it.

Last week, I was scrolling Reddit when I stumbled upon an ad for a face swapping app. It seemed innocent enough at first, but I know enough about this space to know that the kind of face swapping apps that spend money to promote themselves often have a more insidious purpose: they allow for the creation of nonconsensual pornographic deepfakes and charge a high price for that service in order to make the ad spend worth it.

The video ad showed young women dancing and a user selecting videos and images from their camera roll in order to create face swaps. The ad also highlighted the app’s ability to pull videos directly from a variety of online sources, including YouTube. The ad didn’t mention Pornhub by name, but said users can use videos “even from your favorite website” and showed a site that looked exactly like Pornhub, including its iconic black and orange logo, redesigned to say “Web site” instead of “Pornhub.”