
Congress Pushes Apple to Remove Deepfake Apps After 404 Media Investigation

Congress is asking Apple, Google, Microsoft, and other big tech companies why they can’t get their deepfake problem under control.

A bipartisan group of members of Congress has sent letters to Google’s and Apple’s CEOs citing 404 Media’s reporting and asking what the giant tech companies are doing to address the rampant problem of nonconsensual AI-generated intimate media enabled on their platforms. The members of Congress also sent a letter to Microsoft CEO Satya Nadella, given Microsoft Designer’s role in creating the infamous nonconsensual nude images of Taylor Swift that were viewed millions of times on Twitter, a story 404 Media also broke.

“Earlier this year, Apple removed three apps used to create deepfakes off of its app store after an independent report by 404 Media provided links to the apps and their related ads,” the letter to Apple CEO Tim Cook said. “While it is positive that these apps were removed, it is concerning that Apple was not able to identify the apps on their own. The App Store requires developers to undergo a screening process, but the persistence of these apps illustrate that loopholes exist in Apple’s guidelines. As Apple works to address these loopholes, we would like to understand what steps are being taken, and what additional guidelines may need to be put in place to curb the spread of deepfake pornography.”

The letter, which was signed by 26 Republican and Democratic House Representatives, was sent to Cook on November 25, and references a 404 Media story from April about Apple removing a number of face swapping apps that were explicitly advertising their ability to create nonconsensual porn. Apple removed those apps after we published a story earlier in April about those ads appearing on Instagram.

“As Congress works to keep up with shifts in technology, Republicans and Democrats will continue to ensure that online platforms do their part to collaborate with lawmakers and protect users from potential abuse,” the letter says, and then presents Cook with a series of questions like “What plans are in place to proactively address the proliferation of deepfake pornography on your platform, and what is the timeline of deployment for those measures?”

A separate letter was sent to Google CEO Sundar Pichai by the same members of Congress about Google’s role in allowing apps to advertise their ability to create nonconsensual deepfake porn in Google Search.

“Earlier this year, Google announced it would ban advertisements for websites and services that produce deepfake pornography,” the letter says, referring to a Google ad policy change we covered in May. “As you know, the emergence of deepfakes has resulted in an increase in ads for programs that cater to users who wish to produce sexually explicit content. While Google’s updated policy instructs AI app developers to build in precautions against offensive content, adds in-app flagging and reporting mechanisms for users, and devalues deepfake porn results in internal search. However, despite these efforts, recent reports have highlighted that Google continues to promote results for apps that use AI to generate nonconsensual nude images. This development raises concerns about Google’s complicity and role in the proliferation of deepfakes. We would like to further clarify the outcome of these updates and understand what additional guidelines may need to be put in place to curb the spread of deepfake pornography, including efforts to remove deepfake platforms from Google’s search results.”

The letter cites a story we published in August which showed that searching for “undress apps,” “best deepfake nudes,” and similar terms on Google turns up “promoted” results for apps that use AI to produce nonconsensual nude images. Google removed those advertisers in response to our story. 

That letter goes on to ask what plans Google is putting in place to proactively address this problem.

The same members of Congress also sent letters to the CEOs of Facebook, TikTok, and Snapchat regarding nonconsensual content on their platforms.

As I wrote in August, face swapping apps present Google and Apple with a very difficult “dual use” problem: the apps can present themselves as benign on the app stores, but promote their ability to produce harmful content off platform. There are more measures these tech companies can put in place to mitigate the problem, but fundamentally any face swapping app has the potential to create harmful content and has to be moderated closely in order to prevent that harm. Monitoring the huge number of apps that are added to these app stores on any given day would be a major new investment for both Google and Apple.
