Taylor Swift is one of the most deepfaked women in the world.
Last week, AI-generated images of Swift in hardcore sexual scenarios went viral on Twitter, garnering millions of views, reposts, and likes. A lot has happened in just a few days, including Microsoft's CEO going on NBC News, the White House saying it's "alarmed," and Twitter pledging moderation changes.
But this started six years ago, with the invention of consumer-level AI-generated face swapping. At the time, making AI images of Swift required thousands of pictures of her face, powerful computing hardware, and a lot of time and patience. Now, it's being done with free, easy-to-use online tools developed by tech giants, or open-source software supported by large communities that will gladly teach anyone how to produce whatever images they can imagine.
The reaction to the Swift images might seem like we have reached a tipping point, but as reporters who have been covering this issue since before the term "deepfake" was even coined, our assessment is this: The Taylor Swift images, nonconsensual images of other celebrities, and nonconsensual images of non-public people are not going to stop until something far worse happens.