I generated a bunch of AI images of beloved fictional characters doing 9/11, and I’m not the only one.
Microsoft’s Bing Image Creator, built by one of the most brand-conscious companies in the world, is heavily filtered: images of real people are blocked, along with a long list of scenarios and themes like violence, terrorism, and hate speech. It launched in March, and users have been putting it through its paces ever since. That people have easily produced images of Kirby, Mickey Mouse, or SpongeBob SquarePants doing 9/11 with Microsoft’s heavily restricted tools shows that even the best-resourced companies are still struggling to navigate moderation and copyright issues around generative AI.