
'Brainrot' AI on Instagram Is Monetizing the Most Fucked Up Things You Can Imagine (and Lots You Can't)

The hottest use of AI right now? Dora the Explorer feet mukbang; Peppa the Pig Skibidi toilet explosion; Steph Curry and LeBron James Ahegao Drakedom threesome.

This article contains potentially disturbing graphics and descriptions that are nonetheless viral on Instagram and other major platforms.

These are words I never thought I would type, and the people in my life who I have said them to have told me to immediately stop speaking. But here is how I would describe the type of AI-generated reels that are popular on Instagram right now: Dora the Explorer feet mukbang; Peppa the Pig Skibidi toilet explosion; Steph Curry and LeBron James Ahegao Drakedom threesome; LeBron James and Diddy raping Steph Curry in prison; anthropomorphic fried egg strippers; iPhone case made of human skin; any number of sexualized Disney princesses doing anything you can imagine and lots of things you can’t; mermaids making out with fish; demon monster eating a woman’s head; face-swapped AI adult influencers with Down syndrome; and, unfortunately, this. I swear to you that the screengrabs and videos I am including and linking to in this article are not the worst that I have seen on Instagram.

Other “niches” that have become popular on Instagram, and which have begun to regularly pop up on my feed, are wildly racist AI videos of Black men whose faces are put on dogs or gorillas, Black men storming KFC restaurants and chasing after watermelon, George Floyd opening a “Fent-Donalds,” Martin Luther King Jr. in a tub of green sludge, Anne Frank as a Zionist cyborg, etc.

As I wrote last week, the strategy with these types of posts is to make a human linger on them long enough to say to themselves “what the fuck,” or to be so horrified as to comment “what the fuck,” or to send it to a friend saying “what the fuck,” all of which are signals to the algorithm that it should boost this type of content, but are decidedly not signals that the average person actually wants to see this type of thing. The content I am seeing right now makes “Elsagate,” the YouTube scandal in which disturbing videos were targeted to kids and resulted in various YouTube reforms, look quaint.
