
Hurricane Helene and the ‘Fuck It’ Era of AI-Generated Slop

The era of politically-motivated AI slop is here and it sucks.

By now, if you have spent any amount of time reading about the devastation wrought by Hurricane Helene, it is likely that you have seen the AI-generated image above.

This is the exact same type of AI-generated slop that has gone viral time and time again over the last year on Facebook and other platforms and that I have written about numerous times. Something very disheartening is happening with this particular image, however. A specific segment of the people who have seen and understand that it is AI-generated simply do not care that it is not real and that it did not happen. To them, the image captures a vibe that is useful to them politically. In this case, the image is being used to further the idea that Joe Biden, Kamala Harris, and FEMA have messed up the hurricane response, leading to little girls suffering in a flood with their puppies. 

Here is Amy Kremer, a National Committeewoman for the Republican National Committee and co-founder of Women for Trump, saying that she knows the image is fake but simply does not care:

“Y’all, I don’t know where this photo came from and honestly, it doesn’t even matter. It is seared into my mind forever,” she posted. “There are people going through much worse than what is shown in this pic. So I’m leaving it because it is emblematic of the trauma and pain people are living through right now.” Kremer’s sentiment is one that was first expressed in a now-deleted tweet that had millions of views. That tweet contained a text message screenshot in which the person’s relative texted them the image of the girl holding a dog crying. The tweeter responded with “that’s AI,” and the person’s relative responded with something to the effect of “I don’t care.” 

Similar ragebait was posted by Trumpworld weirdos Laura Loomer, Buzz Patterson, and Juanita Broaddrick. 

This is the “fuck it” era of AI slop and of political messaging, where AI-generated images are used to convey whatever partisan message suits the moment regardless of truth. For all of the fearmongering about the potential of deepfake-driven disinformation, we have seen time and time again that whatever message is being delivered doesn’t need to be supported with real—or even realistic—photos or videos to convey it. 

We have been careening toward this reality for the better part of a decade, ever since Donald Trump and the right co-opted the term “fake news,” and through mainstream media’s endless platforming of various liars and grifters, the tut-tutting of an entire industry of insufferable both-sidesing “fact checkers” and self-serious misinformation reporters, and the complete abdication of any responsibility or interest from Silicon Valley AI maxers in understanding or limiting how their plagiarism machines are used on the internet. 

This image lives alongside the insane, incorrect, and cynical yet incredibly viral idea that FEMA is conspiring to seize people’s property, or that “they” are controlling the weather to attack red states and counties. This image and these narratives have been weaponized by a party that broadly denies human-driven climate change, has repeatedly voted to hamstring FEMA, and is currently running a man for president who, when in office, tossed paper towels to Puerto Ricans devastated by a natural disaster, withheld aid from wildfire victims because they lived in Democratic California, and altered a hurricane’s forecast path with a Sharpie at a news conference. 

I have written more about AI slop than most journalists and have closely studied the incentives and communities where these sorts of images are made and how they spread. In this case, the earliest version of the image I can find was posted by someone called I-am_Orlando on Patriots.win, which is the Reddit clone that Trump supporters defected to after Reddit banned r/the_donald several years ago. The image didn’t get that much attention there, and most of the comments were “Fake,” “Obviously AI,” “Is this AI?” and “Don’t forget, there are very stupid, gullible people on our side as well.” 

That doesn’t mean the image was originally created by people on Patriots.win, though. The very nature of AI-generated slop makes any specific image very difficult to reverse image search or trace to its original source. But knowing what I know about how this type of content is made and how it spreads, the source of any individual image is not that important. It could have been made by a Trump supporter on Patriots.win, or it could have been made by an Indian teenager, or a member of an AI boosters Facebook group in Malaysia, or some random shitposter. What matters is that it and images like it have gone incredibly viral over and over again, and that AI slop more broadly is poisoning the internet and destroying the last shred of any sense of reality or reliability on social media platforms and search engines. The new, terrible reality we have to deal with is that AI slop is mixed in with real images on everything from disaster response and mushroom foraging to searches for baby peacocks and famous artworks. 

I have documented, at length, that a lot of AI slop is being generated by people all over the world for the express purpose of going viral and making money. While writing about Facebook’s AI slop, I spoke to an Indian teenager who lives in a small village and, unsatisfied with the virality of more mundane AI-generated images he had created, told me he was going to get into making AI content about American news so he could make more money. I have also seen politically-driven AI images essentially being reverse-engineered to go viral in Facebook groups for AI creators. The “All Eyes on Rafah” AI image shared by tens of millions of people on Instagram was created in a Facebook group for people who want to make the “AI industry prosper,” and, following its popularity, hundreds of people generated thousands of AI images about the suffering of Palestinians in Gaza in hopes of having another viral hit. We have also seen AI-generated spam about natural disasters in Brazil, which similarly featured crying, suffering people being overwhelmed by a flood. The most enduring images from the absurd, racist Haitians-are-eating-dogs-and-cats-in-Springfield fiasco are blurry photos that are not actually from Springfield and AI-generated images of Donald Trump hugging cats and geese. 

This should go without saying, but I feel like I must say it: The devastation wrought by Hurricane Helene is a terrible tragedy, and people are suffering. It increasingly looks like Florida is going to be hit directly by Hurricane Milton, one of the strongest storms in decades. This, too, will be a horrible tragedy. All of these are real, terrible events impacting real people, and making AI-generated depictions of their suffering is cruel, offensive, and bad. 

And yet, if you point out how stupid, wasteful, and corrosive these specific uses of AI are, you are painted as a loser killjoy by the investors and people who have the most to gain from its mass adoption. Meanwhile, the companies who make and promote the tools that enable it either ignore that it’s happening or publish some bullshit blog post that means nothing about how they are trying oh so hard to combat abuse of their systems, ask you what the big deal is, and continue to ensure that the swift destruction of the human-led, useful and usable internet continues unabated.  

This is happening in part because we live in an era where the truth essentially does not matter, at least in terms of social media virality, and where the truth is often an actual hindrance in conveying whatever might be best for your side politically. But it is also happening because the world’s biggest, most powerful companies and its richest people have collectively decided that the mass adoption of generative AI will be beneficial to their companies’ bottom lines and that it does not matter how people use it or what they use it for, so long as we use their mass plagiarism devices for anything at all. 
