Hacked ‘AI Girlfriend’ Data Shows Prompts Describing Child Sexual Abuse

A hacked database from AI companion site Muah.ai exposes the specific kinks and fantasies people have asked their bots to engage in. It also shows that many users are trying to use the platform to generate child abuse material.
A collage of images available on Muah.ai. Collage: 404 Media.

This article contains descriptions of sexual violence and child abuse.

A hacker has targeted a website that lets users create their own “uncensored” AI-powered sexual partners and stolen a massive database of users’ interactions with their chatbots.

The data, taken from a site called Muah.ai and viewed by 404 Media, includes chatbot prompts that reveal users’ sexual fantasies. In many instances, users are trying to create chatbots that roleplay child sexual abuse scenarios. These prompts are in turn linked to email addresses, many of which appear to be personal accounts with users’ real names.

“I went to the site to jerk off (to an *adult* scenario, to be clear) and noticed that it looked like it [the Muah.ai website] was put together pretty poorly,” the hacker told 404 Media. “It's basically a handful of open-source projects duct-taped together. I started poking around and found some vulnerabilities relatively quickly. At the start it was mostly just curiosity but I decided to contact you once I saw what was in the database.”
