In an unusual weaponization of content moderation tools, members of hacking- and fraud-focused Discord servers are deliberately uploading child abuse imagery to get their rivals’ servers shut down, 404 Media has found.
Vendors in the digital underground have sold banning services for sites like Instagram for years. What makes this Discord attack different is that it is much more clearly criminal in nature: both for the person uploading child abuse imagery to a target server, and potentially for anyone who downloads it, accidentally or otherwise. Broadly, the Discord attack continues a pattern of abusing content moderation systems, in which tools designed to protect users are leveraged maliciously to target others.