Twitter will remove nonconsensual nude images within hours as long as that media is reported as having violated someone's copyright. If the same content is reported only as nonconsensual intimate media, Twitter may not remove it for weeks, and might never remove it at all, according to a pre-print study from researchers at the University of Michigan.
The paper, which has yet to be peer reviewed, argues that the gap between how quickly Twitter responds to copyright claims and how readily it ignores reports of deepfake porn highlights the need for better legislation on nonconsensual content, legislation that would force social media companies to act on those reports.