The Wikimedia Foundation is building new tools that it hopes will help Wikipedia editors stay anonymous, in part to avoid harassment and legal threats, as Elon Musk and the Heritage Foundation ramp up their attacks on people who edit Wikipedia. Some of these tactics have been pioneered by Wikimedia in countries with authoritarian governments, where editing Wikipedia is illegal or extremely dangerous.
Last month, Forward obtained a document created by the Heritage Foundation called “Wikipedia Editor Targeting,” which set a goal to “identify and target Wikipedia editors abusing their position by analyzing text patterns, usernames, and technical data through data breach analysis, fingerprinting, HUMINT (human intelligence), and technical targeting.”
The document discusses creating sock puppet accounts to “reveal patterns and provoke reactions,” tracking users’ geolocation, searching hacked datasets for username reuse, and using Pimeyes, a facial recognition tool, to learn the real identities of Wikipedia editors. Molly White of Citation Needed has an extensive rundown on Elon Musk’s crusade against Wikipedia, and both Slate and The Atlantic have written about the right’s war on Wikipedia in recent days.
In a series of calls and letters to the Wikimedia community over the last two weeks, Wikimedia executives have told editors that they are trying to figure out how to keep their users safe in an increasingly hostile political environment. “I’m keeping an eye on the rising noise of criticism from Elon Musk and others and I think that’s something we need to grapple with,” Wikimedia founder Jimmy Wales said in a meeting on January 30.
“We’re seeing an increase in threats, both regulation and litigation across the world,” Wikimedia Foundation CEO Maryana Iskander told community members during the same January 30 meeting. “We’re all just trying to understand what is happening not only in the United States [but across the world], so the best we can do is monitor, check in on staff, and try to understand what’s needed … that’s the most honest answer I can give you to an impossible set of questions we’re all grappling with on a daily basis.”
Wikimedia lawyers told the community that the project is trying to change how editing Wikipedia works for logged-out users. Currently, if a user edits an article while not logged in, their IP address is displayed publicly, which can provide information to someone looking to file a defamation or libel lawsuit. Wikimedia is launching a “temporary accounts program,” which will give editors who are not logged in a temporary username rather than showing an IP address. “It’s a way of ensuring that for logged-out users, their IP address isn’t visible to everyone asunder but rather available only to people who are really engaged in anti-vandalism,” Phil Bradley-Schmieg, a Wikimedia lawyer, said.
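To make the idea concrete, here is a minimal, hypothetical sketch in Python of how an IP address could be hidden behind a temporary username for public display while remaining retrievable by anti-vandalism patrollers. The class, method, and permission names are assumptions for illustration; this is not MediaWiki’s actual temporary-accounts code.

```python
import secrets

# Hypothetical sketch: map an unregistered editor's IP address to a
# temporary pseudonym for public display, while restricting the raw IP
# to users holding an anti-vandalism permission. All names here are
# illustrative assumptions, not MediaWiki's real implementation.

class TemporaryAccountRegistry:
    def __init__(self):
        self._ip_by_temp_name = {}   # temp username -> raw IP (restricted)
        self._temp_name_by_ip = {}   # raw IP -> temp username

    def temp_name_for(self, ip_address: str) -> str:
        """Return the public-facing temporary username for this IP,
        creating one on first use."""
        if ip_address not in self._temp_name_by_ip:
            # Random suffix so the name reveals nothing about the IP itself.
            name = f"~temp-{secrets.token_hex(4)}"
            self._temp_name_by_ip[ip_address] = name
            self._ip_by_temp_name[name] = ip_address
        return self._temp_name_by_ip[ip_address]

    def reveal_ip(self, temp_name: str, requester_rights: set[str]) -> str:
        """Only users holding an (assumed) anti-vandalism right may see the IP."""
        if "view-temporary-account-ip" not in requester_rights:
            raise PermissionError("IP access restricted to anti-vandalism work")
        return self._ip_by_temp_name[temp_name]


registry = TemporaryAccountRegistry()
public_name = registry.temp_name_for("203.0.113.7")
print(public_name)  # what readers of the edit history would see
print(registry.reveal_ip(public_name, {"view-temporary-account-ip"}))
```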
Bradley-Schmieg also suggested that Wikimedia’s human rights team, which is focused on “helping users stay safe, particularly in countries where freedom of speech and expression is under attack on a regular basis,” may need to play a larger role across the entire project.
Jacob Rogers, another Wikimedia lawyer, said during a separate meeting on January 30 that some non-English-language Wikimedia projects allow users to create a legitimate sock-puppet account (a dummy username, basically) to edit controversial articles, and to register that account with administrators.
“A number of the different language projects have the option to make legitimate sock puppet accounts if you’re going to work on something you know is going to be controversial, you can make a sock puppet and register it with admins on that project so it’s more obscure, kept separate from the rest of your life,” Rogers said.
Both Rogers and Bradley-Schmieg said that Wikimedia has worked to limit the amount of data that the foundation has on any given user. IP addresses associated with edits are deleted or anonymized after 90 days, for example.
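As a rough illustration of that kind of retention rule, the sketch below drops the stored IP from any edit record older than 90 days. The record layout, field names, and function are assumptions for illustration, not the foundation’s actual data pipeline.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical 90-day retention rule like the one described above:
# edit records older than the cutoff have their IP field removed,
# while the edits themselves are kept.

RETENTION = timedelta(days=90)

def anonymize_old_edits(edit_log: list[dict], now: datetime | None = None) -> None:
    """Strip the stored IP from any edit record older than the retention window."""
    now = now or datetime.now(timezone.utc)
    for edit in edit_log:
        if edit.get("ip") and now - edit["timestamp"] > RETENTION:
            edit["ip"] = None  # keep the edit, discard the identifying data

# Example: an edit from 120 days ago loses its IP; a recent one keeps it.
log = [
    {"timestamp": datetime.now(timezone.utc) - timedelta(days=120), "ip": "198.51.100.4"},
    {"timestamp": datetime.now(timezone.utc) - timedelta(days=10), "ip": "203.0.113.7"},
]
anonymize_old_edits(log)
print(log)
```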
“The foundation has very little data about most users, so if somebody is stepping up their harassment and coming to the foundation, we generally don’t really know anything about users in most cases and there’s not a lot they can get from us,” Rogers said. In the first six months of 2024, the last period for which data is available, Wikimedia received 26 formal requests for information on users; it provided info in two cases. Six of those requests came from the United States, the most of any jurisdiction.
Wikimedia has also created a legal defense program that will in some cases fund the defense of Wikipedia editors who are attacked through the legal system as long as that editor or staffer was contributing to a Wikimedia project in good faith, Rogers said. Wikimedia has recently fought cases in both India and Germany.
While Musk’s and the Heritage Foundation’s attacks on Wikipedia have escalated in recent days, these general trends are not new, and they were outlined as a threat in the foundation’s 2024 annual plan, which states the following:
“Human rights threats are growing. Physical and legal threats against volunteers and staff who fight disinformation continue to grow. Accusations of bias and inaction by those whose preferred narratives do not prevail on Wikipedia may be encouraged and amplified by purveyors of disinformation,” the foundation wrote in an update to users. “Law is weaponized in important jurisdictions. Bad-faith lawsuits, by people who don’t like the verified information appearing on Wikipedia pages, are succeeding in some European countries. Some incumbent leaders are abusing their powers to silence and intimidate political opponents.”
Iskander said in the meeting that the foundation is going to consider the safety of Wikipedians for its in-person events, such as Wikimania, an annual conference and party.
“We’re paying very close attention trying to understand what the impacts might be and ensure those might be considered in any decisions we make. I will remind folks part of our processes in any event related to community gatherings is to do a risk assessment for community conferences for Wikimania,” she said. “It’s an imperfect and imprecise exercise but there’s a real intentionality around being thoughtful about the places that we’re selecting to ask people to gather and manage within our control.”
It is not clear whether any of these steps will be sufficient, or whether any of them will make Wikipedia more resilient to right-wing attacks. What makes Wikipedia so strong is its distributed global base of dedicated volunteer editors and a governance structure that is not easy to infiltrate. Wikimedia’s decentralized power base makes it resistant, though not invulnerable, to takeover attempts.
During one of the meetings, Rogers was asked if Wikimedia would consider moving its headquarters out of the United States because of the political situation here. Rogers said moving “would probably not do very much because the projects would remain accessible in the United States and many things would still be subject to US law even if the foundation moved its headquarters to a different jurisdiction.”
“I think a move would be extremely expensive and cost something in the tens to hundreds of millions of dollars,” he said. “I see that as one of the most significant, expensive, and extreme possible options. You would only do that if it was like, the only solution to a major problem where doing that would make sense.”
Neither the Wikimedia Foundation nor the Heritage Foundation responded to requests for comment.