This is Source Notes, a column about the internet’s information ecosystem.
Last month, the Jewish American news site the Forward reported a shocking scoop: The Heritage Foundation, the conservative think tank behind Project 2025, is planning to “identify and target” Wikipedia editors. By analyzing text patterns, usernames, and technical data, and by employing social-engineering tactics, Heritage aims to reveal the identities of anonymous Wikipedia editors it believes are “abusing their position” on the platform.
In the culture of Wikipedia editing, it is common for individuals to use pseudonyms to protect their privacy and avoid personal threats. Revealing an editor’s personal information without their consent, a practice known as doxing, is a form of harassment that is strictly prohibited by Wikipedia’s rules and can get the offender permanently banned from the site. Yet Heritage has endorsed these scorched-earth tactics in response to what it perceives as antisemitism among certain editors covering the Israeli–Palestinian conflict on Wikipedia.
Let’s be clear: Wikipedia’s handling of this topic area is incredibly contentious. Many Wikipedians deliberately avoid pages like “Gaza War,” “Zionism,” and even the meta-entry on Wikipedia’s own coverage of the Israeli–Palestinian conflict. These pages are under extended confirmed protection, meaning that only experienced editors—those who have been on Wikipedia for at least 30 days and have made at least 500 edits—can make changes to them.
But even with these restrictions in place, tensions continue to run high. One side accuses the Tech for Palestine coalition of trying to hijack Wikipedia with Palestinian propaganda, while the other counters that Israel’s government appears to be mobilizing its own citizens to write about the conflict from Israel’s perspective. Although founder Jimmy Wales insists that the Wikipedia community aims for neutrality, its editors don’t always succeed.
Even if you take issue with how the site is currently framing the conflict, that doesn’t justify Heritage’s plan. Targeting Wikipedia editors personally, instead of debating their edits on the platform, marks a dangerous escalation. “The document [from Heritage] is sort of vague about what they would do once they ID a person,” journalist and Wikipedia editor Molly White told Forward, “but the things that come to mind are not great.”
The Heritage Foundation’s threats recall the methods used by pro–Chinese Communist Party editors in 2021, when a group called Wikimedians of Mainland China specifically targeted Hong Kong’s pro-democracy activists. These Chinese nationalist editors were displeased with the way the Hong Kong editors were documenting the protests against Beijing’s rule. Rather than continuing the discussion on Wikipedia’s talk pages (places for editors to chat with one another and debate proposed changes), the pro-CCP editors resorted to doxing their opponents and reporting their real-life identities to the state police, leading some Hong Kong editors to be physically harmed. Both the CCP and Heritage seem to believe that if you can’t win an argument in the digital space of Wikipedia, it’s fair game to destroy your opponent’s life offline.
In the documents obtained by the Forward, Heritage employees announced plans to use advanced data sources and tools from companies like Moody’s and Thomson Reuters to unmask Wikipedia editors. These powerful applications provide a virtual fire hose of real-time information, including location and address history, along with capabilities for cross-referencing usernames and fingerprinting users based on their writing style.
“Data broker tools can turn doxing into a laser-sharp targeting tool,” Sarah Lamdan, a lawyer and author of Data Cartels, told me.
In the long term, Wikipedians, and the rest of us, can push for stronger privacy protections from both lawmakers and the data broker companies themselves. Until then, there is not much that users can do to protect themselves from this kind of mass surveillance.
Despite the risks, it seems that Wikipedia editors are not fleeing the project in droves. The recent Wikipedia Day NYC gathering at the Brooklyn Public Library boasted an impressive 1,200 RSVPs. Pacita Rudder, the executive director of the local chapter, told me that the group had implemented an emergency and safety manual for volunteers and staff in case problems arose. Attendees were given the option to use pseudonyms and could wear red stickers to signal that they did not want to be photographed; a livestream was also available for those who did not feel comfortable attending in person.
Contrary to sensationalist media coverage, decisions made by the Wikipedia community tend to be carefully considered. Take, for example, last year’s controversy over whether the Anti-Defamation League, a top Jewish civil rights group, should be considered a reliable source for Wikipedia citations. While headlines suggested that Wikipedia had completely banned the ADL, the actual decision makes clear that the organization can still be cited in certain contexts outside the Israeli–Palestinian conflict.
For background, Wikipedia has also deprecated other advocacy organizations that take an overtly pro-Russia, pro-China, or pro-Arab perspective. To be fair, the Wikipedia community could do a better job of explaining why an advocacy organization’s reliability depends on context; that, however, is a complex discussion that’s not easily contained within a tweet.
Meanwhile, the Wikipedia community is trying to govern the behavior of its volunteers without outside influence. For months, the volunteer judges on Wikipedia’s Arbitration Committee (the website’s version of a supreme court) have been scrutinizing the actions of editors who are deeply involved in Israeli–Palestinian articles. On Jan. 23, ArbCom issued a verdict in the PIA5 case, a virtual trial examining the conduct of 14 highly prolific editors in this topic area. After hearing preliminary statements from the parties and issuing findings of fact (including that some editors were using deceptive sock puppet accounts), ArbCom ultimately banned multiple pro-Palestinian and pro-Israeli editors for “non-neutral editing.”
In addition to these bans, the committee introduced a new punitive measure, the “balanced editing restriction,” which dictates that sanctioned users can devote only a third of their edits to this contentious topic area. Essentially, these Wikipedians are being forced to broaden their scope. (Already, the verdict has sparked controversy within the Wikipedia community, with some questioning whether these editors will find creative ways to circumvent the rule.)
Regardless of its effectiveness, Wikipedia’s latest decision aligns with its quasi-democratic principles. It reflects a commitment to online debate rather than the authoritarian tactics proposed by Heritage. But if the think tank succeeds in its effort to identify and target editors, the consequences could be profound. Faced with the risk of harassment or real-world retaliation, many volunteer editors—especially those covering politically sensitive topics—may simply stop contributing. Those who remain are likely to be the most ideologically driven voices, further eroding Wikipedia’s stated goal of neutrality.
The free encyclopedia would become too toxic to sustain.