In a relatively short space of time, Wikipedia has gone from being a novel idea to the de facto font of online knowledge. It’s readily available, constantly updated, and staggeringly far-reaching — but can it be trusted?
The United States is currently in the grip of a presidential primary, meaning people are looking for impartial information on the parties and politicians involved. Wikipedia will be the go-to source for a great swathe of this knowledge. But who’s writing these entries, and for what purpose? It turns out, both ends of the political spectrum are making their voices heard — a process that can distort the truth.
On October 14 of this year, we saw the first Democratic Presidential Debate of the current cycle unfold — and with it, many voters were given their first real exposure to one of the most divisive candidates in contention. As you might expect, viewers turned to Wikipedia for a quick primer.
The day of the debate saw more than 130,000 visits to the Wikipedia entry on Bernie Sanders, a tenth of the 1.3 million hits the page received over the preceding three months. And while thousands of readers scanned the page, a war was being fought over what they would see.
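Traffic figures like these are published through Wikimedia's Pageviews REST API, which returns daily view counts for any article as JSON. Here's a minimal sketch of pulling and summarizing them — the endpoint layout follows Wikimedia's documented format, but the helper names and the share calculation are my own illustration, not anything described in this piece:

```python
import json
from urllib.request import Request, urlopen

# Wikimedia's Pageviews REST API: daily view counts for one article.
PAGEVIEWS_URL = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    "{project}/all-access/user/{article}/daily/{start}/{end}"
)

def fetch_daily_views(article, start, end, project="en.wikipedia"):
    """Fetch raw daily pageview records (requires network access)."""
    url = PAGEVIEWS_URL.format(project=project, article=article,
                               start=start, end=end)
    # Wikimedia asks clients to identify themselves via User-Agent.
    req = Request(url, headers={"User-Agent": "pageview-sketch/0.1"})
    with urlopen(req) as resp:
        return json.load(resp)["items"]

def peak_day_share(items):
    """Return (timestamp, views, share-of-total) for the busiest day."""
    total = sum(item["views"] for item in items)
    peak = max(items, key=lambda item: item["views"])
    return peak["timestamp"], peak["views"], peak["views"] / total
```

Fed records for mid-October 2015, the debate-day spike described above would surface as the peak day.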
Wikipedia is built upon five self-enforced “pillars,” one of which is the site’s neutral point of view. In the political arena, the boundaries between stating facts and presenting a figure in a particular, preferred light can become blurred very easily.
Consider the edit made by a contributor going under the handle Kendrick7 in the early hours of November 7. The intro to Sanders’ write-up has long mentioned that he participated in the March On Washington For Jobs And Freedom in 1963. Kendrick7 added a few words stating that this event was the setting for Martin Luther King’s “I Have a Dream” speech.
The user states that the edit was meant to “add context,” and one can see why the extra detail is needed — few Americans remember the march by name. However, others might see it as an effort to link Sanders to King at a time when he and other candidates were actively seeking liberal support.
With some four thousand edits in the history of the Sanders entry alone, it can be difficult to understand exactly why individuals sit down at a computer to perfect Wikipedia’s presentation of the man. What makes someone get involved?
Who’s Making Changes?
“Quite honestly, I’m not totally sure about what got me started,” says Calidum, one of the many active Wikipedia users to contribute to the Bernie Sanders entry. “It was probably something minor, like fixing vandalism or copy-editing a page. I stuck around and registered an account because I was sort of fascinated with the behind the scenes stuff.”
Calidum’s work on the Sanders entry is modest: he’s responsible for just 0.13 percent of the changes the page has undergone. However, he’s made similar efforts to perfect entries across Wikipedia, racking up more than 13,000 edits over the last five years.
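Per-editor shares like that 0.13 percent fall straight out of a page’s revision history, which the MediaWiki API exposes (for instance via a `prop=revisions` query). Assuming the revisions have already been fetched as records with a `user` field — the field name matches the API’s output, but this tally is my own sketch — the math is simple:

```python
from collections import Counter

def edit_shares(revisions):
    """Map each username to its percentage of a page's total edits.

    `revisions` is a list of revision records, each carrying a 'user'
    key, as returned by a MediaWiki prop=revisions query.
    """
    counts = Counter(rev["user"] for rev in revisions)
    total = sum(counts.values())
    return {user: 100.0 * n / total for user, n in counts.items()}
```

Against roughly four thousand revisions of the Sanders entry, a 0.13 percent share works out to only about five edits.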
“I wouldn’t say there is a method, per se,” he told me when I asked what sort of content he most enjoys contributing to. “For the most part, it’s just items of interest to me, like sports or pop culture. I’ve also been involved in ‘In the News,’ an area I found interest in because of my work as a journalist in Greater Boston.”
Like many other Wikipedia contributors, Calidum would rather amend an existing article than start a new entry afresh. While the growth of the site has had major, undoubtedly positive effects on its global reach and the number of users offering help, it’s also made it more difficult to find an untapped topic.
While the community has ways of dealing with vandalism, its ability to handle point-of-view pushers […] is lacking.
“Creating a decent article from scratch isn’t hard, but it can be difficult to find something I’m interested in that doesn’t have an article already,” Calidum tells me. “Most of my work is on existing articles, whether it be copy-editing, fixing vandalism, or updating the text with new information.”
Vandalism goes hand-in-hand with the idea of an open, user-sourced encyclopedia. While some contributors take pride in creating or adding to a well-researched article, others are more interested in lacing someone else’s work with unwarranted additions.
“The run-of-the-mill vandalism I see every day isn’t that big of an issue to me, because it’s quickly dealt with,” Calidum replies when questioned about these practices. “Sometimes I even laugh at it before reverting it.” But there’s another kind of edit that’s a more serious concern to upstanding users like him.
“The bigger issue for me, and I would assume many editors, though I can’t speak for them, is people who use Wikipedia to push a certain agenda or engage in battleground behavior. Those are not as frequent as vandalism, but far too common,” Calidum tells me. “And, while the community has ways of dealing with vandalism, its ability to handle point-of-view pushers and similar problematic editors is lacking.”
The Sock Puppets of Grundle
I got in touch with Calidum because of a particular edit, made on October 23, 2015. A user going by the handle “Autoerotic Mummification” added a new section to the Sanders entry: a single sentence stating that the candidate had been accused of hypocrisy by conservatives over the wages paid to his interns.
“Anytime you see a chunk of new text begin with a variation of the phrase ‘group X accused so-and-so of’ it’s a red flag, and generally will be inappropriate per our policies on biographies and related guidelines,” Calidum says when I ask about the edit. He simply reverted the article to the previously saved version and thought little more of it — but, as it turned out, this particular rabbit hole went much deeper.
Visiting the Wikipedia user page for Autoerotic Mummification presents a message stating that the individual has been blocked indefinitely, noting that the account is a “sock puppet” for the user Grundle2600. By Wikipedia’s own definition, sockpuppetry is the use of multiple user accounts to mislead editors and distort the consensus.
Anytime you see a chunk of new text begin with a variation of the phrase ‘group X accused so-and-so of’ it’s a red flag.
The investigation into Grundle2600 and his sock puppets shows just how prolific he has been. The first report in the archive was made in April of 2010, but there’s been activity in the ongoing probe ever since, with a colorful list of suspected sock puppet accounts being assessed by community members.
You might think that catching a sock puppeteer in the act would be as simple as cross-referencing an IP address, but user privacy protections and simple countermeasures blunt that strategy. Instead, investigations rely on other methods, like noting an abuser’s preferred stomping grounds and comparing new edits to confirmed examples of their work. In the investigation archive, community members can be seen piecing together exactly this kind of evidence against Grundle2600.
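One crude signal when comparing a suspect account against a puppetmaster’s confirmed work is how much their sets of edited pages overlap. A toy sketch of that idea — a deliberate simplification, since real sockpuppet investigations weigh timing, phrasing, and behavior, not just page overlap:

```python
def page_overlap(edits_a, edits_b):
    """Jaccard similarity between two accounts' sets of edited pages.

    Returns 0.0 (no pages in common) up to 1.0 (identical footprints).
    """
    pages_a, pages_b = set(edits_a), set(edits_b)
    if not pages_a and not pages_b:
        return 0.0
    return len(pages_a & pages_b) / len(pages_a | pages_b)
```

A new account that immediately gravitates to a banned user’s favorite pages would score suspiciously high.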
As you may well have guessed by now, Grundle2600’s political views tend to the right. As noted by the Wikipedia editor Tarc, the article on Barack Obama’s time as U.S. President has been an enduring favorite for edits, but the account’s list of contributions also includes edits made to the entry on “Clock Kid” Ahmed Mohamed, and even a defense of HIV medication price-hiker Martin Shkreli.
It’s easy to conjure up the backstory for a Wikipedia contributor like this one. Perhaps, rather than being one individual, it’s the stock account for the rotation of interns working in a campaign office in some distant corner of the country. Perhaps it’s the personal account of Donald Trump.
Fortunately, thanks to the Internet, we can hear directly from Grundle2600. There’s plenty of debate between him and other contributors on Wikipedia itself, and more of his commentary can be found on smaller forums elsewhere.
In an October 2009 post on the DVD Talk forum, Grundle2600 laid out what he perceives as the reasoning behind his ban.
Here’s where you might expect some sort of conservative rant about a liberal agenda, but it never comes. Grundle2600 details, with citations, a bevy of edits that he has made on political topics. As you might expect from someone who’s kept up the habit for almost a decade at this point, he believes that he is in the right.
There are no doubt plenty of trolls who visit Wikipedia solely for the purpose of vandalism. Grundle2600 doesn’t seem to be part of that contingent. He’s clearly making edits that he believes are useful, and isn’t swayed by the Wikipedia community’s point of view.
“The reason that I’m here is that the mission is so important to me,” says James Alexander, the manager of Trust & Safety at the Wikimedia Foundation. “I really believe that knowledge is power.”
“Trust is a nebulous thing,” he continues. “We want to make sure that people can trust that they’ll be safe on the site. The biggest thing is that if someone was going to be harassing on the site, the community would be able to handle that.”
We probably get one or two threats of harm per week.
James is one of 280 paid employees on the books of the Wikimedia Foundation. Their job isn’t to decide what sort of content ends up on the site. Instead, it’s to ensure that the platform provides an open and neutral space for its community, who are then free to shape its output for themselves.
“It’s relatively rare that we need to reach out to the community, they’re very good at taking care of it themselves,” says Juliet Barbara, Senior Communications Manager for the organization. “They have a lot of tools — that they’ve developed — to help keep watch.”
Of course, there are systems in place should things get out of hand. “We probably get one or two threats of harm per week,” James told me. He’s part of a group of staff members who can be contacted at any time about a serious situation, whether he’s at his computer or fast asleep in bed.
But such situations don’t include routine edits. The Wikimedia Foundation is keen to wash its hands of any decision-making that influences what sort of viewpoint is represented on the site. To maintain true neutrality, it’s crucial that the community is trusted to police itself.
“At some levels, [problems] can be taken to the Arbitration Committee,” said Juliet. This group was established in 2003 by Wikipedia co-founder Jimmy Wales, and is trusted with the top-level decision-making power he originally held. Wales retains final say on who is appointed to the position, but an advisory election typically informs his selections.
These measures are in place to extinguish genuine conflicts between well-meaning users, but there are of course other issues that have to be dealt with. “There’s a particular kind of paid editing that’s called Undisclosed Paid Advocacy,” Juliet told me.
“They’re often quite obvious,” James adds. “It’s about the context.” The fact that it’s so obvious means that it’s no longer a viable option for political parties. “They’re trying to help explain to their candidates and their members the right way to do it,” James continues, without mentioning a particular party by name. “They’re getting better.”
If the community’s self-policing has managed to modify the behavior of political organizations, there’s reason to believe it’s powerful enough to keep Wikipedia’s output relatively unbiased. Using money to influence edits just doesn’t work.
But what about Grundle2600, who’s spent seven years being told that his edits don’t fit the bill? Is he a dedicated editor trying to ensure a balanced view, or a political missionary looking to enforce his agenda? That depends on who you ask — and it’s the reason the war to mold Wikipedia won’t end as long as the site exists.