Meet the 9 Wikipedia bots that make the world’s largest encyclopedia possible

The idea behind Wikipedia is, let’s face it, crazy. An online encyclopedia full of verifiable information, ideally with minimal bias, that can be freely edited by anyone with an internet connection is a ridiculous idea that was never going to work. Yet somehow it has.

Nineteen years old this month (it was launched in January 2001, the same month President George W. Bush took office), Wikipedia's promise of a collaborative encyclopedia has, today, resulted in a resource consisting of more than 40 million articles in 300 different languages, catering to an audience of 500 million monthly users. The English-language Wikipedia alone adds some 572 new articles per day.

For anyone who has ever browsed the comments section on a YouTube video, the fact that Wikipedia’s utopian vision of crowdsourced collaboration has been even remotely successful is kind of mind-boggling. It’s a towering achievement, showing how humans from around the globe can come together to create something that, despite its flaws, is still impressively great.

What do we have to thank for the fact that this human-centric dream of collective knowledge works? Well, as it turns out, the answer is bots. Lots and lots of bots.

Bots to the rescue

Bots emerged on Wikipedia out of necessity. The term, shorthand for "software robot," refers to an automated tool designed to carry out specific tasks. In the early days of Wikipedia, this largely involved sorting out vandalism. The problem could be handled manually when the total number of active contributors on Wikipedia numbered in the dozens or even hundreds. But as the website experienced its first boom in popularity, this was no longer so easy to do. By 2007, for example, Wikipedia was receiving upward of 180 edits every minute. That was way too much for human editors to cope with.

“A very important thing that [Wikipedia bots were created to do] is to protect against vandalism,” Dr. Jeff Nickerson, a professor at the Stevens Institute of Technology in Hoboken, N.J., who has studied Wikipedia bots, told Digital Trends. “There’s a lot of instances where someone goes into a Wikipedia page and defaces it. It’s like graffiti. That became very annoying for the people who maintain those pages to have to go in by hand and revert the edits. So one logical kind of protection [was] to have a bot that can detect these attacks.”
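To make the idea concrete, here is a minimal sketch of the kind of textual heuristics an early anti-vandalism bot might apply to an edit. This is purely illustrative: modern counter-vandalism bots use machine learning, a live bot would fetch diffs through the MediaWiki API before reverting anything, and the function name, patterns, and thresholds below are all assumptions for the example, not Wikipedia's actual rules.

```python
import re

# Illustrative word list; a real bot would use far more sophisticated signals.
PROFANITY = re.compile(r"\b(stupid|dumb|sucks)\b", re.IGNORECASE)

def looks_like_vandalism(old_text: str, new_text: str) -> bool:
    """Flag an edit as likely vandalism based on simple textual signals."""
    # Page blanking: most of an established page deleted in one edit.
    if len(old_text) > 500 and len(new_text) < 0.1 * len(old_text):
        return True
    # The edit introduces obvious abuse that wasn't there before.
    if PROFANITY.search(new_text) and not PROFANITY.search(old_text):
        return True
    # Long runs of a repeated character ("aaaaaaaaaa") are classic graffiti.
    if re.search(r"(.)\1{9,}", new_text):
        return True
    return False
```

A bot built around a check like this would simply revert the flagged revision and log its action, leaving borderline cases for human editors.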

Along with other researchers from the Stevens Institute of Technology, Nickerson recently carried out the first comprehensive analysis of all 1,601 of Wikipedia's bots. According to that study, published in the journal Proceedings of the ACM on Human-Computer Interaction, bots account for around 10% of all activity on Wikipedia. This rises to a massive 88% of activity on Wikidata, the central storage platform for structured data used on the various Wikimedia websites.

Wikipedia Bot Roles and Associated Functions

Generator: generate redirect pages; generate pages based on other sources

Fixer: fix links, content, files, and parameters in templates, categories, and infoboxes

Connector: connect Wikipedia with other wikis and other sites

Tagger: tag article status, article assessment, WikiProjects, and multimedia status

Clerk: update statistics, document user data, update maintenance pages, and deliver article alerts

Archiver: archive content; clean up the sandbox

Protector: identify policy violations, spam, and vandals

Advisor: provide suggestions for WikiProjects and users; greet newcomers

Notifier: send user notifications

The research conducted by Nickerson and colleagues divided bot activity on Wikipedia into nine different categories. There are, as noted, “protectors,” dedicated to identifying policy violations, spam, and vandals. Then there are “fixers,” who live virtual lives revolving around the fixing of links, content, files, and anything else in need of a good tweaking. There are “taggers,” for tagging article statuses and assessments; “clerks,” for updating statistics and maintenance pages; “archivers” for archiving content; “advisors” for greeting newcomers and providing suggestions for users; “notifiers” for sending user notifications; and “generators” for creating redirection pages or generating new content based on other sources.

“Their complexity varies a lot,” said Morten Warncke-Wang, the current controller of SuggestBot, a bot which, well, suggests articles for editors to edit, based on their previous edit history. “It depends on the task that they’re sent to carry out.”
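A recommendation like SuggestBot's can be sketched as simple interest matching: build a profile from the categories of articles a user has edited, then rank candidates by overlap. This is a hedged illustration of the general idea only; the function, data shapes, and scoring below are assumptions for the example, not SuggestBot's actual algorithm.

```python
from collections import Counter

def suggest_articles(edit_history, candidates, top_n=3):
    """Rank candidate articles by how well their categories match an
    editor's past edits.

    edit_history: list of category sets, one per article the user edited.
    candidates:   dict mapping article title -> set of its categories.
    """
    # Profile: how often each category appears in the user's history.
    profile = Counter(cat for cats in edit_history for cat in cats)
    # Score each candidate by the total weight of overlapping categories.
    scores = {
        title: sum(profile[cat] for cat in cats)
        for title, cats in candidates.items()
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_n]
```

An editor with a history of physics articles would see physics-related candidates ranked first, while unrelated topics score zero and fall to the bottom.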

A certain degree of autonomy

Nickerson agreed. A bot, he suggested, can be anything from a relatively simple algorithm to a more complex machine learning A.I. What they have in common, he said, is a degree of autonomy. A bot is something that is created and then deployed to act on its orders, a little bit like a mission objective delegated to an employee. "[A bot] can go off and make hundreds, thousands, sometimes millions of edits on its own," Nickerson said. "This is not something that [a human editor is] just running once while you're sitting there." The 24 top bots on Wikipedia have each made more than 1 million edits in their lifetimes: far in excess of virtually every human editor on the website.

If the range of bot categories sounds, frankly, a bit like a medieval colony of monks — all pursuing the unified goal of dogmatic enlightenment through an assortment of seemingly menial tasks — you’re not entirely wrong. The fact that the bot world is reminiscent of a community of sorts is not at all accidental.

Anyone can develop a bot, just like anyone can edit an article.

Despite the fact that most casual Wikipedia users will never interact with a bot, their creation is every bit as collaborative as anything on the Wikipedia front end. Bots are not implemented by Wikimedia in a top-down manner. Anyone can develop a bot, just like anyone can edit an article. They do this according to perceived problems they believe a bot might be able to assist with. To get their bot rubber-stamped, they must submit an approval request to BAG, the Bot Approvals Group. If BAG deems the bot to be a valuable addition to the collective, it will be approved for a short trial period to ensure that it operates as designed. Only after this will it be unleashed on Wikipedia as a whole.

“There’s a prosocial nature to a lot of the editors on Wikipedia,” Nickerson said. “A lot of the time people might write these bots for themselves and then make it available to the community. That’s often the way these bots emerge. Some editor’s doing a task they realize could be fixed with a fairly simple bot. They’ve got the skill to build it, and then that bot gets deployed and used by everyone.”

Like an algorithmic "bring your dog to work day," the owner of each bot is responsible for its behavior. Fail to respond to behavioral concerns, and your bot's approval will be revoked.

Make bots great again

Here in 2020, bots have a popular reputation that's somewhere between venereal disease and John Wilkes Booth. They are frequently cast as human job-replacing, election-swaying tools designed to do far more bad than good. The Wikipedia example shows the flip side of this picture. Wikipedia's bots are the site's immune system: near-invisible tools that help provide resistance to (metaphorical) infection and toxins, while strengthening the system in the process.

As Nickerson points out, however, the bots are not entirely invisible. And that's for the better. "When people don't think they've received a good recommendation, they'll regularly post about that on the bot page," he said, describing the "advisors" and "notifiers" intended to coax human contributors to do better. "To me, that's very interesting. I'd love to be able to affect news feeds I get [elsewhere], but I can't. I don't have a way of going to the companies that are selecting news for me and saying, 'I think you're giving me too much of this; I'd rather get more of that.' Having control over the algorithms that are communicating with you is an important thing. And it seems to really work with Wikipedia."


Some Wikipedia bots carry out simple text generation. The first-ever Wikipedia bot, which appeared in late 2002, was designed to add and maintain pages for every U.S. county and city. But both Nickerson and Warncke-Wang said that they couldn't foresee Wikipedia ever handing control of the website over entirely to text-generating algorithms. "They're rarely used to create the content," Warncke-Wang said. "They're much more used as tools to manage the content development."

At its core, Wikipedia is a deeply human effort — and the bots are there to help, not hinder. As Manfred E. Clynes and Nathan S. Kline, the two researchers who coined the term “cyborg” wrote in an influential 1960 essay: “The purpose of the [ideal collaboration between humans and machine] is to provide an organizational system in which such robot-like problems are taken care of automatically and unconsciously; leaving man free to explore, to create, to think, and to feel.”

Wikipedia bots follow in that spirit. As long as that relationship continues, long may they carry on helping us find the information we want. And stop bad actors from defacing the pages of celebrities they don’t like.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…