
Wikimedia Europe


Machine Learning/Artificial Intelligence

Meet “ClueBot NG”, an AI Tool to tackle Wikipedia vandalism

There are many bots on Wikipedia: computer-controlled “user accounts” that perform simple, repetitive, maintenance-related tasks. Most are straightforward, programmed to fix typos or to flag vandalism against a list of blacklisted words. ClueBot NG instead uses a combination of detection methods which have machine learning at their core.
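ClueBot NG’s actual pipeline combines a neural network with Bayesian classifiers trained on a human-reviewed corpus of edits. The snippet below is only a toy illustration of the general idea, hand-crafted edit features fed into a statistical classifier; the feature set, word list and training examples are invented for this sketch.

```python
# Toy illustration of ML-based vandalism scoring; this is NOT ClueBot NG's code.
# Assumes scikit-learn is installed; features, word list and data are made up.
from sklearn.ensemble import RandomForestClassifier

BAD_WORDS = {"stupid", "dumb", "hate"}  # stand-in for a curated blacklist

def edit_features(old_text: str, new_text: str) -> list:
    """Reduce one edit (old revision -> new revision) to numeric features."""
    added = new_text[len(old_text):] if new_text.startswith(old_text) else new_text
    words = added.lower().split()
    return [
        len(new_text) - len(old_text),                        # size change
        sum(w.strip(".,!") in BAD_WORDS for w in words),      # blacklisted-word hits
        sum(c.isupper() for c in added) / (len(added) or 1),  # shouting ratio
    ]

# Toy labelled edits: (old text, new text, is_vandalism)
examples = [
    ("The cat is a small mammal.", "The cat is a small mammal. It is native to Eurasia.", 0),
    ("The cat is a small mammal.", "The cat is a small mammal. CATS ARE STUPID", 1),
    ("Paris is the capital of France.", "Paris is the capital of France and its largest city.", 0),
    ("Paris is the capital of France.", "I hate this dumb article", 1),
]

X = [edit_features(old, new) for old, new, _ in examples]
y = [label for _, _, label in examples]
model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Score a new edit: estimated probability that it is vandalism.
suspect = edit_features("The moon orbits the Earth.", "THE MOON IS DUMB AND STUPID")
print(model.predict_proba([suspect])[0][1])
```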

Bots on Wikipedia

A bot (a common nickname for a software robot) is an automated tool that carries out repetitive and mundane tasks. Bots are used to maintain the different Wikimedia projects across language versions. They can make edits very rapidly, but can disrupt Wikipedia if they are incorrectly designed or operated, and false positives are an issue as well. For these reasons, a bot policy has been developed.

There are currently 2,534 bot tasks approved for use on the English Wikipedia, though not all approved tasks involve actively carrying out edits. Bots leave messages on user talk pages when an action they have carried out is of interest to that editor. English Wikipedia currently has 323 accounts flagged with the “bot” flag (and over 400 former bots). On Bulgarian Wikipedia, a much smaller language version, there are currently 106 bot accounts, only some of them active. Projects run by smaller communities sometimes need to rely more on machines for page maintenance.
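To make the idea of a maintenance bot concrete, here is a minimal sketch using the Pywikibot framework that many Wikimedia bots are built on. The page title and the typo fix are hypothetical, the script assumes a configured bot account (user-config.py), and a real bot would only run for a task approved under the bot policy.

```python
# Minimal sketch of a typo-fixing maintenance bot using Pywikibot.
# Assumes a configured bot account (user-config.py); the page title and the
# "teh" -> "the" fix are purely illustrative.
import pywikibot

site = pywikibot.Site("en", "wikipedia")        # connect to English Wikipedia
page = pywikibot.Page(site, "Example article")  # hypothetical page title

text = page.text
if "teh " in text:
    page.text = text.replace("teh ", "the ")
    # Saving makes an edit under the bot account, with an edit summary
    # so other editors can see what was changed and why.
    page.save(summary="Bot: fixing common typo 'teh' -> 'the'", minor=True)
```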


Wikimedia Projects & AI: Designing a “Section Recommendation” tool without reinforcing biases

There is an idea to use a “section recommendation” feature to help editors write articles by suggesting sections they could add. But such recommendations could inadvertently reinforce gender bias. Here’s how we could deal with that.
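One simple (and deliberately naive) way such a recommender could work is sketched below: suggest the sections that appear most often in comparable articles but are missing from the current draft. The corpus and section titles are invented for illustration; the point is that if the comparable articles themselves reflect a bias, for instance biographies of women disproportionately carrying a “Personal life” section, the recommender will happily reproduce it.

```python
# Naive section recommender: rank sections by how often they appear in
# comparable articles, skipping sections the draft already has.
# The toy corpus below is invented; real prototypes work at a much larger scale.
from collections import Counter

def recommend_sections(existing_sections, comparable_articles, top_n=3):
    """Return the top_n most common sections not already in the draft."""
    counts = Counter(
        section
        for article in comparable_articles
        for section in article
    )
    return [
        section
        for section, _ in counts.most_common()
        if section not in existing_sections
    ][:top_n]

# Toy corpus: section lists from articles in the same category.
corpus = [
    ["Early life", "Career", "Awards", "References"],
    ["Early life", "Career", "Personal life", "References"],
    ["Career", "Awards", "References"],
]

print(recommend_sections(["Career"], corpus))
# -> ['References', 'Early life', 'Awards'] (order of ties may vary)
```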


Wikimedia Projects & AI Tools: Vandalism Detection

ORES is a machine learning service available to interested Wikimedia projects and communities. It aims to recognise whether an edit, for instance on Wikipedia, is damaging or made in good faith. False predictions cannot be avoided entirely and thus remain a major risk. Here’s how we try to handle it.
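ORES is exposed as a public web API, so a tool can request a score per revision and per model. The sketch below queries the v3 scores endpoint with the requests library; the revision ID is made up, and the exact JSON layout should be checked against the ORES documentation.

```python
# Minimal sketch of querying ORES for one revision's score over HTTP.
# Assumes the public ORES endpoint and the "requests" library; the revision ID
# is invented for illustration.
import requests

ORES_URL = "https://ores.wikimedia.org/v3/scores/{context}/{revid}/{model}"

def score_edit(revid: int, context: str = "enwiki", model: str = "damaging") -> dict:
    """Ask ORES how likely a revision is to be damaging
    (use model='goodfaith' for the good-faith model)."""
    url = ORES_URL.format(context=context, revid=revid, model=model)
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return response.json()

result = score_edit(123456789)
# Expected (roughly): result["enwiki"]["scores"]["123456789"]["damaging"]["score"]
# contains a "prediction" and per-class "probability" values.
print(result)
```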
