Dimi Dimitrov

Update on Net Neutrality in the EU

Net Neutrality in the EU seemed like a topic of the past, something we had dealt with and secured, so that we could turn our attention to other issues. Two significant recent developments show that it remains a dynamic policy field and that we mustn’t forget about it. After all, we want an information infrastructure that allows all users equal access not only to Wikipedia and its sister projects, but also to all their citations and sources.

Bad news from the Commission

Very large telecoms companies have long wanted to make very large online platforms pay for network use. Now they seem to have found a like-minded EU Commissioner in the person of Margrethe Vestager. The Danish politician’s argument is a modern classic for the EU: very large platforms are responsible for the bulk of internet traffic but, according to telecoms companies, are not paying their fair share to fund the infrastructure.


DSA: Political Deal done!

European Union (EU) lawmakers have agreed on a political deal to establish general online content moderation rules. Several cornerstones include a notice-and-action regime, Commission oversight over very large platforms and certain rules for terms of service.

After the political deal, some technical wording remains to be worked on. The deal is expected to be voted on in Parliament in July 2022. We have previously compared the three stances from a free knowledge point of view. We also analysed the state of negotiations in April 2022. Here is an analysis of the trilogue deal, based on what we know. 

We welcome that during the deliberations lawmakers began to distinguish between rules created and imposed by the service provider and rules written and applied by volunteer editing communities. It is a pity that “citizen moderation”, something the internet needs more of, wasn’t recognised explicitly.


DSA: Trilogues Update

European Union (EU) lawmakers are under a lot of self-imposed pressure to reach an agreement on content moderation rules that will apply to all platforms. Several cornerstones have been placed either at the highest political levels (e.g., banning targeted ads directed at minors) or agreed upon on a technical level (e.g., notice-and-action procedures). But there is still no breakthrough on a few other articles, like the newly floated “crisis response mechanism.”  


Data Act: A small step for databases, an even smaller step for the EU

Today the European Commission’s proposal for a “Data Act” was leaked, a piece of legislation that is supposed to include a revision of the Database Directive and the sui generis right for database creators (SGR) that it establishes.


DSA: Parliament adopts position on EU Content Moderation Rules

Yesterday the European Parliament adopted its negotiation position on the EU’s new content moderation rules, the so-called Digital Services Act. The version of the text prepared by the Committee on Internal Market and Consumer Protection (IMCO) was mostly adopted, but a few amendments were added. 


The EU’s New Content Moderation Rules & Community Driven Platforms

The EU is working on universal rules on content moderation, the Digital Services Act (DSA). Its co-legislators, the European Parliament (EP) and the Council, have adopted their respective negotiating positions in breakneck time by Brussels standards. Next, they will negotiate a final version with each other.   
While the EP’s plenary vote on the DSA is due in January and amendments are still possible, most changes parliamentarians agreed upon will stay. We therefore feel this is a good moment to look at what both houses are proposing and how it may affect community-driven projects like Wikipedia, Wikimedia Commons and Wikidata.


Editorial: The DSA debate after Haugen and before the trilogues

If the EU really wants to revamp the online world, it should start shaping legislation with the platform models it wants to support in mind, instead of just going after the ones it dislikes.

Whistleblowers are important. They often provide evidence and usually carry conversations forward. They might be able to open the debate to new audiences. I am grateful to Frances Haugen for having the courage to speak and the energy to do it over and over again across countries, as the discussion is indeed global.

On the other hand, the hearings didn’t reveal anything fundamentally new; we didn’t learn anything we didn’t already know. We live in a time when the peer-to-peer internet has essentially been replaced by a network of platforms which, in their overwhelming majority, are for-profit, data-collecting and indispensable in everyday life.


Meet “ClueBot NG”, an AI Tool to tackle Wikipedia vandalism

There are many bots on Wikipedia: computer-controlled “user accounts” that perform simple, repetitive, maintenance-related tasks. Most are simple, trained to fix typos or relying on a list of blacklisted words to detect vandalism. ClueBot NG instead uses a combination of different detection methods with machine learning at their core.
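For contrast with ClueBot NG’s machine-learning approach, the simple blacklist method mentioned above can be sketched in a few lines of Python. The word list and helper function here are invented for illustration only; real anti-vandalism bots use much larger, community-maintained lists:

```python
# A naive blacklist-based vandalism check, of the kind many simple
# anti-vandalism bots use. The word list below is a made-up example.
BLACKLIST = {"poop", "stupid", "hahaha"}

def looks_like_vandalism(added_text: str) -> bool:
    """Flag an edit if any word added by the edit is on the blacklist."""
    words = {w.strip(".,!?").lower() for w in added_text.split()}
    return bool(words & BLACKLIST)

print(looks_like_vandalism("Einstein was stupid hahaha"))            # True
print(looks_like_vandalism("Einstein developed general relativity"))  # False
```

The obvious weakness of this approach is context: a blacklisted word can appear in a perfectly legitimate quotation or article title, which is one reason ClueBot NG layers statistical detection on top of simple heuristics.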

Bots on Wikipedia

A bot (a common nickname for a software robot) is an automated tool that carries out repetitive and mundane tasks. Bots are used to maintain different Wikimedia projects across language versions. Bots are able to make edits very rapidly, but can disrupt Wikipedia if they are incorrectly designed or operated. False positives are an issue as well. For these reasons, a bot policy has been developed.

There are currently 2,534 bot tasks approved for use on the English Wikipedia; however, not all approved tasks involve actively carrying out edits. Bots will leave messages on user talk pages if the action that the bot has carried out is of interest to that editor. There are 323 bots flagged with the “bot” flag right now (and over 400 former bots) on English Wikipedia. On Bulgarian Wikipedia, a much smaller language version, there are currently 106 bot accounts, but only some of them are active. Projects by smaller communities sometimes need to rely more on machines for page maintenance.


Wikimedia Projects & AI: Designing a “Section Recommendation” tool without reinforcing biases

There is an idea to use a “section recommendation” feature to help editors write articles by suggesting possible sections to be added. But it is possible that its recommendations inadvertently increase gender bias. Here’s how we could deal with it.


Wikimedia Projects & AI Tools: Vandalism Detection

There is a machine learning service available to interested Wikimedia projects and communities called ORES. It aims to recognise if an edit, for instance on Wikipedia, is damaging or done in good faith. Of course, false predictions cannot be avoided and thus remain a major risk. Here’s how we try to handle it.  
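ORES is queried over a web API that returns per-revision probabilities for models such as “damaging” and “goodfaith”. The sketch below uses a hard-coded sample response rather than a live call; the revision ID, probability values and review threshold are all invented for illustration (communities tune thresholds themselves), but the nested response shape follows the general structure ORES uses:

```python
# Shape of an ORES-style score for one revision. The numbers and the
# revision ID are invented sample values, not a real API response.
sample_response = {
    "enwiki": {
        "scores": {
            "123456789": {
                "damaging": {
                    "score": {
                        "prediction": True,
                        "probability": {"true": 0.91, "false": 0.09},
                    }
                }
            }
        }
    }
}

def needs_review(response: dict, wiki: str, rev_id: str,
                 threshold: float = 0.8) -> bool:
    """Flag a revision for human review when the model's 'damaging'
    probability exceeds a (community-tuned) threshold."""
    score = response[wiki]["scores"][rev_id]["damaging"]["score"]
    return score["probability"]["true"] >= threshold

print(needs_review(sample_response, "enwiki", "123456789"))  # True
```

Keeping a human in the loop for flagged edits, rather than reverting automatically, is one way communities mitigate the false-prediction risk described above.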
