
Wikimedia Europe


DSA: Political Deal done!

European Union (EU) lawmakers have agreed on a political deal to establish general online content moderation rules. Cornerstones include a notice-and-action regime, Commission oversight of very large platforms and certain rules for terms of service.

After the political deal, some technical wording remains to be finalised. The deal is expected to be voted on in Parliament in July 2022. We have previously compared the three negotiating positions from a free knowledge point of view. We also analysed the state of negotiations in April 2022. Here is an analysis of the trilogue deal, based on what we know.

We welcome that during the deliberations lawmakers began making a distinction between rules created and imposed by the service provider and rules written and applied by volunteer editing communities. It is a pity that “citizen moderation”, something the internet needs more of, wasn’t recognised explicitly.

Further positive safeguards for intellectual freedom online include a ban on targeted advertising using sensitive information and a ban on “dark patterns”.

We regret that the so-called “crisis mechanism”, a provision allowing the European Commission to ask very large platforms to tackle certain content in times of crisis, came as a last-minute addition and was not properly deliberated in public. Its safeguards remain vague.

1. Does the Content Moderation Process Recognise User Communities?

We have repeatedly explained during the process that many free knowledge platforms handle content moderation differently. Agreeing on rules and enforcing them is something the volunteer editing communities do, and they do it well. Of course, the service provider must intervene in certain cases, such as risks to life and limb, illegal content not removed by the community, or court orders.

To avoid imposing the same obligations on volunteer editors as on professionals working for the platforms, the definitions of certain terms such as “content moderation” and “terms of service” are important. While some details remain unknown, it is apparent that the DSA will focus solely on moderation done by the service provider. This is codified in the Article 2 definitions. Unfortunately, the DSA does not formally recognise volunteer editing communities, but it does try to ensure it doesn’t interfere with their work.

Online platforms’ terms of service (regardless of what they are called) must be clear and understandable to users. Furthermore, a summary of the main elements must be provided, as per Article 12. The same article also obliges platforms to enforce the restrictions included in their terms of service in a proportionate and diligent manner. Again, these are obligations on the service providers, not the editing communities. Importantly, this also doesn’t prevent volunteer editors from moderating online spaces. If they do it well, the service provider might have less work.

2. How Does the New Content Moderation Pipeline Work?

The principle that an online service provider must act upon being informed about illegal content is well established. However, when and how such action must be taken was a matter of diverging interpretations. We now have a rather straightforward process that can be called a “notices pipeline”.

The first step is that a user of the platform (whether a person or an organisation) submits a notice to the service provider. The service provider then needs to diligently check whether the notice really refers to illegal content or content incompatible with the terms of service. An important clarification was made in the text here: not all notices are about illegal content, and not all of them trigger content moderation. The decision lies with the service provider. This is something we have advocated for, as the majority of notices the Wikimedia Foundation receives do not concern content that is incompatible with the law.

Both the notifier and the uploader must be informed of the decision taken by the service provider. Both sides will be allowed to contest the decision through an internal complaint-handling mechanism that the platform provider has to offer. This is essentially a step that ensures the notice can be double-checked.

If the users still dispute the result, they may turn to “out-of-court dispute settlement” bodies. These will be organisations designated by each Member State. There are limits on how much they may cost each side, but they will charge a fee in order to deter mass trolling.

Another safeguard against notice trolling is that service providers are allowed to disregard notices from certain notifiers if they repeatedly provide inaccurate or inadequate information.

The option to go to court, of course, remains open throughout the process.

3. Crisis Response Mechanism

The European legislators decided to include a “crisis response” mechanism in the final rounds of negotiations, citing the war in Ukraine and the coronavirus pandemic as reasons. It will enable the European Commission to require very large platforms, including Wikipedia, to tackle content that poses risks to public health and safety. We understand the logic behind such a provision, but regret that such a wide-reaching executive power was added at the last stage without proper public debate and with very vague language. For instance, it is hard to say what exactly constitutes a crisis under this article.

On the upside, together with other civil society groups, we successfully advocated for several safeguards. The Commission may only trigger this mechanism if a majority of national regulatory bodies (i.e. more than half of the Member States) agree. The Commission won’t be able to prescribe specific actions to platforms, but rather will tell them which content is dangerous and must be moderated. Actions may range from deletion to limiting visibility to changing the content. Any such demand by the Commission must be made public immediately, and the entire mechanism expires three months after being triggered.

4. Oversight, Obligations and Costs for Wikipedia and its Sister Projects

The regulatory oversight of the DSA is split between the European Commission and national regulators (called Digital Services Coordinators). The European Commission will be responsible for Very Large Online Platforms (VLOPs) and the specific obligations they have. For Wikimedia this means that Wikipedia will be regulated by the European Commission, while Wikimedia Commons and Wikidata are likely to fall under the competence of the national regulator of the Member State in which the Wikimedia Foundation decides to appoint a “legal representative” for the DSA.

As part of the specific VLOP obligations, the Wikimedia Foundation will have to perform regular assessments of systemic risks, such as disinformation, the availability of illegal content, and revenge porn. It will also be required to work out mitigation measures for identified risks, which will be subject to independent audits. The other obligations in this category simply don’t apply to Wikipedia, as Wikimedia projects don’t carry advertising and don’t apply boosting algorithms to model what each user sees.

In order to set up a new unit to deal with this, the European Commission will be able to impose a fee on VLOPs. Not-for-profit organisations operating VLOPs are exempt from this fee.

5. Safeguards for Intellectual Freedom

There are several parts of the DSA worth celebrating, even though they won’t have a direct impact on Wikimedia projects and the final result falls short of what we supported.

The European Union is banning targeted advertising based on minors’ data and on sensitive data such as political, sexual or religious preferences.

The European Union is also banning a practice called “dark patterns”. Those are designs that make it easier for users to accept tracking than to refuse it. In the future, both choices must be equally easy to make.


After this political deal, lawyers and other experts must work at the technical level for another few weeks to scrub the text and clarify all outstanding details. The Council and the European Parliament will then vote on the consolidated version; Parliament expects the final vote to take place in July. The Regulation will then be published in the Official Journal of the European Union and come into force 20 days later. Online platforms will have another 15 months after that date to prepare before the rules start to apply.