
DSA: Trilogues Update

European Union (EU) lawmakers are under a lot of self-imposed pressure to reach an agreement on content moderation rules that will apply to all platforms. Several cornerstones have already been laid, either at the highest political level (e.g., a ban on targeted ads directed at minors) or at the technical level (e.g., notice-and-action procedures). But there is still no breakthrough on a few other articles, such as the newly floated “crisis response mechanism.”

The European Commission published its legislative proposal back in December 2020. The Council adopted its position in December 2021, while the European Parliament agreed on its version in January 2022. We have previously compared the three positions from a free knowledge point of view. Since January, the three institutions have been in semi-formal negotiations called “trilogues,” in which they are trying to reach a final compromise. It is time for us to give you an update on those negotiations.

Whose Content Moderation And Rules Are We Talking About?

Online platforms that allow users to post content often have functions that let these users set up their own rules and actively moderate certain spaces. This is true of the classic, but still very popular, online discussion forums, including Reddit groups, fan pages and club bulletin boards. It is especially true of Wikimedia projects, including Wikipedia, where volunteer editors make the rules and moderate the space.

With the Digital Services Act (DSA) imposing obligations for content moderation, it would be undesirable to put volunteer citizens who care about a space under the same legal pressures as professionals working full-time for a corporation. Hence, we need to make sure that the definitions of “content moderation” and “terms and conditions” reflect that. Currently, they both do.

As of this week, both the Parliament and the Council agree to back the Commission proposal and define “content moderation” within this regulation as “activities, automated or not, undertaken by providers of intermediary services.”

When it comes to “terms and conditions,” the two bodies have a slight disagreement. The Parliament’s position is to add a “by the service provider” clarification to the definition. The Council, however, believes that this is already a given in the text, which reads:

(q) ‘terms and conditions’ means all terms and conditions or clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services.

We welcome the fact that legislators and officials are having a conversation about this, with projects such as ours and online forums in mind. 

“Actual Knowledge” of Illegal Content

A cornerstone of the DSA is to set up clear and straightforward rules for the interactions between users and providers with regard to content moderation. A notice-and-action mechanism is the first step. Then there are ways for users to contest the decisions—or indecision—of the service providers: internal complaints, out-of-court dispute settlements and, of course, court challenges.

It was of the utmost importance for Wikimedia to highlight that not every notice the Wikimedia Foundation receives is about illegal content. This is crucial, as “actual knowledge” of illegal content forces action, usually deletion. The agreed-upon text now includes language explaining that notices imply actual knowledge of illegal content only if a “diligent provider of hosting services can establish the illegality of the relevant activity or information without a detailed legal examination.”

A new addition in the negotiations is that the internal complaint handling mechanism would allow users to complain when platforms decide not to act on breaches of their terms and conditions.

Who Will Regulate and Oversee Wikipedia and its Sister Projects? 

According to the original Commission proposal, each Member State would designate a regulator responsible for enforcing the new rules. A platform would be regulated either where it is established or, if the service provider is not located within the EU, where it chooses to have a legal representative. During the trilogues, the Council suggested, and the Parliament accepted, that rules specific to Very Large Online Platforms (VLOPs) should be enforced by the Commission. We generally welcome this move, even if it introduces some inconsistency: of the Wikimedia projects, only Wikipedia is likely to qualify as a VLOP, which means that it alone will be overseen by the Commission, while our other projects will be overseen by national authorities.

What Will This Cost Wikimedia?

As the idea of having the Commission play the role of a regulator gained traction, another line of thought was also suddenly accepted: establishing a fee that VLOPs would pay to cover the additional Commission staff needed. The idea is for the DSA to give the Commission the power to impose such fees through a delegated act.

It took some back and forth, but the final proposal by the Commission is to waive the fee for not-for-profit service providers. 

Crisis Response Mechanism

Sparked by the invasion of Ukraine, political pressure has built up in recent weeks to include a “crisis response mechanism” in the regulation. It would empower the Commission to require providers of VLOPs to apply “specific effective and proportionate measures” during a crisis. A crisis is defined as occurring “where extraordinary circumstances lead to a serious threat to public security or public health in the Union.”

While we understand the need for such a mechanism in principle, we are uncomfortable with its wording. Several key points must be addressed:

  • Decisions that affect freedom of expression and access to information, in particular in times of crisis, cannot be legitimately taken through executive power alone.
  • The definition of crisis is unclear and broad, giving enormous leeway to the European Commission. 
  • A crisis response must be temporary by nature. The text must include a solid time limit.

Targeted Advertising

It looks like the Parliament and the Council will agree to ban targeted advertising to minors, as well as the use of sensitive data (e.g., political and religious beliefs) for targeting. Wikimedia generally supports everyone’s right to edit and share information without being tracked and recorded.

Waiver for Nonprofits, Maybe?

It is still an open question whether the Council will accept the Parliament’s proposal to include a waiver exempting not-for-profits from certain obligations, such as out-of-court dispute settlement mechanisms. We would welcome this, as it would spare us from setting up new mechanisms that could disrupt a largely efficient community content moderation system. But if the DSA’s definitions make it clear that these obligations apply only to service provider decisions, we will not worry too much about it.

General Monitoring Prohibition

Negotiators are still discussing a compromise stating that there should be no general obligation to monitor, “through automated or non-automated means,” information transmitted or stored by intermediary services. The Parliament wants to go further and clarify that this also covers “de facto” general monitoring obligations, i.e., rules that add up to general monitoring in practice. The thinking behind this is that several smaller obligations can combine to leave providers in a situation where they effectively need to monitor all content. The Council is still pushing back on this.

We believe that a ban on general monitoring is crucial for ensuring intellectual freedom, and we support the Parliament’s position on this.

Next Steps

The next technical meeting of advisers and experts is on 19 April 2022. The next political round of negotiations is scheduled for 22 April 2022. Europe needs a set of general content moderation rules, and the DSA is on track to deliver exactly that. We hope that all parts of the regulation will be properly deliberated and that proper safeguards will be enshrined. Wikimedia will continue to provide constructive input to lawmakers as well as participate in the public debate.