
How the DSA can help Wikipedia – or at least not hurt it

The Digital Services Act (DSA) is probably the most consequential dossier of the current EU legislative term. It will most likely become a formative set of rules on content moderation for the internet, which means it will also shape the way Wikipedia and its sister projects operate. One can only hope that the DSA doesn’t try to fix what isn’t broken, specifically our community-based content moderation model. What are the scenarios?

A quick history of recent platform liability legislation

One of the reasons the DSA became a thing is the growing conviction that online intermediaries – from social media, through various user-generated content hosting platforms, to online marketplaces – will not fix the problems with illegal content through voluntary action. In the previous legislative term we saw two proposals to change the responsibilities and liability of platforms. Both focused on specific types of content: copyrighted material (in the infamous Directive on Copyright in the Digital Single Market) and so-called terrorist content (in the Regulation on the Dissemination of Terrorist Content Online, or TERREG, with its final vote on April 28).

This topical focus has its limitations, such as the number of legal regimes a single platform would need to comply with simultaneously. This time around, the European Commission wants to impose rules on platforms that cover all sorts of intermediaries, content and services.

Supercharged terms and conditions

Understandably, the legislator cannot prevent every possible infringement of the law, and it certainly can’t predict future technological developments. Also, as private entities, service providers are free to choose what types of content and interaction they want to have on their platforms.

In order to give users clarity, service providers are required to spell out any specific rules they want to apply. As in TERREG, the DSA proposal mandates that restrictions, procedures and tools be laid out in the terms and conditions and then enforced.

“Some decisions must belong to the editors, not the lawyers.”

Article 12 of the proposed DSA specifies that service providers shall enforce their terms of service in a diligent, objective and proportionate manner. Overall, that is a commonsense approach. Standards such as proportionality and diligence are well known in legislation and jurisprudence. The term “objective”, however, is not, which makes the entire article somewhat vague.

To give an example: on Wikipedia we have rules that require users to practice “civility” and to “assume good faith”. It would be very hard to define, in an objective way, how good faith is assumed across cultures and languages. Applying these rules in a “non-arbitrary” manner, on the other hand, is much more feasible, which is why that term would match our practices better.

We have 99 problems but community moderation ain’t one

Well, at least not one that law can fix. What sets our projects apart is that many of the rules that make Wikipedia what it is are not created and imposed by the service provider (the Wikimedia Foundation). Think of rules like “neutral point of view”, “notability of living persons” or “precise and consistent language, layout, and formatting”.

These community rules do place restrictions on how Wikipedia is shaped, for better and for worse. And while there are painful and shameful discussions about which women scientists deserve an article, such rules are the core of how we operate and evolve. They have little to do with the legally required terms and conditions. So where would they sit, legally speaking, under the DSA?

If they do become part of terms of service that the service provider must enforce “objectively”, they become enforceable under threat of liability. That would help neither the editing communities thrive nor users access reliable information. Some decisions must belong to the editors, not the lawyers.

“The issues the DSA tackles can’t be solved by platforms and governments alone; they require a functioning online society.”

Empower communities!

Wikipedia would really benefit from a legally outlined distinction between rules the platform operator introduces and rules the community sets to tailor its experience within the existing legal framework. Opening up this possibility might also benefit people in other circumstances, for instance when they want to step up and take responsibility on other services. It could engage users in actively shaping their online community life.

The issues the DSA tackles can’t be solved by platforms and governments alone; they require a functioning online society. This is why we would like to see communities more empowered online.