DSA

DSA in IMCO: Three amendments we like and one that surprised us

Just before the summer recess, the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) released over 1300 pages of amendments to the EU’s foremost content moderation law. We took the summer to delve into the suggestions and are now ready to kick off the new parliamentary season by sharing some thoughts on them. Our main focus remains on how responsible communities can continue to be in control of online projects like Wikipedia, Wikimedia Commons and Wikidata.

1. The Greens/EFA on “manifestly illegal content”

AM 691 by Alexandra Geese on behalf of the Greens/EFA Group

Article 2 – paragraph 1 – point g a (new)

‘manifestly illegal content’ means any information which has been subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State;

Almost any content moderation system will require editors or service providers to assess content and make ad-hoc decisions on whether something is illegal and therefore needs to be removed. Of course, things aren’t always black-and-white, and sometimes it takes a while to reach the right decision, as with leaked images of Putin’s Palace. Other times it is immediately clear that something is an infringement, a verbatim copy of a hit song, for instance. To recognise these differences the DSA rightfully uses the term “manifestly illegal”, but it fails to actually give a definition thereof. We agree with Alexandra Geese and the Greens/EFA Group that the wording of Recital 47 should make it into the definitions.


Takedown Notices and Community Content Moderation: Wikimedia’s Latest Transparency Report

In the second half of 2020, the Wikimedia Foundation received 380 requests for content alteration and takedown. Two were granted. This is because our communities do an outstanding job of moderating the sites, something the Digital Services Act negotiators should probably keep in mind.


Wikipedia is a global top 10 website that anyone can edit and upload content to. Its sister projects host millions of files uploaded by users. Yet all these projects together triggered only 380 notices. How in the world is this possible?


How the DSA can help Wikipedia – or at least not hurt it

The Digital Services Act is probably the most consequential dossier of the current EU legislative term. It will most likely become a formative set of rules on content moderation for the internet, which means it will also shape the way Wikipedia and its sister projects operate. One can only hope that the DSA doesn’t try to fix what isn’t broken, specifically our community-based content moderation model. What are the scenarios?

A quick history of recent platform liability legislation

One of the reasons why the DSA became a thing is the growing conviction that online intermediaries – from social media, through various user-generated content hosting platforms, to online marketplaces – will not fix the problems with illegal content through voluntary action. In the previous legislative term we saw two proposals to change the responsibilities and liability of platforms. The focus was on specific types of content: copyrighted material (in the infamous Directive on Copyright in the Digital Single Market) and so-called terrorist content (in the Regulation on the Dissemination of Terrorist Content Online, or TERREG, with its final vote on April 28).

The topical focus has its limitations, such as the number of legal regimes a single platform would need to comply with simultaneously. This time around, the European Commission wants to impose rules on platforms that would cover all sorts of intermediaries, content and services.
