Just before the summer recess, the European Parliament’s Internal Market and Consumer Protection committee released over 1,300 pages of amendments to the EU’s foremost content moderation law. We took the summer to delve into the suggestions and are ready to kick off the new Parliamentary season by sharing some thoughts on them. Our main focus remains on how responsible communities can continue to be in control of online projects like Wikipedia, Wikimedia Commons and Wikidata.
1. The Greens/EFA on “manifestly illegal content”
AM 691 by Alexandra Geese on behalf of the Greens/EFA Group
Article 2 – paragraph 1 – point g a (new)
‘manifestly illegal content’ means any information which has been subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State;
Almost any content moderation system will require editors or service providers to assess content and make ad-hoc decisions on whether something is illegal and therefore needs to be removed or not. Of course, things aren’t always black-and-white and sometimes it takes a while to make the right decision, as with leaked images of Putin’s Palace. Other times it is immediately clear that something is an infringement, like a verbatim copy of a hit song, for instance. In order to recognise these differences the DSA rightfully uses the term “manifestly illegal”, but it fails to actually give a definition thereof. We agree with Alexandra Geese and the Greens/EFA Group that the wording of Recital 47 should make it into the definitions.
2. Renew Europe clarifying “by the service provider”
Article 2 of the DSA defines what “terms and conditions” are. This is of course necessary, but the definition omits one point, important for editing communities: Are we talking about rules created by the service provider or those by the community that interacts on the specific site and page? The first part of amendment 731 by a group of Renew Europe MEPs clarifies this.
AM 731 by Dita Charanzová, Andrus Ansip, Vlad-Marius Botoş, Morten Løkkegaard, Karen Melchior
Article 2 – paragraph 1 – point q
‘terms and conditions’ means all terms and conditions or specifications by the service provider, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services […]
We welcome this clarification, as it is an important one for community-moderated projects like Wikipedia. Of course, laws must be respected and the service provider has obligations. But there are also community-specific rules, like Polish Wikipedia’s manual of style, which is created and maintained by the Polish-language community, and there we wouldn’t like service provider lawyers or lawmakers to call the shots.
3. Notices shouldn’t automatically trigger “actual knowledge”
Article 14(3) of the proposed DSA text is confusingly worded and reads as if notices that fulfil formal requirements automatically give rise to “actual knowledge”. Actual knowledge refers to knowledge of illegal and infringing activity and would force the Wikimedia Foundation or the editing communities to act to remove the content in question. This seriously worries us, as most notices we get are clearly not about illegal or infringing content. More often than not they are about someone disliking an article or an image. And the vast majority of requests are not granted anyway; in fact, only 2 out of 380 were granted in the second half of 2020. This is why we support amendment proposal 1056 by four ECR group MEPs.
AM 1056 by Adam Bielan, Kosma Złotowski, Eugen Jurzyca, Beata Mazurek
Article 14 – paragraph 3
Notices that include the elements referred to in paragraph 2 shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned where there is no doubt as to the illegality of the specific item of content. In case of uncertainty and after taking reasonable steps to assess the illegality of the specific item of content, withholding from removal of the content by the provider shall be perceived as acting in good faith and shall not lead to waiving the liability exemption provided for in Article 5.
4. Surprise: A waiver for not-for-profits?
We do believe that the DSA will be most effective if it remains a horizontal framework that is valid for all platforms, unlike Article 17 of the Copyright in the Digital Single Market directive. However, we also feel that writing universal rules that work well for all online services, without too many exceptions, is an almost insurmountable challenge. We were surprised to see Renew Europe’s fresh approach to this “squaring the circle” task: introducing a right for the European Commission to issue waivers from certain requirements for certain platforms. It reads as follows:
AM 894 by Dita Charanzová, Andrus Ansip, Vlad-Marius Botoş, Claudia Gamon, Morten Løkkegaard, Svenja Hahn, Karen Melchior, Sandro Gozi, Stéphanie Yon-Courtin, Liesje Schreinemacher
Article 9a (new)
1. Providers of intermediary services may apply to the Commission for a waiver from the requirements of Chapter III, provided that they are:
(a) non-for-profit or equivalent and serve a manifestly positive role in the public interest;
(b) micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC; or
(c) a medium enterprise within the meaning of the Annex to Recommendation 2003/361/EC without any systemic risk related to illegal content.
The Providers shall present justified reasons for their request.
2. The Commission shall examine such an application and, after consulting the Board, may issue a waiver in whole or in part from the requirements of this Chapter. […]
This is an elegant way of squaring the circle and such waivers do exist in other parts of EU law where they work well. We believe it merits serious consideration.
Coming up: Compromise Amendments
Of course there are many ideas and proposals we left out. We simply wanted to help ourselves, and you, plunge back into the policy debate by highlighting some aspects. We look forward to engaging with stakeholders and lawmakers on getting the balance in the DSA just right. We want to keep responsible online communities, who are the digital version of civil society, in charge of the best places online.