There is a machine learning service available to interested Wikimedia projects and communities called ORES. It aims to recognise whether an edit, for instance on Wikipedia, is damaging or made in good faith. Of course, false predictions cannot be entirely avoided and thus remain a major risk. Here’s how we try to handle it.
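To make the service concrete: ORES is queried over a public JSON API and returns, per revision, a prediction plus class probabilities. The sketch below parses a response of the shape the ORES v3 scores endpoint returns; the revision ID and probability values are made up for illustration, not real scores.

```python
# Illustrative sketch: extracting the "damaging" probability from an
# ORES-style JSON response. The nesting (wiki -> scores -> revision ->
# model -> score) follows the public ORES v3 API; the numbers and the
# revision ID below are invented example data.

sample_response = {
    "enwiki": {
        "scores": {
            "123456789": {
                "damaging": {
                    "score": {
                        "prediction": False,
                        "probability": {"false": 0.94, "true": 0.06},
                    }
                }
            }
        }
    }
}

def damaging_probability(response: dict, wiki: str, rev_id: str) -> float:
    """Return the model's estimated probability that the edit is damaging."""
    score = response[wiki]["scores"][rev_id]["damaging"]["score"]
    return score["probability"]["true"]

print(damaging_probability(sample_response, "enwiki", "123456789"))  # 0.06
```

Because the model outputs a probability rather than a verdict, each community can pick its own threshold for flagging edits, which is one way the false-prediction risk is kept under human control.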
Just before the summer recess, the European Parliament’s Internal Market and Consumer Protection committee released over 1300 pages of amendments to the EU’s foremost content moderation law. We took the summer to delve into the suggestions and are ready to kick off the new Parliamentary season by sharing some thoughts on them. Our main focus remains on how responsible communities can continue to be in control of online projects like Wikipedia, Wikimedia Commons and Wikidata.
1. The Greens/EFA on “manifestly illegal content”
AM 691 by Alexandra Geese on behalf of the Greens/EFA Group
Article 2 – paragraph 1 – point g a (new)
‘manifestly illegal content’ means any information which has been subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State;
Almost any content moderation system will require editors or service providers to assess content and make ad-hoc decisions on whether something is illegal and therefore needs to be removed. Of course, things aren’t always black and white, and sometimes it takes a while to make the right decision, as with the leaked images of Putin’s Palace. Other times it is immediately clear that something is an infringement, a verbatim copy of a hit song, for instance. In order to recognise these differences, the DSA rightfully uses the term “manifestly illegal”, but it fails to actually give a definition thereof. We agree with Alexandra Geese and the Greens/EFA Group that the wording of Recital 47 should make it into the definitions.
The European Commission wants more European data (public, private and personal) to be shared for the purposes of innovation, research and business. It also wants to avoid a system where only a few large platforms control all the data. It thus wants to create mechanisms and tools to get there. That’s commendable! What the Commission proposes in the Data Governance Act (DGA), though, is at times very unclear.
Here is a breakdown of the European Commission proposals by sector, peppered with our take on some relevant aspects and support for some European Parliament and Council amendments.
Public Sector Data
The DGA creates a mechanism for re-using public sector data that is protected, e.g. by privacy rules, statistical confidentiality or IP. Public sector bodies are to establish secure environments where data can be mined within the institution. Anonymised data could be provided outside of the institution if the re-use can’t happen within its infrastructure.
In the second half of 2020 the Wikimedia Foundation received 380 requests for content alteration and takedown. Two were granted. This is because our communities do an outstanding job of moderating the sites. That is something the Digital Services Act negotiators should keep in mind.
See the organisational chart in full here
Wikipedia is a global top-10 website that anyone can edit and upload content to. Its sister projects host millions of files uploaded by users. Yet, all these projects together triggered only 380 notices. How in the world is this possible?
The Regulation on European production and preservation orders for electronic evidence in criminal matters (E-Evidence) aims to create clear rules on how a judicial authority in one Member State can request electronic evidence from a service provider in another Member State. One such use case would be requesting user data from a platform in another EU country during an investigation. We wrote about our main issues in the past.
What Wikimedia worries about
At Wikimedia we were originally worried mainly about a new data category – access data. This would mean that prosecutors would be able to demand information such as IP addresses, date and time of use, and the “interface” accessed, without judicial oversight. In the Wikipedia context, however, this information would also reveal which articles a user has read and which images she has looked at.
The second aspect we care about is whether the authorities of the service provider’s hosting country will have the right to intervene in cases where the fundamental rights of its citizens are concerned. We know that unfortunately not all EU Member States have good rule-of-law records, which calls for safeguards at least against potential systemic abuse. Again, knowing which Wikipedia articles or which Wikimedia Commons images someone opened is information that should be hard to get, and accessible only in rare and well-justified cases.
A new EU regulation aims to streamline the process by which a prosecutor from one EU Member State can request electronic evidence from a server in another Member State. As current procedures are messy, this is necessary. But the current proposal would also mean that prosecutors could request data about who has read which Wikipedia article without judicial oversight, and without any possibility for the authorities of the country hosting the platform to intervene in case of fundamental rights breaches. That is worrisome!
The Wikimedia Foundation gathers very little data about the users and editors of its projects, including Wikipedia. This is how the Wikimedia movement can ensure that everyone is really free to speak their mind and, for instance, share information that may be critical of the government of the country they live in. However, the Foundation’s servers do record the IP addresses of users who have accessed Wikipedia, and the individual articles they have viewed. In accordance with the Wikimedia community’s support for strong privacy protections, the Foundation keeps this information for a few months as part of the way its servers function before it is deleted. Allowing access to these IP addresses and the articles that the users behind those IP addresses have read — without judicial oversight — is the issue with the European Commission and Council proposals for an E-Evidence Regulation.
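The retention rule described above can be sketched in a few lines. This is a hypothetical illustration only: the 90-day window, the record fields and the purge function are assumptions made for the example, not the Foundation’s actual implementation.

```python
# Hypothetical sketch of a time-limited retention policy: access records
# (IP address + article viewed) are deleted once older than a fixed
# window. The 90-day window stands in for the "few months" mentioned
# above; field names are invented for illustration.
from datetime import datetime, timedelta

RETENTION = timedelta(days=90)  # assumed retention window

def purge_expired(records: list[dict], now: datetime) -> list[dict]:
    """Keep only access records younger than the retention window."""
    return [r for r in records if now - r["timestamp"] < RETENTION]

logs = [
    {"ip": "198.51.100.7", "article": "Example", "timestamp": datetime(2021, 1, 1)},
    {"ip": "203.0.113.9", "article": "Example", "timestamp": datetime(2021, 5, 1)},
]
# The January record is older than 90 days and is dropped.
print(len(purge_expired(logs, datetime(2021, 5, 15))))  # 1
```

The point of such a short window is exactly what the post argues: data that no longer exists cannot be compelled, with or without judicial oversight.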