Wikimedia Europe


We are Wikimedians working on EU policy to foster
free knowledge, access to information and freedom of expression.

E-Evidence: trilogues kick off on safeguards vs. efficiency

The Regulation on European production and preservation orders for electronic evidence in criminal matters (E-Evidence) aims to create clear rules on how a judicial authority in one Member State can request electronic evidence from a service provider in another Member State. One such use case would be requesting user data from a platform in another EU country during an investigation. We wrote about our main issues in the past.

What Wikimedia worries about

At Wikimedia we were originally worried mainly about a new data category – access data. This would allow prosecutors to demand information such as IP addresses, date and time of use, and the “interface” accessed, without judicial oversight. In the Wikipedia context, however, this information would also reveal which articles a user has read and which images she has looked at.

The second aspect we care about is whether the authority of the country hosting the service provider will have the right to intervene in cases where the fundamental rights of its citizens are concerned. Unfortunately, not all EU Member States have good rule-of-law records, which calls for safeguards at least against potential systemic abuse. Again, knowing which Wikipedia articles or which Wikimedia Commons images someone opened is information that should be hard to obtain, and only in rare, well-justified cases.


Sanctioning the giants – will the internet be better with the Digital Markets Act?

Many would agree that the issues plaguing the online ecosystem are too many to fix for one act of law. So the European Commission drafted two legislative proposals: the long expected Digital Services Act (DSA) and the Digital Markets Act (DMA). Will the DMA prove to be an adequate instrument in the efforts to improve competition in the digital market? Or is it a missed chance to fix structural problems in access to information and knowledge?

The rogues are rogue because we let them

It was no secret that the European Commission was considering a regulatory push in the realm of competition. First, because of the multiple probes into big tech practices that the EC has launched in recent years. Second, because Margrethe Vestager, the Commissioner for Competition and the EU’s Executive Vice-President responsible for A Europe Fit for the Digital Age, had said so. Third and finally, because it is enough to look at a handful of internet companies which, rather than competing on the market, create global markets of their own, to see that some sort of intervention could benefit users and businesses alike.

The European Union offers a unique environment, where regulating a market affects all 27 Member States and almost 448 million people. Even globally operating companies will therefore accept a legislative “offer” imposed across the bloc, even if it is tough on them.

UPDATE: we submitted feedback to the EC consultation on the DMA


Dear MEPs, say NO to the Terrorist Content Regulation

We have the date of the final TERREG vote – it will happen during the Plenary of the European Parliament, on April 28. The MEPs will be presented with a regulation that is too blurry, too broad, and that infringes too much on our right to express political views and to access information. Together with EDRi, Access Now, Civil Liberties Union for Europe and over 60 other organisations, we urge the MEPs to stand on the right side of history and reject this proposal.

In the open letter, over 60 human rights organisations and journalist federations cite the danger of content filtering, the overpolicing of content by state and private actors, and the cross-border prerogatives as the main reasons why the proposal should be rejected.


E-Evidence: Let’s Keep Reader Data Well Protected!

A new EU regulation aims to streamline the process by which a prosecutor from one EU Member State can request electronic evidence from a server in another Member State. As current procedures are messy, this is necessary. But the current proposal would also mean that prosecutors could request data about who has read which Wikipedia article without judicial oversight, and without any possibility for the authorities of the country hosting the platform to intervene in case of fundamental rights breaches. That is worrisome!

The Wikimedia Foundation gathers very little data about the users and editors on its projects, including Wikipedia. This is how the Wikimedia movement can ensure that everyone is really free to speak their mind and, for instance, share information that may be critical of the government of the country they live in. However, the Foundation’s servers do record the IP addresses of users who have accessed Wikipedia, along with the individual articles they have viewed. In accordance with the Wikimedia community’s support for strong privacy protections, the Foundation keeps this information only for a few months, as part of the way its servers function, before it is deleted. Allowing access to these IP addresses and the articles that the users behind them have read – without judicial oversight – is the issue with the European Commission and Council proposals for an E-Evidence Regulation.


TERREG: trilogue brings compromise in final weeks of German Presidency

Perhaps it was the prospect of “losing face” by handing this hot potato of a proposal over to the next Presidency that pressured the hosts of the negotiations. The European Parliament delegation managed to get quite a few of the issues it wanted ironed out, and there will be no more trilogues on the proposal for the terrorist content regulation.

We bring you an update on the final outcome of the negotiations, what happens next, and a summary of what it means for us Wikimedians and for the world at large.

Successes and problems

1. Exception for journalists, artistic and educational purposes

Under pressure from the EP, journalist associations, and (hopefully) us, the doubtful legitimacy check of what counts as journalism, artistic expression or accepted research has been dropped. Article 1(2)(a) will exclude material disseminated for educational, journalistic, artistic or research purposes from the scope. Moreover, material disseminated for the purpose of preventing or countering terrorism shall not be considered terrorist content, including content which represents an expression of polemic or controversial views in the course of public debate. Sounds like the most obvious obviousness, but hey – Twitch already deletes content denouncing terrorism to avoid trouble. This provision, plus those pointing at respecting fundamental rights while implementing measures, can be interpreted in a way that actually pushes Twitch to stop deleting it.


Upside-down: is all content terrorist until determined otherwise?

The German Presidency of the EU is accelerating the trilogue negotiations around the terrorist content regulation (TERREG). Yet faster doesn’t always mean better, as the German compromise text proves. The most disturbing ideas in the compromise amount to an attack on the freedom and pluralism of the media and of the arts and sciences. Is the new text a lapse of judgment, or a glimpse into how a modern EU government envisions its powers over democratic discourse and the role of tech in it?

Media and arts with the seal of approval of governments?

One of the issues with the proposal for a regulation to prevent the dissemination of terrorist content online was, from the beginning, a blurry definition of what constitutes “terrorist content”. The German Presidency proposes to exclude materials disseminated for educational, journalistic, artistic or research purposes from that definition under the condition that “the dissemination of the information is protected as legitimate exercise of freedom of expression and information, the freedom of the arts and sciences as well as the freedom and pluralism of the media”. 

This raises questions about what may or may not constitute “legitimate journalism” or “legitimate artistic expression.” And, importantly, about who gets to decide what is legitimate reporting or a legitimate educational purpose. As the proposal currently stands, it will not be courts deciding, but competent authorities in each Member State, as well as the internet platforms hosting the content.


Terrorist clicks? Drastic measures to moderate online communications under the anti-terrorist banner

What is the best way to combat terrorism? According to the European Commission, it is to clean the internet of terrorist content. Despite little clarity as to what terrorist content really is, the EU institutions are working towards a new regulation that would likely require a wide range of online services to follow the ill-designed measures – measures that would also affect Wikipedia. Yet, the lack of clear definitions, combined with proposed requirements to filter or immediately remove information, threatens democratic discourse and online collaboration.

What is this terrorist content regulation about, again?

In the autumn of 2018, the European Commission (EC) published a proposal for a Regulation on preventing the dissemination of terrorist content online (TERREG) as part of the Digital Single Market framework. Its framing suggests that curbing terrorism is not the main objective of this piece of legislation. Instead, it seeks to provide internet platforms with unified rules on how to deal, across the EU, with content that is considered terrorist, and to outline the consequences of failing to comply.

The rules boil down to bypassing judicial oversight in limiting freedom of expression, transferring that power instead to private actors: the platforms hosting their users’ content, and content filters. All this makes it very easy to restrict access to information about unfolding civic events, which can sometimes produce violent imagery. Meanwhile, failure to remove disturbing, violent content upon notice is already punishable under EU law, and only 6% of European internet users report coming across what they perceive to be terrorist content (according to a Flash Eurobarometer poll from 2018).


Terrorist content and Avia Law – implications of constitutionality of TERREG in France

Analysis

In the first of a series of longer features on our blog, we study the implications of a national court ruling on the future of an EU regulation: in this case, TERREG.

In June 2020, France’s Constitutional Court issued a decision that contradicts most key aspects of the EU proposal for a regulation on preventing the dissemination of terrorist content online – but that also gave EU legislators specific tools to avoid drafting content-regulation legislation that would directly contradict fundamental rights and national constitutional requirements.


Introduction

Over the course of the past two years, France has had a lively debate on a draft bill to combat hate speech online (the so-called Avia law). The debate mainly revolved around imposing stricter content removal obligations on both platforms and other intermediaries such as hosting providers. The final law, passed in May 2020, obliged hosting providers to remove terrorist content and child sexual abuse material within one hour of receiving a blocking order from an administrative authority. The law also foresaw a 24-hour deadline for platforms to remove hate speech content flagged by either a user or a trusted flagger, based on the platforms’ own judgement and with the help of technical measures. This content removal activity was to be subject to guidelines established by the French media regulator (CSA).


4 things EU legislators can do to not break the internet (again)

co-authored by Jan Gerlach

The European Commission’s proposal for a Regulation on preventing the dissemination of terrorist content online runs the risk of repeating many of the mistakes written into the copyright directive, envisioning technological solutions to a complex problem that could bring significant damage to user rights. The proposal includes a number of prescriptive rules that will create frameworks for censorship and potentially harm important documentation about terrorism online. It would further enshrine the rule and power of private entities over people’s right to discuss their ideas.

However, there are still ways to shape this proposal to further its objectives and promote accountability. The report on the proposal will be up for a vote in the European Parliament’s Committee on Civil Liberties, Justice and Home Affairs (LIBE) on 8 April, and Wikimedia urges the committee to consider the following advice:

1. Stop treating the internet like one giant, private social media platform
