Wikimedia Europe


TERREG: trilogue brings compromise in final weeks of German Presidency

Perhaps it was the prospect of “losing face” by handing this hot potato of a proposal over to the next Presidency that gave the hosts of the negotiations a pressure point to push on. The European Parliament delegation managed to get quite a few of the issues it wanted ironed out, and there will be no more trilogues on the proposed terrorist content regulation.

We bring you an update on the final outcome of the negotiations, what happens next, and a summary of what it means for us Wikimedians and for the world at large.

Successes and problems

1. Exception for journalists, artistic and educational purposes

Under pressure from the EP, journalist associations and (hopefully) us, the dubious legitimacy check of what counts as journalism, artistic expression or accepted research has been dropped. Article 1(2)(a) will exclude material disseminated for educational, journalistic, artistic or research purposes from the scope. Moreover, content aimed at preventing or countering terrorism shall not be considered terrorist content, including content that expresses polemic or controversial views in the course of public debate. This sounds like the most obvious obviousness, but hey – Twitch already deletes content denouncing terrorism just to avoid trouble. This provision, plus those requiring fundamental rights to be respected when implementing measures, can be interpreted in a way that actually compels Twitch to stop deleting it.

2. Definitions of hosting service provider and terrorist content

Hosting service providers (HSPs) that enable users to share content with the public (a potentially unlimited number of persons) are within the scope. So we avoided an interpretation that could have led to the policing of private communications or of storage for private purposes, which would obviously have been a disaster. At the same time, Wikipedia, for example, is in scope, as it fits the definition in the proposed regulation.

As for the definition of terrorist content, it is now more closely tied to the definition of terrorist offences in the Directive on combating terrorism. Also, “advocating” (the commission of terrorist offences) is no longer there, which is good, as it is not a well-defined activity and is therefore open to interpretation. Still, qualifiers such as the “glorification” of terrorist acts constitute terrorist content, which again is not a very sharp category. But it is better than many of the earlier versions from either the EC or the various presidencies, so I guess we should be glad that it is not worse…?

“TERREG gives eager governments a tool to react to what people say online about their actions from a dangerous angle. Decide for yourself if your government can be considered eager to do so.”

3. One hour rule

That one is still in – after receiving a removal order, a platform has one hour to remove the content in question, though it should do so as soon as possible; one hour is the maximum. If an authority issues an order to a platform for the first time, it must give a 12-hour heads-up that the order is coming. The rule has been softened a bit by allowing “justifiable technical and operational reasons” for not complying with the order on time. What counts as such reasons is quite discretionary and will possibly differ across EU jurisdictions.

4. Upload filters kind of out/in

It is the nature of a compromise to leave all parties equally unhappy. What we end up with is a prohibition on authorities imposing upload filters. HSPs can still use them voluntarily. In practice, much will depend on how accustomed the platforms are to deploying them and how much authorities want to press providers into using these measures without explicitly asking them to do so.

5. Competent authorities

Competent authorities will not have to be judicial or independent. There is, however, the following safeguard: “Competent authorities shall not seek or take instructions from any other body in relation to the exercise of the tasks assigned to them pursuant to [this Regulation].” How this will be respected is hard to know; proving that it is not respected (and to whom) is even harder.

6. Cross-border removal orders

The desired outcome on this topic has not been reached – a Member State will be able to order the removal of content deemed terrorist that was created by a user in another Member State, in any language, etc.

What comes next?

Plenary: the adopted file will next be sent back to the European Parliament – meaning in 2021, most likely at the beginning of the year. The plenary vote will end the debates on this file – as it is a regulation, it will be directly binding in all Member States.

What does it mean?

As you know, we were not fans of this regulation. It will come with a new set of problems to face, and it offers no concrete reason to trust that it will target only content that is already illegal, and not all sorts of communications that have the right to be voiced and debated. On top of that, it gives eager governments a tool to react to what people say online about their actions from a dangerous angle. Decide for yourself if your government can be considered eager to do so.

Wikimedia’s projects are within the scope of the terrorist content regulation. We will parse out what exactly it means for our community closer to its adoption by the European Parliament. More indirectly, this regulation may affect our sources, such as news articles, since, as we know, the best exception is not half as good as a decent definition.