
Wikimedia Europe


Content moderation

Wikipedia and the Digital Services Act: Lessons on the strength of community and the future of internet regulation

Written by Jacob Rogers, Associate General Counsel at the Wikimedia Foundation. The link to the original interview can be found here.

We share some considerations on the application of the recently adopted Digital Services Act (DSA), which lays down a new set of rules for online platforms. Under these rules, Wikipedia has been designated a Very Large Online Platform (VLOP) and therefore bears specific obligations. After one year of formal application, a first preliminary evaluation can be made. The interview highlights Wikipedia’s specific characteristics, analyses the compliance burden for the Wikimedia Foundation, and offers some guidance for the future on how to preserve the Wikimedia model.


Europe Needs Digital Public Spaces That Are Independently Moderated and Hosted

Since the advent of online platforms that allow users to openly share content, the structure of the public sphere has been transformed. Until just a few decades ago, we were used to the fact that a significant part of public discourse took place in publicly owned and publicly controlled spaces, be it town squares, parks, city halls, cultural establishments or public broadcasters. In the digital world, what we think of as public spaces are actually mostly private, for-profit, and/or data-guzzling platforms and services. The only very large online platform that is not-for-profit and is maintained by a thriving community of users is Wikipedia. This gives people on the internet agency and empowers them.

We believe that the overwhelming dominance of the for-profit, data-driven model is a main reason for the greater vulnerability our societies are experiencing in terms of polarisation, disinformation, and hate. We are confident that more versatile and diverse models for operating online services and their underlying infrastructure will make society more resilient. It will make democratic, inclusive societies a harder target. For these reasons, we want a significant part of online public discourse to take place on public or not-for-profit platforms, services, and infrastructure. To achieve this objective, we suggest three areas of action.

1. Institutional Support

Ensure funding for a network of publicly owned and operated platforms that can host digital cultural heritage and public debates. Whether we are speaking of a regional museum, a small municipality, or a public school, whenever these institutions and communities want to run a project online or share information, they rely on a few dominant, data-monetising services, even for the most basic act of hosting content. European digital hosting infrastructure for public service and cultural institutions is important for sovereignty and…

DSA: Parliament adopts position on EU Content Moderation Rules

Yesterday the European Parliament adopted its negotiating position on the EU’s new content moderation rules, the so-called Digital Services Act. The version of the text prepared by the Committee on the Internal Market and Consumer Protection (IMCO) was largely adopted, but a few amendments were added.


Editorial: The DSA debate after Haugen and before the trilogues

If the EU really wants to revamp the online world, it should start shaping legislation with the platform models it wants to support in mind, instead of just going after the ones it dislikes.

Whistleblowers are important. They often provide evidence, carry conversations forward, and can open the debate to new audiences. I am grateful to Frances Haugen for having the courage to speak and the energy to do it over and over again across countries, as the discussion is indeed global.

On the other hand, the hearings didn’t reveal anything completely new; we didn’t learn anything we didn’t already know. We live in a time when the peer-to-peer internet has essentially been replaced by a network of platforms which, in their overwhelming majority, are for-profit, data-collecting, and indispensable in everyday life.


Takedown Notices and Community Content Moderation: Wikimedia’s Latest Transparency Report

In the second half of 2020 the Wikimedia Foundation received 380 requests for content alteration and takedown. Two were granted. This is because our communities do an outstanding job of moderating the sites, something the Digital Services Act negotiators should probably keep in mind.


Wikipedia is a global top-10 website that anyone can edit and upload content to. Its sister projects host millions of files uploaded by users. Yet all these projects together triggered only 380 notices. How in the world is this possible?


Antiterrorists in a bike shed – policy and politics of the Terrorist Content Regulation

co-authored by Diego Naranjo, Head of Policy at EDRi

Analysis

In the second installment of a series of longer features on our blog, we analyse the political process around the terrorist content debates and the key factors influencing the outcome.

The short story: an ill-fated law with a dubious evidence base, targeting an important modern problem with poorly chosen measures, goes through an exhausting legislative process only to be adopted without proper democratic scrutiny due to a procedural peculiarity. How did we manage to end up in this mess? And what does it tell us about the power of agenda setting in the name of the “do something” doctrine?

How it started – how it’s going

A lot of bafflement accompanied the release of the Terrorist Content Regulation proposal. The European Commission published it a few days after the September 2018 deadline to implement the Directive on Combating Terrorism (2015/0625). It is still unclear what the rush was with the regulation if the preceding directive hadn’t gained much traction. At the time, only a handful of Member States had met the deadline for its implementation (and we don’t see a massive improvement in implementation across the EU to this day). Did it have to do with the bike-shed effect pervading modern policy-making in the EU? Is it easier to agree on a sanitation of the internet carried out mostly by private corporate powers than to meaningfully improve the actions and processes addressing terrorist violence in the Member States?


TERREG adopted without a final vote – what to expect and what it means

The Regulation on addressing the dissemination of terrorist content online (TERREG) has been adopted without a final vote thanks to a peculiarity in European Parliament procedure. The dangers of content filtering, over-policing of content by state and private actors, and the cross-border prerogatives for governments will now become law without a final stamp from the elected representatives of the European citizens.

What happened (and what didn’t)

A Plenary debate had been scheduled to discuss the draft legislation one last time. However, the voting list released for the Terrorist Content Regulation specified that it would be approved without a final vote. A text that goes into a so-called “second reading” – as this file did – is considered approved without a vote unless one of the political groups expressly requests a plenary vote. None of them did, so TERREG is considered passed.

UPDATE: TERREG was published in the Official Journal of the EU on May 17th 2021. It enters into force 20 days from publication (June 7th 2021). It will apply from June 7th 2022.

On April 20th, the Committee on Civil Liberties, Justice and Home Affairs (LIBE) adopted what is now the final text, with 52 Members of the European Parliament (MEPs) voting in favour of the draft legislation, including the Dutch MEP Sophia in ‘t Veld, a powerhouse in privacy and fundamental rights debates in the European Parliament. The 14 votes against came from members of the Greens, with the TERREG Shadow Rapporteur Patrick Breyer at the helm, and the Left.


How the DSA can help Wikipedia – or at least not hurt it

The Digital Services Act is probably the most consequential dossier of the current EU legislative term. It will most likely become a formative set of rules on content moderation for the internet, which means it will also shape the way Wikipedia and its sister projects operate. One can only hope that the DSA doesn’t try to fix what isn’t broken, specifically our community-based content moderation model. What are the scenarios?

A quick history of recent platform liability legislation

One of the reasons why the DSA became a thing is the growing conviction that online intermediaries – from social media, through various user-generated content hosting platforms, to online marketplaces – will not fix the problems with illegal content through voluntary actions. In the previous legislative term we saw two proposals to change the responsibilities and liability of platforms. The focus was on specific types of content: copyrighted material (in the infamous Directive on Copyright in the Digital Single Market) and so-called terrorist content (in the Regulation on addressing the dissemination of terrorist content online, or TERREG, with its final vote on April 28).

The topical focus has its limitations, such as the number of legal regimes a single platform would need to conform to simultaneously. This time around, the European Commission wants to impose rules on platforms that would cover all sorts of intermediaries, content, and services.


Dear MEPs, say NO to the Terrorist Content Regulation

We have the date of the final TERREG vote: it will take place during the Plenary of the European Parliament on April 28. MEPs will be presented with a regulation that is too vague, too broad, and that infringes too much on our rights to express political views and to access information. Together with EDRi, Access Now, the Civil Liberties Union for Europe, and over 60 other organisations, we urge MEPs to stand on the right side of history and reject this proposal.

In the open letter, more than 60 human rights organisations and journalist federations cite the dangers of content filtering, the over-policing of content by state and private actors, and the cross-border prerogatives for governments as the main reasons why the proposal should be rejected.


TERREG: trilogue brings compromise in final weeks of German Presidency

Perhaps it was the prospect of “losing face” by handing this hot potato of a proposal over to the next Presidency that put pressure on the hosts of the negotiations. The European Parliament delegation managed to get quite a few of the issues it wanted ironed out, and there will be no further trilogues on the proposal for the Terrorist Content Regulation.

We bring you an update on the final outcome of the negotiations, what happens next, and a short summary of what it means for us Wikimedians and for the world at large.

Successes and problems

1. Exception for journalistic, artistic and educational purposes

Under pressure from the EP, journalist associations, and (hopefully) us, the doubtful legitimacy check of what counts as journalism, artistic expression or accepted research has been dropped. Article 1(2)(a) will exclude material disseminated for educational, journalistic, artistic or research purposes from the scope. Moreover, material disseminated for the purpose of preventing or countering terrorism shall not be considered terrorist content, including content that represents an expression of polemic or controversial views in the course of public debate. Sounds like the most obvious obviousness, but hey: Twitch already deletes content denouncing terrorism to avoid the trouble. This provision, plus those requiring fundamental rights to be respected when implementing measures, can be interpreted in a way that actually compels Twitch to stop deleting it.
