Deep Dive: The new von der Leyen Commission
What we can expect for community-governed knowledge-sharing projects.
Guest Post: Platform Councils – How we control the power of platforms together
Author: Aline Blankertz, Policy and Public Sector Advisor, Wikimedia Deutschland
We all use online platforms, from Google Search to WhatsApp to Microsoft Office. It is about time that users also get a say in how they work. In reality, we are far from this. But the direction is clear: platform councils can make decisions according to democratic principles.
Wikimedia Europe Partners for Research into Wikipedia’s Practices on Information on Elections
1.1 million: the number of times the 2024 European Parliament election article on English Wikipedia was viewed between May and June 2024. With another 37 language versions and additional millions of views globally, this page exemplifies Wikipedia’s role in informing the public about major political events. Yet Wikipedia’s impact extends far beyond that. As a widely used repository of knowledge, its content is frequently cited by other media outlets, amplifying its reach and embedding its information within broader public discourse. This means that any inaccuracies or disinformation on Wikipedia could have significant consequences for public discourse, especially on sensitive issues like elections.
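Figures like the 1.1 million views cited above come from Wikipedia’s public pageview statistics, which anyone can query. As a minimal sketch, assuming the standard Wikimedia Pageviews REST API and the English article title 2024_European_Parliament_election, the May and June 2024 total could be retrieved like this:

```python
import requests

# Sketch: fetch monthly pageview counts for the English Wikipedia article on
# the 2024 European Parliament election via the public Wikimedia Pageviews
# REST API. The date range and the "user" agent filter are assumptions.
ARTICLE = "2024_European_Parliament_election"
URL = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
    f"en.wikipedia/all-access/user/{ARTICLE}/monthly/20240501/20240630"
)

# The API asks clients to identify themselves via the User-Agent header.
response = requests.get(URL, headers={"User-Agent": "pageview-example/0.1"})
response.raise_for_status()

total = sum(item["views"] for item in response.json()["items"])
print(f"Views for {ARTICLE}, May-June 2024: {total:,}")
```

The exact total depends on the date range and on whether automated traffic is filtered out, so the result may not match the cited figure to the digit.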
Wikipedia will be harmed by France’s proposed SREN bill: Legislators should avoid unintended consequences
Written by Jan Gerlach, Director of Public Policy at the Wikimedia Foundation; Phil Bradley-Schmieg, Lead Counsel at the Wikimedia Foundation; and Michele Failla, Senior EU Policy Specialist at Wikimedia Europe
(Wikimédia France, the French national Wikimedia chapter, has also published a blog post on the SREN bill)
The French legislature is currently working on a bill that aims to secure and regulate the digital space (widely known by its acronym, SREN). As currently drafted, the bill not only threatens Wikipedia’s community-led model of decentralized collaboration and decision-making, but also contradicts the EU’s data protection rules and its new content moderation law, the Digital Services Act (DSA). For these reasons, the Wikimedia Foundation and Wikimedia Europe call on French lawmakers to amend the SREN bill to ensure that public interest projects like Wikipedia are protected and can continue to flourish.
DSA: Political Deal done!
European Union (EU) lawmakers have agreed on a political deal to establish general online content moderation rules. Several cornerstones include a notice-and-action regime, Commission oversight over very large platforms and certain rules for terms of service.
After the political deal, some technical wording remains to be worked on. The deal is expected to be voted on in Parliament in July 2022. We have previously compared the three stances from a free knowledge point of view. We also analysed the state of negotiations in April 2022. Here is an analysis of the trilogue deal, based on what we know.
We welcome that during the deliberations lawmakers began making a distinction between rules created and imposed by the service provider and rules written and applied by volunteer editing communities. It is a pity that “citizen moderation”, something the internet needs more of, wasn’t recognised explicitly.
DSA: Trilogues Update
European Union (EU) lawmakers are under a lot of self-imposed pressure to reach an agreement on content moderation rules that will apply to all platforms. Several cornerstones have been placed either at the highest political levels (e.g., banning targeted ads directed at minors) or agreed upon on a technical level (e.g., notice-and-action procedures). But there is still no breakthrough on a few other articles, like the newly floated “crisis response mechanism.”
Wikipedia covers the Ukraine invasion: a look at breaking news editing
When Russia launched its large-scale invasion of Ukraine on 24 February, volunteer editors on Wikipedia leapt into gear to document developments as they unfolded. A look at the record of their edits provides a window behind the scenes of breaking news editing.
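That edit record is public and can be inspected by anyone. Here is a minimal, hypothetical sketch using the standard MediaWiki Action API to list the earliest revisions of the English Wikipedia article from 24 February 2022 onward; the article title used here is an assumption, since the page has been renamed over time.

```python
import requests

# Sketch: list the first edits made to the English Wikipedia article after
# the invasion began, using the standard MediaWiki Action API.
API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Russian invasion of Ukraine",   # assumed current article title
    "rvprop": "timestamp|user|comment",
    "rvdir": "newer",                          # oldest revisions first
    "rvstart": "2022-02-24T00:00:00Z",         # enumerate from 24 February 2022
    "rvlimit": 20,
    "format": "json",
}

data = requests.get(
    API, params=params, headers={"User-Agent": "edit-history-example/0.1"}
).json()

# The response keys pages by page ID; take the single page returned.
page = next(iter(data["query"]["pages"].values()))
for rev in page.get("revisions", []):
    print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```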
DSA: Parliament adopts position on EU Content Moderation Rules
Yesterday the European Parliament adopted its negotiation position on the EU’s new content moderation rules, the so-called Digital Services Act. The version of the text prepared by the Committee on Internal Market and Consumer Protection (IMCO) was mostly adopted, but a few amendments were added.
The EU is working on universal rules on content moderation, the Digital Services Act (DSA). Its co-legislators, the European Parliament (EP) and the Council, have adopted their respective negotiating positions in breakneck time by Brussels standards. Next, they will negotiate a final version with each other.
While the EP’s plenary vote on the DSA is due in January and amendments are still possible, most changes parliamentarians agreed upon will stay. We therefore feel that this is a good moment to look at what both houses are proposing and how it may affect community-driven projects like Wikipedia, Wikimedia Commons and Wikidata.
DSA in IMCO: Three amendments we like and one that surprised us
Just before the summer recess, the European Parliament’s Internal Market and Consumer Protection committee released over 1300 pages of amendments to the EU’s foremost content moderation law. It took us the summer to delve into the suggestions, and we are now ready to kick off the new Parliamentary season by sharing some thoughts on them. Our main focus remains on how responsible communities can continue to be in control of online projects like Wikipedia, Wikimedia Commons and Wikidata.
AM 691 by Alexandra Geese on behalf of the Greens/EFA Group
Article 2 – paragraph 1 – point g a (new)
‘manifestly illegal content’ means any information which has been subject of a specific ruling by a court or administrative authority of a Member State or where it is evident to a layperson, without any substantive analysis, that the content is not in compliance with Union law or the law of a Member State;
Almost any content moderation system will require editors or service providers to assess content and make ad-hoc decisions on whether something is illegal and therefore needs to be removed or not. Of course, things aren’t always black-and-white and sometimes it takes a while to make the right decision, like with leaked images of Putin’s Palace. Other times it is immediately clear that something is an infringement, like a verbatim copy of a hit song, for instance. In order to recognise these differences the DSA rightfully uses the term “manifestly illegal”, but it fails to actually give a definition thereof. We agree with Alexandra Geese and the Greens/EFA Group that the wording of Recital 47 should make it into the definitions.