
Wikimedia Europe


DEM-Debate project: the Critical Legal Analysis

The latest deliverable of the DEM-Debate project, authored by the University of Amsterdam, explores how the new EU legal framework on election disinformation applies to Wikipedia. The legal analysis critically evaluates the impact of the new rules on the functioning of community-governed platforms in addressing disinformation related to the 2024 European Parliament elections, and draws preliminary conclusions for policy making: Wikipedia's editorial rules, together with its patrolling system, are good examples from which future legislation on election disinformation can draw inspiration.

The report starts by recounting the latest developments in the application of the EU disinformation legal framework, including two rulings of the European Court of Human Rights and the stance adopted by the US administration and legislative bodies towards the Wikimedia Foundation (WMF). It then details the findings of the critical analysis of the EU legal framework.

Definition of disinformation

The report highlights the lack of a legal definition of “disinformation”, pointing out that only the Code of Conduct on Disinformation, which is non-binding, offers some guidance in this respect, placing emphasis on the element of “intention”. The same is true of the WMF’s terms of use, which define “false information”.

Value of information published on Wikipedia 

It points out that, owing to its encyclopedic character and specific editorial rules, the information published on Wikipedia appears to carry a public significance that content on social media platforms lacks, and thus receives the highest level of protection under the fundamental right to freedom of expression.

Foreign government disinformation

It explains that the Wikipedia communities’ mechanisms for designating reliable and deprecated sources are fully capable of reflecting the EU rules on state-controlled media outlets engaging in disinformation.

DSA obligations on disinformation

It also describes how the WMF, which is the legal host of the platform and therefore bound by the DSA, has implemented the new obligations relating to disinformation, i.e. removal orders for illegal content, assessment of systemic risks and adoption of mitigation measures, and external audits. Researchers pointed out that “the Wikipedia measures applicable to disinformation are premised on the notion of transparency, where all measures are sought to be taken in a transparent way, through the community-moderated model, with all edit history and discussion history visible. Wikipedia’s measures applicable to disinformation are arguably very much open to the public already”.

Enforcement of DSA obligations

The evaluation of the enforcement of the DSA disinformation obligations is quite positive. In this regard, researchers emphasised that “it must be remembered that while there is considerable regulatory activity under the DSA, Wikipedia is the only sole VLOP, out of a total of 25 VLOPs, that has not been subject to any regulatory activity by the European Commission under the DSA as of October 2025. As such, following two years of the DSA’s provisions being applicable to VLOPs, Wikipedia has not (yet) come to regulatory attention; while none of the post-European Parliament election reports by the European Commission published in June 2025 mention Wikipedia”. This result can be read as linked to the effectiveness of the platform’s business model and its transparency.

Informing policy making: editorial rules & patrolling system

Researchers offer a preliminary overview of the specific solutions adopted by Wikipedia that may inform future policy making on election disinformation.

In particular, they point out that Wikipedia’s specific editorial rules, including neutral point of view, verifiability, no original research, and the biographies of living persons policy, “illustrate how, for example, treating information about politicians and political actors with heightened care can help prevent election-related disinformation, and can provide learnings for broader regulation in the online environment”.
They also detail how community-based content moderation takes place and conclude that “Wikipedia’s patrolling system is a notable example of how, through community-based oversight, accountability and transparency can be operationalised. The idea of setting up patrols for political and election-related pages can also inform the wider online information ecosystem, demonstrating how targeted monitoring can help prevent the spread of election-related disinformation”.

Download the Report for more details.

Disclaimer. The sole responsibility for any content supported by the European Media and Information Fund lies with the author(s) and it may not necessarily reflect the positions of the EMIF and the Fund Partners, the Calouste Gulbenkian Foundation and the European University Institute. https://gulbenkian.pt/emifund/disclaimer/