As elections across the EU face growing threats from disinformation, Wikipedia stands out as a unique case study in how community-governed platforms work to safeguard information integrity. A new mapping report, part of the DEM-Debate project, explores Wikipedia's policies and risk-mitigation strategies for combating election disinformation.
Unlike commercial social media platforms, Wikipedia relies on a decentralised, volunteer-driven model in which thousands of editors enforce editorial principles—such as neutral point of view, verifiability, and no original research—to ensure accuracy. The report highlights how volunteer patrollers, aided by automated tools, page-protection mechanisms, and real-time monitoring during elections, help prevent manipulation. In parallel, the legal host, the Wikimedia Foundation, provides institutional support through measures such as the Universal Code of Conduct, a Disinformation Repository, and Disinformation Response Teams (DRTs) set up by its Trust & Safety team.
The report also examines Wikipedia's first systemic risk assessment under the EU Digital Services Act (DSA), which identifies election disinformation as a critical risk and details mitigation measures such as volunteer moderation, task forces, and rapid-response channels for urgent threats.
Looking ahead, the findings from this report will shape policy recommendations on how the EU can better support community-governed platforms in the fight against disinformation. As disinformation evolves—especially with the rise of AI-generated content—a blend of human oversight, technological tools, and institutional safeguards will be needed to ensure a resilient information ecosystem and preserve information integrity.

Disclaimer. The sole responsibility for any content supported by the European Media and Information Fund lies with the author(s) and it may not necessarily reflect the positions of the EMIF and the Fund Partners, the Calouste Gulbenkian Foundation and the European University Institute. https://gulbenkian.pt/emifund/disclaimer/