When Transparency Falls Short: Auditing Platform Moderation During a High-Stakes Election
Social media platforms did not meaningfully change their content moderation practices during the 2024 EU elections, a large-scale audit finds.
A new study led by Benedetta Tessa and colleagues analyzed 1.58 billion self-reported moderation actions from the eight largest social media platforms in Europe, covering an eight-month period around the 2024 European Parliament elections. Leveraging the Digital Services Act (DSA) Transparency Database—which provides systematic, large-scale data on content moderation—the researchers aimed to assess whether platforms adjusted their enforcement strategies during a high-stakes political event. Surprisingly, the data showed no meaningful changes in moderation patterns before, during, or after the elections, suggesting that platforms did not adapt their practices to address increased systemic risks.
These findings raise critical questions about platform accountability and the effectiveness of transparency mechanisms. The authors note that initial concerns about platforms' transparency and accountability persist even one year after the Transparency Database launched. They highlight the limits of self-regulatory approaches and call for stronger enforcement and better data access mechanisms to ensure platforms meet their responsibilities in protecting democratic processes. The paper is available on arXiv under the identifier 2604.19285.
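The core comparison behind a finding like this is conceptually simple: compare moderation-action volumes in windows before, during, and after the election and test whether any window deviates from the baseline. The sketch below is purely illustrative and is not the authors' method; the daily counts are made-up numbers, and the deviation test (mean shift beyond two baseline standard deviations) is a hypothetical choice.

```python
from statistics import mean, stdev

# Hypothetical daily moderation-action counts (millions) for three
# windows around the election. Real figures would come from the DSA
# Transparency Database; these are invented for illustration.
before = [6.4, 6.6, 6.5, 6.3, 6.7, 6.5, 6.4]
during = [6.5, 6.6, 6.4, 6.6, 6.5, 6.3, 6.6]
after  = [6.6, 6.4, 6.5, 6.5, 6.3, 6.6, 6.4]

def shifted(window, baseline, threshold=2.0):
    """Flag a window whose mean deviates from the baseline mean by
    more than `threshold` baseline standard deviations."""
    return abs(mean(window) - mean(baseline)) > threshold * stdev(baseline)

print(shifted(during, before))  # False: no detectable shift
print(shifted(after, before))   # False: no detectable shift
```

With these synthetic numbers, neither the election window nor the post-election window deviates from the pre-election baseline, mirroring the "no meaningful change" pattern the study reports at vastly larger scale.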
- 1.58 billion moderation actions analyzed from 8 major platforms over 8 months
- No significant changes in enforcement patterns detected during the 2024 EU election period
- Transparency and accountability issues remain one year after DSA Transparency Database launch
Why It Matters
Reveals that self-reported moderation data may mask platform inaction, threatening democratic integrity during elections.