Empirical Evaluation of Link Deletion Methods for Limiting Information Diffusion on Social Media
Deleting up to 50% of social network links only halves viral spread, challenging a key moderation strategy.
A new study by researchers Shiori Furukawa and Sho Tsugawa delivers a sobering reality check for social media platforms fighting misinformation. Published on arXiv (2603.21470), their work empirically evaluates 'link deletion'—a proposed method where platforms strategically remove connections between users to hinder the spread of harmful content. Unlike prior research that relied on synthetic diffusion models, this study uses real-world logs of retweet cascades, providing a more accurate assessment of the strategy's potential.
The results are striking and counterintuitive. The researchers found that even after aggressively deleting 10% to 50% of the links in a social network, subsequent information cascades still reach, at best, about half of their original size — and that figure is itself an optimistic estimate, so the real-world reduction is likely smaller. A key reason for this inefficiency is the prevalence of cascades initiated by many 'seed users': when harmful content is posted by numerous accounts simultaneously, cutting the links between them becomes a far less powerful tool for containment, because the content already starts from many points in the network at once.
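To make the seed-user effect concrete, here is a toy simulation — not the paper's method or data — using an independent-cascade diffusion model on a synthetic follower graph. It compares how much random link deletion shrinks cascades started from one seed versus fifty. Every parameter (graph size, out-degree, transmission probability, the 30% deletion fraction, and random rather than strategic link selection) is an illustrative assumption.

```python
import random

def make_graph(n, k, rng):
    """Toy directed 'follower' graph: each of n nodes links to k random others."""
    return {u: rng.sample([v for v in range(n) if v != u], k) for u in range(n)}

def delete_links(adj, frac, rng):
    """Remove a random fraction of edges. (The paper evaluates strategic link
    deletion; random deletion here is a deliberate simplification.)"""
    edges = [(u, v) for u, nbrs in adj.items() for v in nbrs]
    kept = set(rng.sample(edges, round(len(edges) * (1 - frac))))
    return {u: [v for v in nbrs if (u, v) in kept] for u, nbrs in adj.items()}

def cascade_size(adj, seeds, p, rng):
    """Independent-cascade spread: each newly active node activates each
    of its neighbors once, independently, with probability p."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return len(active)

rng = random.Random(42)
n, k, p, trials = 500, 6, 0.2, 200          # illustrative parameters
adj = make_graph(n, k, rng)
pruned = delete_links(adj, 0.30, rng)        # drop 30% of all links

def avg_cascade(graph, num_seeds):
    """Mean cascade size over repeated trials with fresh random seed sets."""
    return sum(cascade_size(graph, rng.sample(range(n), num_seeds), p, rng)
               for _ in range(trials)) / trials

single_orig, single_pruned = avg_cascade(adj, 1), avg_cascade(pruned, 1)
multi_orig, multi_pruned = avg_cascade(adj, 50), avg_cascade(pruned, 50)

print(f"1 seed:   {single_orig:6.1f} -> {single_pruned:6.1f} "
      f"(surviving fraction {single_pruned / single_orig:.2f})")
print(f"50 seeds: {multi_orig:6.1f} -> {multi_pruned:6.1f} "
      f"(surviving fraction {multi_pruned / multi_orig:.2f})")
```

In this toy setting, single-seed cascades collapse sharply once enough links are gone, while fifty-seed cascades retain a much larger share of their original reach — the many starting points let content spread locally even in the thinned network, mirroring the paper's finding that link deletion is weakest against multi-seed cascades.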
This research shifts the conversation from theoretical network models to practical platform policy. It suggests that content moderation strategies focusing solely on altering network structure, like link deletion, may need significant investment for marginal returns. The findings underscore the complexity of real-world information ecosystems, where multi-pronged approaches combining network interventions, content labeling, and user education might be necessary to effectively limit viral misinformation.
- Study used real retweet cascade data, not synthetic models, for accurate evaluation of link deletion.
- Deleting 10-50% of network links cuts information spread by at most half, and that figure is an optimistic estimate.
- Strategy is inefficient against cascades with many seed users, a common real-world scenario for viral content.
Why It Matters
Challenges a core tech platform strategy, showing structural fixes alone are insufficient against viral misinformation.