Auditing Algorithmic Personalization in TikTok Comment Sections
Researchers used 17 partisan 'sock-puppet' accounts, plus 5 cold-start controls, to audit comment sections on 65 neutral political videos.
A new study by researchers Yueru Yan and Siqi Wu, accepted at ICWSM 2026, provides one of the first empirical audits of how TikTok's personalization algorithm shapes what users see in comment sections. The researchers created 22 accounts: 17 'sock-puppets' trained to exhibit clear left-leaning or right-leaning preferences (validated via their For You Page recommendations) and 5 cold-start accounts with no interaction history. They then scraped the comment sections shown to each account across 65 politically neutral videos about the 2024 U.S. presidential election, all of which contained abundant discussion from both sides.
The analysis revealed that while the overall pool of top comments remained largely consistent across accounts, the ranking and exposure of those comments differed significantly depending on an account's inferred political leaning. For some videos, the divergence in comment ranking between accounts in different political groups was markedly greater than the variation within the same group. This personalization effect correlated with specific video-level metrics: higher comment volume, greater engagement inequality (a few comments capturing most likes and replies), and a stronger pre-existing partisan skew in the comment section all amplified the algorithmic sorting.
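The two quantities above can be made concrete. The paper's exact metrics are not given here, so the following is a minimal sketch under illustrative assumptions: between-group ranking divergence measured as a Kendall tau distance (fraction of comment pairs ordered differently for two accounts), and engagement inequality measured as a Gini coefficient over like counts.

```python
# Hypothetical sketch: the metric choices (Kendall tau distance, Gini
# coefficient) and all names here are illustrative assumptions, not the
# study's actual implementation.
from itertools import combinations

def kendall_tau_distance(rank_a, rank_b):
    """Fraction of shared-comment pairs ordered differently in two rankings."""
    common = set(rank_a) & set(rank_b)
    pos_a = {c: i for i, c in enumerate(rank_a) if c in common}
    pos_b = {c: i for i, c in enumerate(rank_b) if c in common}
    pairs = list(combinations(common, 2))
    if not pairs:
        return 0.0
    discordant = sum(
        (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) < 0 for x, y in pairs
    )
    return discordant / len(pairs)

def gini(values):
    """Engagement inequality (e.g. like counts) across a comment section."""
    vals = sorted(values)
    n, total = len(vals), sum(vals)
    if n == 0 or total == 0:
        return 0.0
    cum = sum((i + 1) * v for i, v in enumerate(vals))
    return (2 * cum) / (n * total) - (n + 1) / n

# Toy comment-ID rankings as shown to two hypothetical accounts:
# same comment pool, different ordering.
left_acct = ["c1", "c2", "c3", "c4"]
right_acct = ["c3", "c4", "c1", "c2"]
print(kendall_tau_distance(left_acct, right_acct))  # 4 of 6 pairs flipped ≈ 0.667
print(gini([120, 30, 5, 1]))  # likes concentrated on one comment → high inequality
```

Comparing the between-group distance against the average within-group distance (left vs. left, right vs. right) would reproduce the study's core comparison on any scraped video.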
Through a case study, the researchers found preliminary evidence that this personalization can expose accounts to comments aligned with their political leaning, potentially reinforcing echo chambers. Crucially, however, the pattern is not universal: the extent of politically oriented comment personalization depends heavily on each video's dynamics. This suggests the algorithm reacts to engagement signals within a comment section rather than applying a blanket political filter.
- Study used 17 trained partisan accounts and 5 cold-start accounts to audit 65 neutral political videos on TikTok.
- Found significant ranking divergence in comments shown to left-leaning vs. right-leaning accounts, correlated with video engagement metrics.
- Provides evidence TikTok's algorithm can personalize comment sections to align with user politics, but effect is context-dependent.
Why It Matters
Reveals how platform algorithms may silently shape political discourse and reinforce ideological bubbles within seemingly neutral content.