Enterprise & Industry

Chinese man blasts ghost stories over speakers for 10 hours daily to retaliate against neighbor

A man in China weaponized eerie AI-generated audio, blasting it for over 10 hours daily within legal noise limits.

Deep Dive

A bizarre neighbor dispute in Guangzhou, China, has escalated into a viral case of tech-enabled psychological warfare. Following a conflict with a neighbor surnamed Xie, a man surnamed Lu and his cohabitant, Li, deployed a loudspeaker directly against their shared wall. In daily sessions running from 8:45 AM to noon and again from 3:30 PM to 10:00 PM, they blasted a continuous loop of eerie, AI-generated 'ghostly mountain sounds'—essentially ghost stories designed to unnerve. What makes the harassment notable is its calculated nature: the perpetrators deliberately kept the audio's decibel level just below the legal threshold for noise pollution, a tactic aimed at evading formal penalties from authorities.

The case was publicly disclosed by China's Ministry of Justice, drawing widespread ridicule and discussion online. While the exact cause of the initial dispute remains unclear, the impact was significant, disturbing not only the intended target but also residents in the upstairs apartment. The incident underscores how accessible AI voice and audio generation tools can be weaponized in everyday conflicts, creating a new category of harassment that operates in a legal gray area. It is a stark, real-world example of the unintended societal consequences of democratized AI technology, extending beyond deepfakes and scams into the realm of personal vendettas.

Key Points
  • Weaponized AI audio: The perpetrators used AI-generated 'ghostly mountain sounds' for psychological harassment.
  • Strategic legality: Volume was kept deliberately below legal noise limits to avoid official penalties.
  • Widespread disruption: The 10+ hour daily broadcasts disturbed the target neighbor and upstairs residents, leading to a Ministry of Justice case.

Why It Matters

Highlights how AI tools are creating new, legally ambiguous forms of harassment in everyday life, challenging existing social and legal frameworks.