Suno is a music copyright nightmare
A Verge investigation shows Suno's copyright filters are easily fooled, enabling AI-generated covers of major hits.
A new investigation by The Verge exposes critical weaknesses in the copyright protections of the popular AI music generator Suno. The platform's policy explicitly forbids uploading copyrighted material, but its filters can be bypassed with minimal technical effort. Reporters found that slightly altering a source track in free software like Audacity, for example by slowing it to half-speed or adding bursts of white noise, was enough for Suno's "Studio" feature to accept well-known hits such as Beyoncé's "Freedom" and Black Sabbath's "Paranoid." Once a track is uploaded, the AI generates a new instrumental that closely mimics the original arrangement, producing what are essentially AI-powered cover versions.
These generated tracks sit in an unsettling "uncanny valley," often sounding like alternate takes or B-sides of the originals. The report also found that Suno's lyric filter, designed to block copyrighted text, could be tricked with minor spelling changes. More alarmingly, music from independent and lesser-known artists received even less protection: some of their tracks passed through the filters completely unaltered. This opens the door for bad actors to monetize AI-generated covers by uploading them to streaming platforms, a direct threat to artists' intellectual property and revenue. Suno declined to comment on the findings, leaving open the question of how, or whether, it will address the flaws in its filtering system.
- Suno's copyright filters for audio are bypassed by simple speed/pitch changes or adding white noise using free tools like Audacity.
- The platform's lyric filter is also vulnerable, failing to block copyrighted lyrics after minor spelling alterations.
- The flaw is more severe for indie artists, whose original tracks sometimes pass through Suno's detection system with no changes at all.
Why It Matters
This exposes a critical gap in AI content moderation, enabling mass copyright infringement and threatening artist livelihoods on streaming platforms.