Cedrik Sixtus got tired of the pretense. The Leipzig software developer noticed his Spotify playlists filling up with tracks that sounded algorithmically generated, so he built a tool to identify and block them. He uploaded it to code-sharing sites where hundreds downloaded it, each one a small act of resistance against what he saw as an inevitability the platform refused to acknowledge.
His tool filters out more than 4,700 suspected AI artists. It works by flagging unusually high release volumes and AI-style cover art, and by cross-referencing external detection tools. Simple enough. Yet Spotify, the world’s most popular streaming service, still won’t do what Sixtus does for free.
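The basic idea is a heuristic scoring filter. A minimal sketch of that approach might look like the following; the field names, thresholds, and weights here are illustrative assumptions, not Sixtus’s actual implementation.

```python
from dataclasses import dataclass

# Hypothetical sketch of a heuristic AI-artist filter combining the three
# signals described above. All thresholds and weights are invented for
# illustration only.

@dataclass
class Artist:
    name: str
    releases_last_30_days: int       # unusually high volume is one signal
    cover_art_flagged_as_ai: bool    # e.g. flagged by an image classifier
    external_detector_score: float   # 0.0-1.0 from a third-party detector

def suspicion_score(a: Artist) -> float:
    """Combine the three signals into a single 0-1 suspicion score."""
    score = 0.0
    if a.releases_last_30_days > 20:  # threshold chosen for illustration
        score += 0.4
    if a.cover_art_flagged_as_ai:
        score += 0.3
    score += 0.3 * a.external_detector_score
    return min(score, 1.0)

def build_blocklist(artists: list[Artist], threshold: float = 0.5) -> list[str]:
    """Return names of artists whose combined score crosses the threshold."""
    return [a.name for a in artists if suspicion_score(a) >= threshold]

artists = [
    Artist("Prolific Bot", 45, True, 0.9),    # all three signals fire
    Artist("Touring Band", 1, False, 0.05),   # none do
]
print(build_blocklist(artists))  # → ['Prolific Bot']
```

No single signal is decisive on its own, which is why a weighted combination like this is the natural shape for such a tool: a human artist with a busy release schedule, or one false positive from an external detector, shouldn’t be enough to trigger a block.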
The gap between what listeners want and what technology companies deliver has rarely felt wider.
The Demand Is Clear
In a controlled test run by Deezer and Ipsos, 97% of listeners couldn’t distinguish between AI-generated and human-made tracks. Yet around 80% of respondents said AI-generated music should be clearly labelled. The public knows something is off, and they want to know what it is.
That’s not an outlandish ask. We label food as organic. We list nutritional information. We tell consumers when content has been manipulated. Music deserves the same transparency, especially as tens of thousands of AI tracks are uploaded to streaming platforms daily, potentially diluting revenue pools for human artists.
Spotify’s response has been cautious to the point of evasion. In April, it launched a test feature showing how artists used AI in a song’s credits. But here’s the catch: it’s voluntary, based entirely on what artists tell their labels or distributors. That’s not transparency. That’s trusting people to self-police when stigma incentivizes silence.
Apple Music announced it would eventually require self-disclosure through “transparency tags,” but observers have already flagged the obvious flaw: if an artist fears their work will be devalued by an AI label, why disclose?
The Complexity Trap
To be fair, the problem isn’t quite as binary as it sounds. AI music exists on a spectrum. Some tools are “prompt in, song out” affairs where labelling would be straightforward. Others are designed for co-creation, assisting with specific parts of the songwriting process. If a musician uses those tools, does that warrant a label? At what point does collaboration become contamination?
Maya Ackerman, an AI and computational creativity expert at Santa Clara University, notes that even with full-generative tools like Suno and Udio, users can invest tremendous creative energy into the outputs, iterating for hours on specific sounds or supplying their own lyrics. The question of what constitutes “AI music” gets messier the closer you look.
Then there’s the technical problem. Detecting AI-generated content is fraught. False positives could devastate human musicians unfairly labelled as artificial. And the detection arms race is real: AI music generation tools keep improving, forcing detection systems into constant retraining. By the time you’ve caught up to Suno 3.0, version 4.0 is probably already here.
Deezer has taken a stronger position than Spotify, tagging albums with AI content and excluding them from algorithmic recommendations. The company recently began offering its in-house detection technology to others in the industry. But even Deezer acknowledges the research challenge remains unresolved, particularly for hybrid cases where AI assists rather than dominates.
Follow the Money
Yet beneath all this technical complexity lurks something less complicated: economics.
Robert Prey, who studies streaming platforms at Oxford University’s Internet Institute, observes that Spotify is trying to optimize for platform growth by keeping recommendation systems as unencumbered as possible. Detecting and filtering AI content adds operational costs. Serving AI music, by contrast, might be cheaper.
There’s also the platform’s long-standing reputation to consider. Spotify has been accused at various points of promoting lower-cost music for background playlists. The company denies this, insisting all tracks follow the same royalty payment model based on listening share. But past controversy fuels suspicion, particularly when the financial incentives are so obvious.
David Hoffman, who studies AI music’s impact on artists’ livelihoods at Duke University, argues that platforms should at least label fully AI-generated tracks and assess the remaining problem from there. “There is a lobbying message to say ‘we can’t draw the line, and therefore we shouldn’t do anything,’” he told the BBC. That stance feels conveniently self-serving.
The Wild West Moment
We’re living through something resembling the early-2000s file-sharing panic, according to David Hesmondhalgh, a professor of media and culture at the University of Leeds. Except this time, the disruption isn’t external piracy but internal democratization. Anyone with a text prompt can now generate a song. The industry didn’t see this coming at the scale it arrived.
The EU AI Act will require certain AI-generated content to carry labels starting in August 2026. That regulation might force Spotify’s hand in Europe, though how the company will implement it remains unclear. Industry standards body DDEX is working on a broad standard for AI disclosures, but whether and how those disclosures are displayed ultimately depends on the platforms themselves.
Spotify recently announced new features like SongDNA and “About the Song,” which give premium users deeper insight into track origins and contributors. These feel like genuine efforts to elevate human artistry. But they’re also conspicuously not about filtering AI content or making it easy to avoid.
“We believe the right response to AI in music isn’t any single policy, it’s a combination of proactive controls, industry-wide standards, and a deeper investment in the human creativity behind every track,” a Spotify spokesperson said. That’s reasonable-sounding but ultimately evasive. It’s also precisely what a company would say when it has no intention of doing the one thing most listeners want.
The uncomfortable truth is that Spotify’s indecision isn’t really indecision at all. It’s a deliberate choice to let the chaos continue until regulation or consumer pressure makes inaction impossible. And by then, AI music will likely be so omnipresent that labelling becomes almost pointless anyway.